Paper Title
Quaternion-Based Self-Attentive Long Short-Term User Preference Encoding for Recommendation
Paper Authors
Paper Abstract
Quaternion space brings several benefits over the traditional Euclidean space: Quaternions (i) consist of one real and three imaginary components, encouraging richer representations; (ii) utilize the Hamilton product, which better encodes the inter-latent interactions across multiple Quaternion components; and (iii) result in a model with fewer degrees of freedom that is less prone to overfitting. Unfortunately, most current recommender systems rely on real-valued representations in Euclidean space to model either users' long-term or short-term interests. In this paper, we fully utilize Quaternion space to model both users' long-term and short-term preferences. We first propose a QUaternion-based self-Attentive Long term user Encoding (QUALE) to study the user's long-term intents. Then, we propose a QUaternion-based self-Attentive Short term user Encoding (QUASE) to learn the user's short-term interests. To enhance our models' capability, we propose to fuse QUALE and QUASE into one model, namely QUALSE, by using a Quaternion-based gating mechanism. We further develop Quaternion-based Adversarial learning along with Bayesian Personalized Ranking (QABPR) to improve our model's robustness. Extensive experiments on six real-world datasets show that our fused QUALSE model outperforms 11 state-of-the-art baselines, improving HIT@1 by 8.43% and NDCG@1 by 10.27% on average compared with the best baseline.
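The Hamilton product mentioned in the abstract multiplies two quaternions so that every component of one interacts with every component of the other, which is the property the paper leverages for inter-latent interactions. A minimal sketch in Python (illustrative only; the function name and tuple representation are assumptions, not the paper's implementation):

```python
# A quaternion a + b*i + c*j + d*k is represented as the tuple (a, b, c, d).
def hamilton_product(q1, q2):
    """Hamilton product of two quaternions; note it is non-commutative."""
    a1, b1, c1, d1 = q1
    a2, b2, c2, d2 = q2
    return (
        a1*a2 - b1*b2 - c1*c2 - d1*d2,  # real part
        a1*b2 + b1*a2 + c1*d2 - d1*c2,  # i component
        a1*c2 - b1*d2 + c1*a2 + d1*b2,  # j component
        a1*d2 + b1*c2 - c1*b2 + d1*a2,  # k component
    )

# Non-commutativity: i * j = k, but j * i = -k.
i, j = (0, 1, 0, 0), (0, 0, 1, 0)
print(hamilton_product(i, j))  # (0, 0, 0, 1)  -> k
print(hamilton_product(j, i))  # (0, 0, 0, -1) -> -k
```

Because each output component mixes all four components of both inputs, a single Quaternion-valued weight parameterizes interactions that would otherwise require a larger real-valued matrix, which is the source of the reduced degrees of freedom the abstract cites.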