Paper Title

Recommender Transformers with Behavior Pathways

Paper Authors

Zhiyu Yao, Xinyang Chen, Sinan Wang, Qinyan Dai, Yumeng Li, Tanchao Zhu, Mingsheng Long

Paper Abstract

Sequential recommendation requires the recommender to capture evolving behavior characteristics from logged user behavior data for accurate recommendations. However, a user behavior sequence reads like a script with multiple ongoing threads intertwined. We find that only a small set of pivotal behaviors evolve into the user's future actions; as a result, the user's future behavior is hard to predict. We term this characteristic of each user's sequential behaviors the Behavior Pathway. Different users have their own unique behavior pathways. Among existing sequential models, transformers have shown great capacity for capturing globally dependent characteristics. However, these models mainly produce a dense distribution over all previous behaviors via the self-attention mechanism, so the final predictions are overwhelmed by trivial behaviors that are not tailored to each user. In this paper, we build the Recommender Transformer (RETR) with a novel Pathway Attention mechanism. RETR dynamically plans a behavior pathway specific to each user, and sparingly activates the network through this pathway to effectively capture evolving patterns useful for recommendation. The key design is a learned binary route that prevents the behavior pathway from being overwhelmed by trivial behaviors. We empirically verify the effectiveness of RETR on seven real-world datasets, where RETR yields state-of-the-art performance.
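
As a rough illustration of the mechanism the abstract describes, the PyTorch sketch below implements a minimal "pathway attention" layer: a router scores each logged behavior, a straight-through binary gate routes the pivotal behaviors onto the pathway, and self-attention is restricted to those behaviors. The single-linear router, the 0.5 threshold, the straight-through estimator, and the gated residual mixing are all illustrative assumptions, not the paper's published implementation.

```python
# Minimal sketch of a pathway-attention layer. Hypothetical design: the
# router, threshold, and straight-through trick below are illustrative
# assumptions, not RETR's published implementation.
import torch
import torch.nn as nn

class PathwayAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Hypothetical router: one score per logged behavior; behaviors
        # scoring above 0.5 are routed onto the user's behavior pathway.
        self.router = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) embeddings of logged behaviors.
        gate_prob = torch.sigmoid(self.router(x)).squeeze(-1)  # (B, L)
        hard_gate = (gate_prob > 0.5).float()
        # Straight-through estimator: binary in the forward pass,
        # gradients flow through gate_prob in the backward pass.
        gate = hard_gate + gate_prob - gate_prob.detach()
        # Attend only to behaviors on the pathway (True = ignore key).
        key_padding_mask = hard_gate == 0
        # Guard against the degenerate case where nothing is routed.
        key_padding_mask[key_padding_mask.all(dim=-1)] = False
        out, _ = self.attn(x, x, x, key_padding_mask=key_padding_mask)
        # Gated residual: off-pathway positions pass through unchanged.
        g = gate.unsqueeze(-1)
        return g * out + (1.0 - g) * x

# Usage: batch of 2 users, 10 logged behaviors each, 64-dim embeddings.
layer = PathwayAttention(d_model=64, n_heads=4)
h = layer(torch.randn(2, 10, 64))  # -> (2, 10, 64)
```

In this sketch, the hard gate keeps attention sparse over each user's pivotal behaviors, while the probabilistic gate keeps the router trainable end to end.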
