Paper Title

Improving Attention Mechanism with Query-Value Interaction

Paper Authors

Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

Paper Abstract

The attention mechanism plays a critical role in various state-of-the-art NLP models such as Transformer and BERT. It can be formulated as a ternary function that maps input queries, keys, and values to an output, computed as a summation of the values weighted by attention weights derived from the interactions between queries and keys. Similar to query-key interactions, there is also inherent relatedness between queries and values, and incorporating query-value interactions has the potential to enhance the output by learning customized values according to the characteristics of the queries. However, existing attention methods ignore query-value interactions, which may be suboptimal. In this paper, we propose to improve the existing attention mechanism by incorporating query-value interactions. We propose a query-value interaction function that learns query-aware attention values and combines them with the original values and attention weights to form the final output. Extensive experiments on four datasets covering different tasks show that our approach can consistently improve the performance of many attention-based models by incorporating query-value interactions.
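The abstract does not spell out the exact form of the query-value interaction function, so the sketch below is only a minimal illustration of the idea, not the paper's actual method: standard scaled dot-product attention is computed first, and a sigmoid gate (with hypothetical learned parameters Wg and Wp, introduced here for illustration) then mixes the attended context with a query-conditioned value, so the output depends on the query beyond the attention weights alone.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Standard attention: softmax(Q K^T / sqrt(d)) V, i.e. query-key interactions only."""
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d)) @ V

def qv_attention(Q, K, V, Wg, Wp):
    """Illustrative query-value variant (an assumption, not the paper's exact function):
    the attended context is gated against a query-conditioned value, so the final
    output is customized to the characteristics of each query."""
    context = attention(Q, K, V)                                  # (n_q, d) attended context
    gate_in = np.concatenate([Q, context], axis=-1)               # (n_q, 2d) gate input
    gate = 1.0 / (1.0 + np.exp(-gate_in @ Wg))                    # (n_q, d) sigmoid gate
    query_value = np.tanh(Q @ Wp)                                 # (n_q, d) query-aware value
    return gate * context + (1.0 - gate) * query_value

# Toy shapes: 2 queries, 3 keys/values, model dimension 4.
Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
Wg, Wp = rng.normal(size=(8, 4)), rng.normal(size=(4, 4))
print(attention(Q, K, V).shape, qv_attention(Q, K, V, Wg, Wp).shape)  # (2, 4) (2, 4)
```

In this sketch the gate plays the role the abstract describes for the interaction function: it lets each query decide how much of the output comes from the standard attention summation and how much from a value learned from the query itself.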
