Paper Title

Neural Machine Translation with Contrastive Translation Memories

Authors

Xin Cheng, Shen Gao, Lemao Liu, Dongyan Zhao, Rui Yan

Abstract

Retrieval-augmented Neural Machine Translation models have been successful in many translation scenarios. Unlike previous works that make use of mutually similar but redundant translation memories (TMs), we propose a new retrieval-augmented NMT that models contrastively retrieved translation memories: memories that are holistically similar to the source sentence while individually contrastive to each other, providing maximal information gain across three phases. First, in the TM retrieval phase, we adopt a contrastive retrieval algorithm to avoid the redundancy and uninformativeness of similar translation pieces. Second, in the memory encoding phase, given a set of TMs, we propose a novel Hierarchical Group Attention module to gather both the local context of each TM and the global context of the whole TM set. Finally, in the training phase, a Multi-TM contrastive learning objective is introduced to learn the salient features of each TM with respect to the target sentence. Experimental results show that our framework obtains improvements over strong baselines on benchmark datasets.
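The "contrastive retrieval" idea in the first phase, selecting TMs that are similar to the source yet dissimilar to one another, can be sketched as a greedy MMR-style selection. This is only an illustrative reconstruction from the abstract: the `sim` function below is a toy token-overlap similarity, and the names `contrastive_retrieve` and the trade-off weight `lam` are assumptions, not the paper's actual scoring function or algorithm.

```python
def sim(a: str, b: str) -> float:
    """Jaccard overlap between token sets (illustrative stand-in for the
    paper's unspecified similarity measure)."""
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def contrastive_retrieve(source: str, candidates: list[str],
                         k: int, lam: float = 0.5) -> list[str]:
    """Greedily select k TMs: reward similarity to the source, penalize
    redundancy with already-selected TMs (MMR-style heuristic)."""
    selected: list[str] = []
    pool = list(candidates)
    while pool and len(selected) < k:
        def score(c: str) -> float:
            # Redundancy = closest match among TMs picked so far.
            redundancy = max((sim(c, s) for s in selected), default=0.0)
            return lam * sim(source, c) - (1 - lam) * redundancy
        best = max(pool, key=score)
        selected.append(best)
        pool.remove(best)
    return selected
```

With `lam` near 1 the selection degenerates to plain top-k retrieval; lowering it trades raw source similarity for diversity among the retrieved memories, which is the redundancy-avoidance behavior the abstract describes.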
