Paper Title

Lexically Constrained Neural Machine Translation with Levenshtein Transformer

Authors

Raymond Hendy Susanto, Shamil Chollampatt, Liling Tan

Abstract

This paper proposes a simple and effective algorithm for incorporating lexical constraints in neural machine translation. Previous work either required re-training existing models with the lexical constraints or incorporating them during beam search decoding with significantly higher computational overheads. Leveraging the flexibility and speed of a recently proposed Levenshtein Transformer model (Gu et al., 2019), our method injects terminology constraints at inference time without any impact on decoding speed. Our method does not require any modification to the training procedure and can be easily applied at runtime with custom dictionaries. Experiments on English-German WMT datasets show that our approach improves an unconstrained baseline and previous approaches.
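The following is a minimal illustrative sketch of the idea the abstract describes: seed the decoder's initial hypothesis with the terminology-constraint tokens at inference time and prevent the deletion step from ever removing them during a Levenshtein-style insertion/deletion refinement loop. This is an assumption-laden toy, not the authors' implementation; `deletion_scores` and `insertion_candidates` are hypothetical stand-ins for a trained Levenshtein Transformer's prediction heads, and the 0.5 deletion threshold is arbitrary.

```python
from typing import List, Set


def constrained_refine(
    constraints: List[str],
    deletion_scores,       # hypothetical stand-in: tokens -> per-token deletion probabilities
    insertion_candidates,  # hypothetical stand-in: tokens -> list of (position, token) to insert
    max_iters: int = 10,
) -> List[str]:
    """Iteratively refine a hypothesis while never deleting constraint tokens."""
    # Seed the initial hypothesis with the constraint tokens (in order) and
    # remember which positions must survive every deletion step.
    hypothesis = list(constraints)
    protected: Set[int] = set(range(len(hypothesis)))

    for _ in range(max_iters):
        # Deletion step: drop tokens the model marks for deletion, except protected ones.
        scores = deletion_scores(hypothesis)
        kept: List[str] = []
        new_protected: Set[int] = set()
        for i, (tok, p_del) in enumerate(zip(hypothesis, scores)):
            if i in protected or p_del < 0.5:
                if i in protected:
                    new_protected.add(len(kept))
                kept.append(tok)
        hypothesis, protected = kept, new_protected

        # Insertion step: add model-proposed tokens; shift protected indices accordingly.
        insertions = insertion_candidates(hypothesis)
        if not insertions:
            break
        for pos, tok in sorted(insertions, reverse=True):
            hypothesis.insert(pos, tok)
            protected = {i + 1 if i >= pos else i for i in protected}

    return hypothesis


# Toy usage with dummy prediction functions (no trained model involved):
if __name__ == "__main__":
    never_delete = lambda toks: [0.0] * len(toks)
    grow_once = lambda toks: [] if len(toks) >= 3 else [(0, "das"), (len(toks), "funktioniert")]
    print(constrained_refine(["Gerät"], never_delete, grow_once))
    # -> ['das', 'Gerät', 'funktioniert']; the constraint token "Gerät" is guaranteed to survive.
```

Because the constraints only pre-populate the hypothesis and mask the deletion head, this kind of injection adds no extra search over the unconstrained decoder, which is consistent with the abstract's claim of no impact on decoding speed.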
