Paper Title

TransQuest: Translation Quality Estimation with Cross-lingual Transformers

Paper Authors

Tharindu Ranasinghe, Constantin Orasan, Ruslan Mitkov

Paper Abstract

Recent years have seen big advances in the field of sentence-level quality estimation (QE), largely as a result of using neural-based architectures. However, the majority of these methods work only on the language pair they are trained on and need retraining for new language pairs. This process can prove difficult from a technical point of view and is usually computationally expensive. In this paper we propose a simple QE framework based on cross-lingual transformers, and we use it to implement and evaluate two different neural architectures. Our evaluation shows that the proposed methods achieve state-of-the-art results outperforming current open-source quality estimation frameworks when trained on datasets from WMT. In addition, the framework proves very useful in transfer learning settings, especially when dealing with low-resourced languages, allowing us to obtain very competitive results.
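As a rough illustration of the idea summarized in the abstract (not the authors' released TransQuest implementation), the sketch below feeds a source sentence and its machine translation through a cross-lingual transformer and reads off a single sentence-level quality score from a one-output regression head. The xlm-roberta-base backbone and the example sentence pair are assumptions chosen only for illustration.

```python
# Minimal sketch of cross-lingual-transformer QE as sentence-pair regression.
# This is an illustrative approximation, not the authors' TransQuest code.
import torch
from transformers import XLMRobertaTokenizer, XLMRobertaForSequenceClassification

# Hypothetical source/translation pair; in practice these come from a WMT QE dataset.
source = "Le chat est assis sur le tapis."
translation = "The cat is sitting on the mat."

tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
# num_labels=1 gives a single-logit head, used here as a regression head that
# predicts one continuous quality score for the sentence pair.
model = XLMRobertaForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=1
)

# Encode source and translation together as one sequence pair, so the
# cross-lingual encoder attends over both sentences jointly.
inputs = tokenizer(source, translation, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"Predicted quality score (untrained head, illustrative only): {score:.4f}")
```

At training time such a head would be fit with a regression loss against human quality judgements, for example the scores provided in the WMT QE datasets mentioned in the abstract.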
