Paper Title

Language-Family Adapters for Low-Resource Multilingual Neural Machine Translation

Paper Authors

Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser

Abstract

Large multilingual models trained with self-supervision achieve state-of-the-art results in a wide range of natural language processing tasks. Self-supervised pretrained models are often fine-tuned on parallel data from one or multiple language pairs for machine translation. Multilingual fine-tuning improves performance on low-resource languages but requires modifying the entire model and can be prohibitively expensive. Training a new adapter on each language pair or training a single adapter on all language pairs without updating the pretrained model has been proposed as a parameter-efficient alternative. However, the former does not permit any sharing between languages, while the latter shares parameters for all languages and is susceptible to negative interference. In this paper, we propose training language-family adapters on top of mBART-50 to facilitate cross-lingual transfer. Our approach outperforms related baselines, yielding higher translation scores on average when translating from English to 17 different low-resource languages. We also show that language-family adapters provide an effective method to translate to languages unseen during pretraining.
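To make the adapter idea concrete, below is a minimal PyTorch sketch of a bottleneck adapter routed by language family, in the spirit of the approach the abstract describes (a small residual module trained on top of a frozen pretrained model such as mBART-50, with one adapter shared per family rather than per language pair or per all languages). This is not the authors' implementation; the class names, dimensions, and the `FAMILY_OF` mapping are illustrative assumptions.

```python
# Minimal sketch of language-family adapters (assumed design, not the paper's released code).
import torch
import torch.nn as nn

# Hypothetical mapping from language code to language family.
FAMILY_OF = {"az": "turkic", "kk": "turkic", "be": "slavic", "uk": "slavic"}


class Adapter(nn.Module):
    """Bottleneck adapter: LayerNorm -> down-projection -> ReLU -> up-projection,
    added residually to the output of a frozen Transformer layer."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int):
        super().__init__()
        self.layer_norm = nn.LayerNorm(hidden_dim)
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(torch.relu(self.down(self.layer_norm(hidden_states))))


class LanguageFamilyAdapters(nn.Module):
    """One adapter per language family, selected by the target language's family.
    The pretrained model stays frozen; only the adapter parameters are trained."""

    def __init__(self, families, hidden_dim: int = 1024, bottleneck_dim: int = 256):
        super().__init__()
        self.adapters = nn.ModuleDict(
            {fam: Adapter(hidden_dim, bottleneck_dim) for fam in families}
        )

    def forward(self, hidden_states: torch.Tensor, lang: str) -> torch.Tensor:
        # Route the hidden states through the adapter of the language's family.
        return self.adapters[FAMILY_OF[lang]](hidden_states)
```

In this sketch, related low-resource languages (e.g. several Turkic languages) share one adapter, which allows cross-lingual transfer within the family while keeping the number of trained parameters small compared to full multilingual fine-tuning.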
