Paper Title
Simultaneous paraphrasing and translation by fine-tuning Transformer models
Paper Authors
Paper Abstract
This paper describes the third-place submission to the shared task on simultaneous translation and paraphrasing for language education at the 4th Workshop on Neural Generation and Translation (WNGT) at ACL 2020. The final system leverages pre-trained translation models and uses a Transformer architecture combined with an oversampling strategy to achieve competitive performance. The system significantly outperforms the baseline on Hungarian (a 27% absolute improvement in weighted macro F1 score) and Portuguese (a 33% absolute improvement).
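The abstract names two ingredients but does not spell out their implementation, so the snippet below is a minimal sketch of one plausible reading: fine-tuning an off-the-shelf pre-trained translation model on training pairs oversampled in proportion to each paraphrase's weight. The Marian checkpoint name, the toy Hungarian data, and the weight-proportional duplication rule are illustrative assumptions, not the paper's exact setup.

```python
import random
import torch
from transformers import MarianMTModel, MarianTokenizer

# Assumption: an off-the-shelf Marian en->hu checkpoint stands in for the
# paper's pre-trained translation model.
model_name = "Helsinki-NLP/opus-mt-en-hu"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Hypothetical weighted-paraphrase record: one English prompt with several
# accepted Hungarian translations, each carrying a popularity weight.
data = [
    ("I like apples.",
     [("Szeretem az almát.", 0.9), ("Kedvelem az almát.", 0.4)]),
]

def oversample(records, scale=5):
    """Duplicate (source, target) pairs roughly in proportion to the
    target's weight, so higher-weighted paraphrases are seen more often
    during fine-tuning. The duplication rule is an illustrative guess."""
    pairs = []
    for src, targets in records:
        for tgt, weight in targets:
            pairs.extend([(src, tgt)] * max(1, round(weight * scale)))
    random.shuffle(pairs)
    return pairs

train_pairs = oversample(data)

# One fine-tuning step on the oversampled pairs.
batch = tokenizer(
    [src for src, _ in train_pairs],
    text_target=[tgt for _, tgt in train_pairs],
    return_tensors="pt",
    padding=True,
)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
loss = model(**batch).loss  # cross-entropy against the target tokens
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.3f}")
```

Under this reading, duplicating high-weight targets biases the fine-tuned model toward the paraphrases that carry most of the weighted macro F1 score, which would explain why oversampling helps on the reported metric.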