Paper Title


Robust Dialogue Utterance Rewriting as Sequence Tagging

Authors

Jie Hao, Linfeng Song, Liwei Wang, Kun Xu, Zhaopeng Tu, Dong Yu

Abstract


The task of dialogue rewriting aims to reconstruct the latest dialogue utterance by copying the missing content from the dialogue context. To date, existing models for this task have suffered from a robustness issue: performance drops dramatically when testing on a different domain. We address this robustness issue by proposing a novel sequence-tagging-based model, so that the search space is significantly reduced while the core of the task is still well covered. As is common for tagging models applied to text generation, the model's outputs may lack fluency. To alleviate this issue, we inject a loss signal from BLEU or GPT-2 under a REINFORCE framework. Experiments show large improvements of our model over current state-of-the-art systems on domain transfer.
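To illustrate the tagging view of rewriting described above, here is a minimal sketch of how per-token tags over the current utterance, each pointing at a span of the dialogue context to copy, can reconstruct the complete utterance. The tag scheme below (KEEP, DELETE, and a substitute-with-context-span action) is an illustrative assumption, not the paper's exact formulation; under the paper's REINFORCE setup, sampled tag sequences would additionally be scored by BLEU or a GPT-2 language model to encourage fluent outputs.

```python
def apply_tags(utterance, context, tags):
    """Rebuild the rewritten utterance from one tag per utterance token.

    Each tag is either:
      - "KEEP":            emit the token unchanged
      - "DELETE":          drop the token
      - ("SUB", s, e):     replace the token with context[s:e]
    This is a hypothetical scheme for illustration only.
    """
    out = []
    for token, tag in zip(utterance, tags):
        if tag == "KEEP":
            out.append(token)
        elif tag == "DELETE":
            continue
        elif isinstance(tag, tuple) and tag[0] == "SUB":
            _, start, end = tag
            out.extend(context[start:end])  # copy the referenced span
    return out

# Resolve the pronoun "it" by copying "the movie Titanic" from context.
context = "do you like the movie Titanic".split()
utterance = "why do you like it".split()
tags = ["KEEP", "KEEP", "KEEP", "KEEP", ("SUB", 3, 6)]
rewritten = apply_tags(utterance, context, tags)
# rewritten == ['why', 'do', 'you', 'like', 'the', 'movie', 'Titanic']
```

Because the model only predicts a small set of tags and context spans rather than free-form text, the search space is far smaller than that of a full sequence-to-sequence generator, which is the intuition behind the robustness gains claimed in the abstract.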
