Paper Title

Towards Unsupervised Language Understanding and Generation by Joint Dual Learning

Authors

Shang-Yu Su, Chao-Wei Huang, Yun-Nung Chen

Abstract

In modular dialogue systems, natural language understanding (NLU) and natural language generation (NLG) are two critical components: NLU extracts semantics from given texts, while NLG constructs natural language sentences from input semantic representations. However, the dual property between understanding and generation has rarely been explored. Prior work made the first attempt to exploit the duality between NLU and NLG, improving performance via a dual supervised learning framework; however, it still trained both components in a fully supervised manner. In contrast, this paper introduces a general learning framework that effectively exploits this duality, providing the flexibility to incorporate both supervised and unsupervised learning algorithms when jointly training language understanding and generation models. Benchmark experiments demonstrate that the proposed approach boosts the performance of both NLU and NLG.
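The duality the abstract refers to can be made concrete with a toy example. The following sketch (hypothetical illustration, not the paper's actual models, which are learned neural networks) uses a rule-based NLG that verbalizes a semantic frame and a rule-based NLU that parses it back; the cycle-consistency property `nlu(nlg(frame)) == frame` is the kind of signal a joint dual learning framework can exploit without labeled pairs:

```python
# Toy illustration of NLU/NLG duality (hypothetical sketch, not the paper's code).
# NLG: semantic frame -> text; NLU: text -> semantic frame.
# Dual property usable as an unsupervised training signal:
#   nlu(nlg(frame)) == frame

def nlg(frame: dict) -> str:
    """Generate a sentence from a semantic frame (slot -> value)."""
    slots = ", ".join(f"{k} is {v}" for k, v in sorted(frame.items()))
    return f"The {slots}."

def nlu(text: str) -> dict:
    """Parse a sentence produced by nlg() back into a semantic frame."""
    body = text.removeprefix("The ").rstrip(".")
    frame = {}
    for part in body.split(", "):
        slot, _, value = part.partition(" is ")
        frame[slot] = value
    return frame

frame = {"food": "italian", "area": "north"}
text = nlg(frame)
# Cycle consistency: understanding inverts generation on this toy domain.
assert nlu(text) == frame
```

In the paper's setting both mappings are trainable models, and the reconstruction error of such a cycle (in both directions) provides a loss that can be optimized jointly with, or instead of, supervised objectives.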
