Paper Title

A Co-Interactive Transformer for Joint Slot Filling and Intent Detection

Paper Authors

Libo Qin, Tailu Liu, Wanxiang Che, Bingbing Kang, Sendong Zhao, Ting Liu

Paper Abstract

Intent detection and slot filling are two main tasks in building a spoken language understanding (SLU) system. The two tasks are closely related, and the information from one task can be utilized in the other. Previous studies either model the two tasks separately or only consider the single information flow from intent to slot. None of the prior approaches model the bidirectional connection between the two tasks simultaneously. In this paper, we propose a Co-Interactive Transformer to consider the cross-impact between the two tasks. Instead of adopting the self-attention mechanism of the vanilla Transformer, we propose a co-interactive module that builds a bidirectional connection between the two related tasks. In addition, the proposed co-interactive module can be stacked to incrementally enhance each task's representation with mutual features. Experimental results on two public datasets (SNIPS and ATIS) show that our model achieves state-of-the-art performance with considerable improvements (+3.4% and +0.9% in overall accuracy). Extensive experiments empirically verify that our model successfully captures the mutual interaction knowledge.
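
To make the abstract's core idea concrete, below is a minimal PyTorch sketch of a co-interactive layer: the slot stream and the intent stream cross-attend to each other, replacing the vanilla Transformer's self-attention with a bidirectional information flow, and the layer can be stacked. This is an illustrative reconstruction from the abstract, not the authors' released implementation; all module names, dimensions, and the use of nn.MultiheadAttention are assumptions.

```python
# Minimal sketch (assumed design, not the authors' code) of a co-interactive
# layer with bidirectional slot<->intent cross-attention.
import torch
import torch.nn as nn

class CoInteractiveLayer(nn.Module):
    def __init__(self, d_model: int = 128, n_heads: int = 4):
        super().__init__()
        # Slot queries attend over intent representations, and vice versa,
        # giving the bidirectional connection described in the abstract.
        self.slot_attends_intent = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.intent_attends_slot = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.slot_ffn = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, d_model))
        self.intent_ffn = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, d_model))
        self.norm_slot = nn.LayerNorm(d_model)
        self.norm_intent = nn.LayerNorm(d_model)

    def forward(self, slot_h: torch.Tensor, intent_h: torch.Tensor):
        # slot_h, intent_h: (batch, seq_len, d_model) token-level features
        # from a shared encoder (hypothetical here).
        s, _ = self.slot_attends_intent(slot_h, intent_h, intent_h)
        i, _ = self.intent_attends_slot(intent_h, slot_h, slot_h)
        # Residual update so each stream is enhanced with mutual features.
        slot_h = self.norm_slot(slot_h + self.slot_ffn(s))
        intent_h = self.norm_intent(intent_h + self.intent_ffn(i))
        return slot_h, intent_h

# Stacking the layer mirrors the abstract's claim that stacked co-interactive
# modules incrementally enhance the two tasks with mutual features.
layers = nn.ModuleList([CoInteractiveLayer() for _ in range(2)])
slot_h = intent_h = torch.randn(8, 20, 128)  # toy shared-encoder output
for layer in layers:
    slot_h, intent_h = layer(slot_h, intent_h)
print(slot_h.shape, intent_h.shape)  # torch.Size([8, 20, 128]) twice
```

In a full model, the updated slot stream would feed a per-token slot tagger and the intent stream a sentence-level intent classifier; those heads are omitted here for brevity.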
