Paper Title

A Compare Aggregate Transformer for Understanding Document-grounded Dialogue

Paper Authors

Longxuan Ma, Weinan Zhang, Runxin Sun, Ting Liu

Abstract

Unstructured documents serving as external knowledge of the dialogues help to generate more informative responses. Previous research focused on knowledge selection (KS) in the document with dialogue. However, dialogue history that is not related to the current dialogue may introduce noise into the KS process. In this paper, we propose a Compare Aggregate Transformer (CAT) to jointly denoise the dialogue context and aggregate the document information for response generation. We design two different comparison mechanisms to reduce noise (before and during decoding). In addition, we propose two metrics for evaluating document utilization efficiency based on word overlap. Experimental results on the CMUDoG dataset show that the proposed CAT model outperforms the state-of-the-art approach and strong baselines.
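The abstract mentions two word-overlap-based metrics for document utilization efficiency but does not define them. As a rough illustration only, the minimal Python sketch below computes one plausible overlap ratio between a generated response and the grounding document; the function name `document_utilization` and the exact formula are assumptions for illustration, not the paper's actual metrics.

```python
# Hypothetical sketch of a word-overlap style document-utilization measure.
# The paper's two metrics are not defined in this abstract, so this is only
# an illustrative example, not the authors' method.

import re
from typing import Set


def _tokenize(text: str) -> Set[str]:
    """Lowercase the text and split it into a set of word tokens."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))


def document_utilization(response: str, document: str,
                         stopwords: Set[str] = frozenset()) -> float:
    """Fraction of (non-stopword) tokens in the generated response that
    also appear in the grounding document: a simple word-overlap ratio."""
    response_words = _tokenize(response) - stopwords
    if not response_words:
        return 0.0
    document_words = _tokenize(document) - stopwords
    return len(response_words & document_words) / len(response_words)


if __name__ == "__main__":
    doc = "The movie stars Tom Hanks as a FedEx employee stranded on an island."
    resp = "I love that Tom Hanks plays a FedEx employee stranded on an island."
    print(f"Utilization: {document_utilization(resp, doc):.2f}")
```

A higher ratio suggests the response reuses more words from the grounding document; in practice such a measure would typically exclude stopwords and could also be computed at the n-gram level.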
