Paper Title

Robust Conversational AI with Grounded Text Generation

Authors

Jianfeng Gao, Baolin Peng, Chunyuan Li, Jinchao Li, Shahin Shayandeh, Lars Liden, Heung-Yeung Shum

Abstract

This article presents a hybrid approach based on a Grounded Text Generation (GTG) model to building robust task bots at scale. GTG is a hybrid model that uses a large-scale Transformer neural network as its backbone, combined with symbol-manipulation modules for knowledge base inference and prior knowledge encoding, to generate responses grounded in dialog belief state and real-world knowledge for task completion. GTG is pre-trained on large amounts of raw text and human conversational data, and can be fine-tuned to complete a wide range of tasks. The hybrid approach and its variants are being developed simultaneously by multiple research teams. The primary results reported on task-oriented dialog benchmarks are very promising, demonstrating the great potential of this approach. This article provides an overview of this progress and discusses related methods and technologies that can be incorporated for building robust conversational AI systems.

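To make the architecture described in the abstract concrete, the following is a minimal, hypothetical Python sketch of one GTG-style turn: a neural belief tracker, a symbolic knowledge-base lookup, and a response decoder grounded in both. The function names, the callable interfaces, and the toy stand-ins are illustrative assumptions, not the authors' released code or API; in GTG itself, both neural steps are decoded by a single pre-trained and fine-tuned Transformer.

```python
# Hypothetical sketch of a GTG-style pipeline (illustrative only, not the
# paper's implementation): a neural step decodes a dialog belief state,
# a symbolic step queries a task database with it, and a second neural
# step decodes a response grounded in both.

from typing import Callable, Dict, List

Row = Dict[str, str]


def query_knowledge_base(belief_state: Dict[str, str], kb: List[Row]) -> List[Row]:
    """Symbolic module: exact-match lookup of DB rows consistent with the belief state."""
    return [row for row in kb
            if all(row.get(slot) == value for slot, value in belief_state.items())]


def gtg_turn(history: List[str],
             kb: List[Row],
             belief_tracker: Callable[[List[str]], Dict[str, str]],
             response_decoder: Callable[[List[str], Dict[str, str], List[Row]], str]) -> str:
    """One task-completion turn: belief tracking -> KB inference -> grounded response.

    belief_tracker and response_decoder stand in for the Transformer backbone;
    in GTG both are decoded from a single pre-trained, fine-tuned model.
    """
    belief_state = belief_tracker(history)               # neural: history -> slot-value pairs
    db_results = query_knowledge_base(belief_state, kb)  # symbolic: ground in real-world data
    return response_decoder(history, belief_state, db_results)  # neural: grounded response


# Toy stand-ins so the sketch runs end to end; a real system would replace
# them with decoding passes of a fine-tuned Transformer.
def toy_belief_tracker(history: List[str]) -> Dict[str, str]:
    return {"food": "thai"} if "thai" in history[-1].lower() else {}


def toy_response_decoder(history: List[str],
                         belief_state: Dict[str, str],
                         rows: List[Row]) -> str:
    if rows:
        return f"I found {len(rows)} matching place(s); {rows[0]['name']} serves {belief_state.get('food', 'food')}."
    return "Sorry, I could not find a matching restaurant."


if __name__ == "__main__":
    kb = [{"name": "Bangkok City", "food": "thai"},
          {"name": "Nandos", "food": "portuguese"}]
    print(gtg_turn(["I'd like some Thai food please."], kb,
                   toy_belief_tracker, toy_response_decoder))
```

The design intent mirrored here is that database access stays exact and auditable in the symbolic module, while the neural model handles language understanding and generation conditioned on the retrieved facts.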