Paper Title

Stronger Transformers for Neural Multi-Hop Question Generation

Authors

Sachan, Devendra Singh; Wu, Lingfei; Sachan, Mrinmaya; Hamilton, William

Abstract

Prior work on automated question generation has almost exclusively focused on generating simple questions whose answers can be extracted from a single document. However, there is increasing interest in developing systems capable of more complex multi-hop question generation, where answering the questions requires reasoning over multiple documents. In this work, we introduce a series of strong transformer models for multi-hop question generation, including a graph-augmented transformer that leverages relations between entities in the text. While prior work has emphasized the importance of graph-based models, we show that we can substantially outperform the state-of-the-art by 5 BLEU points using a standard transformer architecture. We further demonstrate that graph-based augmentations can provide complementary improvements on top of this foundation. Interestingly, we find that several important factors, such as the inclusion of an auxiliary contrastive objective and data filtering, can have larger impacts on performance. We hope that our stronger baselines and analysis provide a constructive foundation for future work in this area.
