Paper Title


Recurrent Inference in Text Editing

Authors

Ning Shi, Ziheng Zeng, Haotian Zhang, Yichen Gong

Abstract


In neural text editing, prevalent sequence-to-sequence based approaches directly map the unedited text either to the edited text or to the editing operations, where performance is degraded by the limited source text encoding and the long, varying decoding steps. To address this problem, we propose a new inference method, Recurrence, that iteratively performs editing actions, significantly narrowing the problem space. In each iteration, Recurrence encodes the partially edited text, decodes the latent representation, generates a short, fixed-length action, and applies the action to complete a single edit. For a comprehensive comparison, we introduce three types of text editing tasks: Arithmetic Operators Restoration (AOR), Arithmetic Equation Simplification (AES), and Arithmetic Equation Correction (AEC). Extensive experiments on these tasks with varying difficulties demonstrate that Recurrence achieves improvements over conventional inference methods.
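The iterative loop the abstract describes (encode the partially edited text, produce one short fixed-length action, apply it, repeat) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the learned model is replaced here by a hypothetical rule-based `propose_action` that simplifies arithmetic expressions AES-style, and the `(start, end, replacement)` action format is an assumption chosen to keep actions short and fixed-length.

```python
import re

def propose_action(text):
    """Stand-in for the learned model: given the partially edited text,
    return one short, fixed-length edit action (start, end, replacement),
    or None when no further edit is needed.
    This toy rule replaces the leftmost 'a+b' with its sum."""
    m = re.search(r"(\d+)\+(\d+)", text)
    if m is None:
        return None
    return (m.start(), m.end(), str(int(m.group(1)) + int(m.group(2))))

def recurrent_edit(text, max_iters=50):
    """Recurrence-style inference: in each iteration, re-encode the
    partially edited text, generate a single action, and apply it.
    Stop at a fixed point (no action) or after max_iters iterations."""
    for _ in range(max_iters):
        action = propose_action(text)
        if action is None:
            break
        start, end, replacement = action
        text = text[:start] + replacement + text[end:]
    return text

print(recurrent_edit("1+2+3+4"))  # prints "10" after three single edits
```

Because each iteration emits only one small action over the current text, the decoder never has to plan a long, variable-length sequence of operations in one pass, which is the problem-space reduction the abstract refers to.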
