Paper Title

Variational Inference-Based Dropout in Recurrent Neural Networks for Slot Filling in Spoken Language Understanding

Paper Authors

Jun Qi, Xu Liu, Javier Tejedor

Paper Abstract

This paper proposes to generalize the variational recurrent neural network (RNN) with variational inference (VI)-based dropout regularization, originally developed for long short-term memory (LSTM) cells, to more advanced RNN architectures such as the gated recurrent unit (GRU) and bi-directional LSTM/GRU. The new variational RNNs are applied to slot filling, an intriguing but challenging task in spoken language understanding. Experiments on the ATIS dataset show that the variational RNNs with VI-based dropout regularization significantly improve, in terms of F-measure, over baseline systems that use naive dropout regularization. In particular, the variational RNN with bi-directional LSTM/GRU obtains the best F-measure score.
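To illustrate the core idea behind VI-based (variational) dropout, the sketch below shows the key difference from naive dropout: one dropout mask is sampled per sequence and reused at every time step, rather than resampled at each step. This is a minimal PyTorch sketch, not the authors' implementation; the class names, layer sizes, and the `BiLSTMSlotTagger` model are hypothetical, and sharing the mask on the recurrent connections themselves (as in the full variational formulation) would require a custom LSTM cell, which is omitted here for brevity.

```python
# Minimal sketch of variational (VI-based) dropout for sequence models.
# Hypothetical names and sizes; not the paper's exact implementation.
import torch
import torch.nn as nn


class VariationalDropout(nn.Module):
    """Dropout whose mask is fixed across the time dimension.

    Naive dropout resamples the mask at every time step; here one mask
    is sampled per sequence and broadcast over all time steps.
    """

    def __init__(self, p: float = 0.5):
        super().__init__()
        self.p = p

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features)
        if not self.training or self.p == 0.0:
            return x
        # One mask per sequence: shape (batch, 1, features), broadcast over time.
        mask = x.new_empty(x.size(0), 1, x.size(2)).bernoulli_(1 - self.p)
        return x * mask / (1 - self.p)  # inverted-dropout scaling


class BiLSTMSlotTagger(nn.Module):
    """Bi-directional LSTM slot tagger with variational dropout applied
    to the input and output of the recurrent layer (hypothetical model)."""

    def __init__(self, vocab_size: int, num_slots: int,
                 embed_dim: int = 100, hidden_dim: int = 200, p: float = 0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.in_drop = VariationalDropout(p)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.out_drop = VariationalDropout(p)
        self.classifier = nn.Linear(2 * hidden_dim, num_slots)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        x = self.in_drop(self.embed(tokens))   # (batch, time, embed_dim)
        h, _ = self.lstm(x)                    # (batch, time, 2 * hidden_dim)
        return self.classifier(self.out_drop(h))  # per-token slot logits


# Example: tag a batch of 4 utterances of length 12 (hypothetical sizes).
model = BiLSTMSlotTagger(vocab_size=1000, num_slots=120)
logits = model(torch.randint(0, 1000, (4, 12)))  # shape: (4, 12, 120)
```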
