Paper Title

Learning Various Length Dependence by Dual Recurrent Neural Networks

Authors

Zhang, Chenpeng, Li, Shuai, Ye, Mao, Zhu, Ce, Li, Xue

Abstract

Recurrent neural networks (RNNs) are widely used as memory models for sequence-related problems. Many variants of the RNN have been proposed to address the gradient problems of training RNNs and to process long sequences. Although some classical models have been proposed, capturing long-term dependence while responding to short-term changes remains a challenge. To address this problem, we propose a new model named Dual Recurrent Neural Networks (DuRNN). The DuRNN consists of two parts that learn the short-term dependence and progressively learn the long-term dependence. The first part is a recurrent neural network with constrained full recurrent connections, which handles short-term dependence in the sequence and generates short-term memory. The other part is a recurrent neural network with independent recurrent connections, which helps to learn long-term dependence and generates long-term memory. A selection mechanism is added between the two parts to help transfer the needed long-term information to the independent neurons. Multiple modules can be stacked to form a multi-layer model for better performance. Our contributions are: 1) a new recurrent model developed based on the divide-and-conquer strategy to learn long-term and short-term dependence separately, and 2) a selection mechanism to enhance the separation and learning of dependence at different temporal scales. Both theoretical analysis and extensive experiments are conducted to validate the performance of our model, and we also conduct simple visualization experiments and ablation analyses for model interpretability. Experimental results indicate that the proposed DuRNN model can handle not only very long sequences (over 5000 time steps) but also short sequences very well. Compared with many state-of-the-art RNN models, our model demonstrates higher efficiency and better performance.
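The architecture described above can be sketched as a single DuRNN module: a fully recurrent layer produces short-term memory, a sigmoid gate selects which of its features pass onward, and a layer with independent (per-neuron) recurrent connections accumulates long-term memory. This is a minimal numpy illustration of that structure as the abstract describes it; all function and parameter names (`durnn_module_forward`, `W_sel`, the gating form, and the weight initializations) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def durnn_module_forward(x_seq, hidden=8, seed=0):
    """Hypothetical forward pass of one DuRNN module.

    Part 1: full recurrent connections  -> short-term memory h_short.
    Gate:   sigmoid selection deciding which short-term features
            are passed to the independent neurons.
    Part 2: independent recurrent connections (one recurrent weight
            per neuron) -> long-term memory h_long.
    """
    rng = np.random.default_rng(seed)
    d_in = x_seq.shape[1]

    # Part 1: fully connected recurrence (short-term dependence).
    W_in = rng.normal(0.0, 0.1, (hidden, d_in))
    W_rec = rng.normal(0.0, 0.1, (hidden, hidden))

    # Selection mechanism parameters (assumed sigmoid gate).
    W_sel = rng.normal(0.0, 0.1, (hidden, hidden))

    # Part 2: independent recurrence, one scalar weight per neuron.
    u = rng.uniform(0.9, 1.0, hidden)

    h_short = np.zeros(hidden)
    h_long = np.zeros(hidden)
    for x_t in x_seq:
        # Short-term memory via full recurrent connections.
        h_short = np.tanh(W_in @ x_t + W_rec @ h_short)
        # Selection gate on the short-term features.
        gate = 1.0 / (1.0 + np.exp(-(W_sel @ h_short)))
        # Long-term memory via element-wise (independent) recurrence.
        h_long = np.tanh(u * h_long + gate * h_short)
    return h_short, h_long
```

Multiple such modules could then be stacked, each taking the previous module's outputs as its input sequence, to form the multi-layer model mentioned in the abstract.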
