Paper Title


Bayesian Recurrent Units and the Forward-Backward Algorithm

Authors

Alexandre Bittar, Philip N. Garner

Abstract


Using Bayes's theorem, we derive a unit-wise recurrence as well as a backward recursion similar to the forward-backward algorithm. The resulting Bayesian recurrent units can be integrated as recurrent neural networks within deep learning frameworks, while retaining a probabilistic interpretation from the direct correspondence with hidden Markov models. Whilst the contribution is mainly theoretical, experiments on speech recognition indicate that adding the derived units at the end of state-of-the-art recurrent architectures can improve the performance at a very low cost in terms of trainable parameters.
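For context, the forward-backward algorithm that the derivation parallels is the classical smoothing recursion for hidden Markov models. The sketch below implements that standard algorithm on a toy discrete HMM; it is illustrative background only, not the paper's Bayesian recurrent unit, and the toy transition/emission values are invented for the example.

```python
# Classical forward-backward algorithm for a discrete HMM.
# This is the textbook recursion the abstract refers to, not the
# paper's derived unit; the toy parameters below are illustrative.

def forward_backward(obs, A, B, pi):
    """Return per-timestep posteriors gamma[t][i] = P(state_t = i | obs)."""
    T, N = len(obs), len(pi)
    # Forward pass: alpha[t][i] = P(obs[0..t], state_t = i)
    alpha = [[0.0] * N for _ in range(T)]
    for i in range(N):
        alpha[0][i] = pi[i] * B[i][obs[0]]
    for t in range(1, T):
        for j in range(N):
            alpha[t][j] = B[j][obs[t]] * sum(
                alpha[t - 1][i] * A[i][j] for i in range(N))
    # Backward pass: beta[t][i] = P(obs[t+1..T-1] | state_t = i)
    beta = [[1.0] * N for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for i in range(N):
            beta[t][i] = sum(
                A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] for j in range(N))
    # Combine and normalize to get smoothed posteriors
    gamma = []
    for t in range(T):
        w = [alpha[t][i] * beta[t][i] for i in range(N)]
        z = sum(w)
        gamma.append([x / z for x in w])
    return gamma

# Toy 2-state HMM with binary observations (invented parameters)
A = [[0.7, 0.3], [0.4, 0.6]]    # transition probabilities
B = [[0.9, 0.1], [0.2, 0.8]]    # emission probabilities
pi = [0.5, 0.5]                 # initial state distribution
gamma = forward_backward([0, 0, 1, 0], A, B, pi)
```

The paper's contribution, per the abstract, is to recast such recursions as unit-wise recurrences usable inside deep learning frameworks while preserving this probabilistic interpretation.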
