Paper Title

Hidden Markov Neural Networks

Authors

Lorenzo Rimella, Nick Whiteley

Abstract

We define an evolving in-time Bayesian neural network called a Hidden Markov Neural Network, which addresses the crucial challenge in time-series forecasting and continual learning: striking a balance between adapting to new data and appropriately forgetting outdated information. This is achieved by modelling the weights of a neural network as the hidden states of a Hidden Markov model, with the observed process defined by the available data. A filtering algorithm is employed to learn a variational approximation of the evolving-in-time posterior distribution over the weights. By leveraging a sequential variant of Bayes by Backprop, enriched with a stronger regularization technique called variational DropConnect, Hidden Markov Neural Networks achieve robust regularization and scalable inference. Experiments on MNIST, dynamic classification tasks, and next-frame forecasting in videos demonstrate that Hidden Markov Neural Networks provide strong predictive performance while enabling effective uncertainty quantification.
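The abstract describes two ingredients: a Gaussian variational posterior over each weight trained by a sequential variant of Bayes by Backprop (with a DropConnect-style multiplicative mask), and a filtering step in which the previous posterior, pushed through the hidden Markov transition, becomes the prior for the next time step. The following is a minimal NumPy sketch of these two ingredients under simplifying assumptions of ours (a mean-field Gaussian posterior, a Bernoulli keep-mask, and an AR(1)-style Gaussian transition with illustrative parameters `alpha` and `tau`); it is not the authors' implementation.

```python
import numpy as np

def sample_weight(mu, rho, p, rng):
    """Reparameterized draw from a Gaussian variational posterior,
    scaled by a DropConnect-style Bernoulli mask (illustrative)."""
    sigma = np.log1p(np.exp(rho))      # softplus keeps sigma > 0
    eps = rng.standard_normal(mu.shape)
    mask = rng.random(mu.shape) < p    # keep each weight w.p. p
    return mask * (mu + sigma * eps)

def transition_prior(mu_post, sigma_post, alpha=0.9, tau=0.1):
    """One filtering step: push the time-(t-1) posterior through a
    simple Gaussian transition w_t = alpha * w_{t-1} + noise, giving
    the prior over weights at time t (alpha, tau are assumptions)."""
    mu_prior = alpha * mu_post
    sigma_prior = np.sqrt(alpha**2 * sigma_post**2 + tau**2)
    return mu_prior, sigma_prior

rng = np.random.default_rng(0)
w = sample_weight(np.zeros((2, 3)), np.zeros((2, 3)), p=0.8, rng=rng)
mu_t, sigma_t = transition_prior(np.ones(3), np.ones(3))
```

The `alpha < 1` shrinkage in the transition is what realizes the "appropriately forgetting outdated information" trade-off mentioned above: older evidence is discounted while fresh data re-sharpens the posterior.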
