Paper Title

Stochastic Bayesian Neural Networks

Authors

Sagar, Abhinav

Abstract

Bayesian neural networks perform variational inference over the weights; however, computing the posterior distribution remains a challenge. Our work builds on variational inference techniques for Bayesian neural networks using the original Evidence Lower Bound. In this paper, we present a stochastic Bayesian neural network in which we maximize the Evidence Lower Bound using a new objective function, which we call the Stochastic Evidence Lower Bound. We evaluate our network on five publicly available UCI datasets, using test RMSE and log likelihood as the evaluation metrics. We demonstrate that our work not only beats the previous state-of-the-art algorithms but also scales to larger datasets.
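The abstract does not define the paper's Stochastic Evidence Lower Bound, but the original ELBO it builds on is standard: the expected log likelihood under the variational posterior minus the KL divergence to the prior. Below is a minimal sketch of a Monte Carlo ELBO estimate for a single Bayesian regression weight, assuming a Gaussian variational posterior `q(w) = N(mu, sigma^2)`, a unit Gaussian prior, and a fixed observation noise; all names and the toy data are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = 3x + noise (illustrative assumption).
X = rng.normal(size=(50, 1))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=50)
obs_std = 0.1  # assumed fixed observation-noise standard deviation

def elbo_estimate(mu, rho, n_samples=64):
    """Monte Carlo ELBO estimate for one Bayesian weight.

    q(w) = N(mu, sigma^2) with sigma = softplus(rho); prior p(w) = N(0, 1).
    """
    sigma = np.log1p(np.exp(rho))                # softplus keeps sigma > 0
    w = mu + sigma * rng.normal(size=n_samples)  # reparameterization trick
    resid = y[:, None] - X * w[None, :]          # shape (n_data, n_samples)
    # Gaussian log likelihood of the data under each sampled weight
    log_lik = (-0.5 * np.sum(resid ** 2, axis=0) / obs_std ** 2
               - len(y) * np.log(obs_std * np.sqrt(2.0 * np.pi)))
    # Closed-form KL( N(mu, sigma^2) || N(0, 1) )
    kl = np.log(1.0 / sigma) + (sigma ** 2 + mu ** 2) / 2.0 - 0.5
    return log_lik.mean() - kl
```

Maximizing this estimate with respect to `mu` and `rho` is the essence of ELBO-based variational training; here, the ELBO evaluated near the data-generating weight (`mu = 3`) is far higher than at `mu = 0`.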
