Paper Title

Extended Stochastic Gradient MCMC for Large-Scale Bayesian Variable Selection

Authors

Qifan Song, Yan Sun, Mao Ye, Faming Liang

Abstract

Stochastic gradient Markov chain Monte Carlo (MCMC) algorithms have received much attention in Bayesian computing for big data problems, but they are only applicable to a small class of problems for which the parameter space has a fixed dimension and the log-posterior density is differentiable with respect to the parameters. This paper proposes an extended stochastic gradient MCMC algorithm which, by introducing appropriate latent variables, can be applied to more general large-scale Bayesian computing problems, such as those involving dimension jumping and missing data. Numerical studies show that the proposed algorithm is highly scalable and much more efficient than traditional MCMC algorithms. The proposed algorithm greatly alleviates the difficulty of applying Bayesian methods to big data computing.
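To make the baseline concrete, the standard stochastic gradient MCMC update that the paper extends (stochastic gradient Langevin dynamics, SGLD) can be sketched as below. This is a minimal illustration on a synthetic Bayesian linear-regression problem, not the paper's extended algorithm: the data, prior, step size, and minibatch size are all illustrative assumptions. Each step moves the parameter along a minibatch estimate of the log-posterior gradient and injects Gaussian noise, which is exactly the setting (fixed dimension, differentiable log-posterior) that the abstract says plain SGMCMC is limited to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data (illustrative assumption, not from the paper).
N, d = 1000, 2
theta_true = np.array([1.5, -2.0])
X = rng.normal(size=(N, d))
y = X @ theta_true + 0.5 * rng.normal(size=N)

def grad_log_post(theta, xb, yb):
    """Minibatch estimate of the log-posterior gradient.
    Assumed model: y ~ N(X @ theta, 0.25 I), prior theta ~ N(0, 10 I)."""
    resid = yb - xb @ theta
    grad_lik = (N / len(yb)) * (xb.T @ resid) / 0.25  # rescale minibatch to full data
    grad_prior = -theta / 10.0
    return grad_lik + grad_prior

# SGLD: gradient step plus injected Gaussian noise of matching scale.
step, batch, iters = 1e-5, 100, 5000
theta = np.zeros(d)
samples = []
for t in range(iters):
    idx = rng.choice(N, batch, replace=False)
    theta = (theta
             + 0.5 * step * grad_log_post(theta, X[idx], y[idx])
             + np.sqrt(step) * rng.normal(size=d))
    if t >= iters // 2:          # discard the first half as burn-in
        samples.append(theta)

post_mean = np.mean(samples, axis=0)
print(post_mean)                  # should lie near theta_true
```

The dimension of `theta` is fixed and `grad_log_post` must exist everywhere, which is why vanilla SGLD cannot handle variable selection directly; the paper's extension augments the chain with latent variables so that dimension-jumping and missing-data problems fit this gradient-based framework.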
