Paper Title


Online, Informative MCMC Thinning with Kernelized Stein Discrepancy

Paper Authors

Cole Hawkins, Alec Koppel, Zheng Zhang

Paper Abstract


A fundamental challenge in Bayesian inference is efficient representation of a target distribution. Many non-parametric approaches do so by sampling a large number of points using variants of Markov Chain Monte Carlo (MCMC). We propose an MCMC variant that retains only those posterior samples which exceed a kernelized Stein discrepancy (KSD) threshold, which we call KSD Thinning. We establish the convergence and complexity tradeoffs for several settings of KSD Thinning as a function of the KSD threshold parameter, sample size, and other problem parameters. Finally, we provide experimental comparisons against other online nonparametric Bayesian methods that generate low-complexity posterior representations, and observe superior consistency/complexity tradeoffs. Code is available at github.com/colehawkins/KSD-Thinning.
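To illustrate the idea behind threshold-based thinning, the following is a minimal sketch (not the authors' implementation; see their repository for that). It runs random-walk Metropolis on a 1-D standard normal, computes the squared KSD of the retained set using the Langevin Stein kernel with an RBF base kernel, and keeps a new sample only when doing so does not raise the running KSD beyond a tolerance `eps`. Both `eps` and the bandwidth `h` are hypothetical parameters chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: standard normal, with score function s(x) = d/dx log p(x) = -x.
score = lambda x: -x

def stein_kernel(x, y, h=1.0):
    """Langevin Stein kernel k_p(x, y) for the 1-D target,
    built on an RBF base kernel with bandwidth h."""
    k = np.exp(-((x - y) ** 2) / (2 * h**2))
    dkdx = -(x - y) / h**2 * k          # d k / d x
    dkdy = (x - y) / h**2 * k           # d k / d y
    d2k = k * (1 / h**2 - (x - y) ** 2 / h**4)  # d^2 k / (dx dy)
    return score(x) * score(y) * k + score(x) * dkdy + score(y) * dkdx + d2k

def ksd2(pts):
    """Squared KSD (V-statistic) of the empirical measure on pts."""
    X = np.asarray(pts, dtype=float)
    return float(stein_kernel(X[:, None], X[None, :]).mean())

x, retained = 0.0, [0.0]
eps = 1e-3  # hypothetical thinning threshold
for _ in range(200):
    # Random-walk Metropolis step targeting N(0, 1).
    prop = x + rng.normal(scale=1.0)
    if np.log(rng.uniform()) < -(prop**2 - x**2) / 2:
        x = prop
    # Thinning rule (sketch): retain the current sample only if the
    # resulting set's squared KSD stays within eps of the current one.
    cand = retained + [x]
    if ksd2(cand) < ksd2(retained) + eps:
        retained = cand

print(len(retained), ksd2(retained))
```

The retained set is typically much smaller than the raw chain, which is the consistency/complexity tradeoff the abstract refers to: a larger `eps` discards more samples at the cost of a looser posterior approximation.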
