Paper Title
Hamiltonian Monte Carlo for efficient Gaussian sampling: long and random steps
Paper Authors
Paper Abstract
Hamiltonian Monte Carlo (HMC) is a Markov chain algorithm for sampling from a high-dimensional distribution with density $e^{-f(x)}$, given access to the gradient of $f$. A particular case of interest is that of a $d$-dimensional Gaussian distribution with covariance matrix $\Sigma$, in which case $f(x) = x^\top \Sigma^{-1} x$. We show that HMC can sample from a distribution that is $\varepsilon$-close to the target in total variation distance using $\widetilde{O}(\sqrt{\kappa}\, d^{1/4} \log(1/\varepsilon))$ gradient queries, where $\kappa$ is the condition number of $\Sigma$. Our algorithm uses long and random integration times for the Hamiltonian dynamics. This contrasts with (and was motivated by) recent results that give an $\widetilde{\Omega}(\kappa d^{1/2})$ query lower bound for HMC with fixed integration times, even for the Gaussian case.
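As a rough illustration of the kind of sampler the abstract describes (not the paper's exact algorithm or parameter choices), the sketch below runs standard leapfrog HMC on a Gaussian target proportional to $e^{-x^\top \Sigma^{-1} x / 2}$, drawing the integration time uniformly at random at each iteration. The function name, the step size, the mean integration time, and the uniform distribution over integration times are all assumptions made for this example.

```python
import numpy as np

def hmc_gaussian_sample(Sigma_inv, x0, n_iters, step_size, mean_time, rng):
    """Illustrative HMC sampler for a Gaussian target exp(-x^T Sigma_inv x / 2).

    Uses leapfrog integration with an integration time drawn uniformly at
    random each iteration, as a simple stand-in for "long and random
    integration times"; parameter choices are hypothetical.
    """
    d = x0.shape[0]
    f = lambda x: 0.5 * x @ Sigma_inv @ x      # potential
    grad_f = lambda x: Sigma_inv @ x           # its gradient (one "query" per call)

    x = x0.copy()
    samples = []
    for _ in range(n_iters):
        # Resample momentum from a standard Gaussian.
        p = rng.standard_normal(d)
        # Random integration time, converted to a number of leapfrog steps.
        T = rng.uniform(0.0, 2.0 * mean_time)
        n_steps = max(1, int(T / step_size))

        # Leapfrog integration of the Hamiltonian dynamics.
        x_new, p_new = x.copy(), p.copy()
        p_new -= 0.5 * step_size * grad_f(x_new)
        for _ in range(n_steps - 1):
            x_new += step_size * p_new
            p_new -= step_size * grad_f(x_new)
        x_new += step_size * p_new
        p_new -= 0.5 * step_size * grad_f(x_new)

        # Metropolis accept/reject step to correct the discretization error.
        dH = (f(x_new) + 0.5 * p_new @ p_new) - (f(x) + 0.5 * p @ p)
        if np.log(rng.uniform()) < -dH:
            x = x_new
        samples.append(x.copy())
    return np.array(samples)
```

For instance, calling `hmc_gaussian_sample(np.diag([1.0, 100.0]), np.zeros(2), 1000, 0.05, 2.0, np.random.default_rng(0))` targets an ill-conditioned two-dimensional Gaussian; the randomized integration time is the ingredient the paper highlights for avoiding the worst-case behavior of fixed integration times.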