Title
A sharp uniform-in-time error estimate for Stochastic Gradient Langevin Dynamics
Authors
Abstract
We establish a sharp uniform-in-time error estimate for Stochastic Gradient Langevin Dynamics (SGLD), a widely used sampling algorithm. Under mild assumptions, we obtain a uniform-in-time $O(η^2)$ bound for the KL-divergence between the SGLD iteration and the Langevin diffusion, where $η$ is the step size (or learning rate). Our analysis is also valid for varying step sizes. Consequently, we are able to derive an $O(η)$ bound for the distance between the invariant measures of the SGLD iteration and the Langevin diffusion, in terms of Wasserstein or total variation distances. Our result can be viewed as a significant improvement compared with existing analyses for SGLD in the related literature.
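For readers unfamiliar with the algorithm analyzed above, the SGLD update replaces the full gradient of the potential with a minibatch estimate and adds Gaussian noise scaled by $\sqrt{2η}$. The following is a minimal illustrative sketch (not the paper's construction); the toy Gaussian target, the `data` array, and the batch size are all assumptions chosen for demonstration:

```python
import numpy as np

def sgld(grad_estimate, theta0, eta, n_steps, rng):
    """SGLD iteration: theta <- theta - eta * (stochastic gradient) + sqrt(2*eta) * N(0, 1)."""
    theta = theta0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        xi = rng.standard_normal()
        theta = theta - eta * grad_estimate(theta, rng) + np.sqrt(2.0 * eta) * xi
        samples[k] = theta
    return samples

# Hypothetical toy setup: target potential U(theta) = (theta - mean(data))^2 / 2,
# i.e. a unit-variance Gaussian, with the gradient estimated from a minibatch.
data = np.linspace(-1.0, 1.0, 100)

def grad_estimate(theta, rng, batch=10):
    # Full gradient is theta - mean(data); subsample the data to get a noisy estimate.
    idx = rng.integers(0, len(data), size=batch)
    return theta - data[idx].mean()

rng = np.random.default_rng(0)
samples = sgld(grad_estimate, theta0=0.0, eta=0.01, n_steps=50_000, rng=rng)
```

For small $η$, the empirical mean and variance of `samples` approximate those of the target Gaussian, with a bias that the abstract's result controls at order $O(η)$ in Wasserstein or total variation distance.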