Paper Title

Grassmann Stein Variational Gradient Descent

Authors

Xing Liu, Harrison Zhu, Jean-François Ton, George Wynne, Andrew Duncan

Abstract

Stein variational gradient descent (SVGD) is a deterministic particle inference algorithm that provides an efficient alternative to Markov chain Monte Carlo. However, SVGD has been found to suffer from variance underestimation when the dimensionality of the target distribution is high. Recent developments have advocated projecting both the score function and the data onto real lines to sidestep this issue, although this can severely overestimate the epistemic (model) uncertainty. In this work, we propose Grassmann Stein variational gradient descent (GSVGD) as an alternative approach, which permits projections onto arbitrary dimensional subspaces. Compared with other variants of SVGD that rely on dimensionality reduction, GSVGD updates the projectors simultaneously for the score function and the data, and the optimal projectors are determined through a coupled Grassmann-valued diffusion process which explores favourable subspaces. Both our theoretical and experimental results suggest that GSVGD enjoys efficient state-space exploration in high-dimensional problems that have an intrinsic low-dimensional structure.
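For context, below is a minimal NumPy sketch of the vanilla SVGD update that GSVGD builds on: each particle moves along a kernel-smoothed score term plus a repulsion term. This is the baseline algorithm, not the paper's method; GSVGD additionally projects the score and data onto subspaces evolved by a Grassmann-valued diffusion, which is omitted here. The function names `rbf_kernel` and `svgd_step` and the median-heuristic bandwidth are illustrative choices, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, h):
    """RBF kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / h) and its gradient
    grad_K[i, j] = d/dx_i K(x_i, x_j), for particles X of shape (n, d)."""
    diff = X[:, None, :] - X[None, :, :]          # (n, n, d) pairwise differences
    sq = np.sum(diff ** 2, axis=-1)               # (n, n) squared distances
    K = np.exp(-sq / h)
    grad_K = -2.0 / h * diff * K[..., None]       # (n, n, d)
    return K, grad_K

def svgd_step(X, score, step_size=1e-2):
    """One vanilla SVGD update. score(X) returns the (n, d) matrix of
    target scores grad log p(x_i)."""
    n = X.shape[0]
    # Median heuristic bandwidth, a common default in SVGD implementations
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    h = np.median(sq) / np.log(n + 1) + 1e-12
    K, grad_K = rbf_kernel(X, h)
    # Stein variational direction: attraction toward high density + repulsion
    phi = (K @ score(X) + grad_K.sum(axis=0)) / n
    return X + step_size * phi

# Example: particles converging to a standard Gaussian, whose score is -x
X = np.random.randn(50, 2) * 3.0
for _ in range(500):
    X = svgd_step(X, lambda X: -X, step_size=0.1)
```

The variance underestimation discussed in the abstract arises because the repulsion term `grad_K.sum(axis=0)` weakens as the dimension grows; GSVGD counteracts this by performing the update within learned low-dimensional subspaces.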
