Paper Title
Projected Stein Variational Gradient Descent
Paper Authors
Paper Abstract
The curse of dimensionality is a longstanding challenge in Bayesian inference in high dimensions. In this work, we propose a projected Stein variational gradient descent (pSVGD) method to overcome this challenge by exploiting the fundamental property of intrinsic low dimensionality of the data-informed subspace stemming from the ill-posedness of such problems. We adaptively construct the subspace using a gradient information matrix of the log-likelihood, and apply pSVGD to the much lower-dimensional coefficients of the parameter projection. The method is demonstrated to be more accurate and efficient than SVGD. It is also shown to be more scalable with respect to the number of parameters, samples, data points, and processor cores via experiments with parameter dimensions ranging from hundreds to tens of thousands.
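Since the abstract describes the algorithm only at a high level, the following is a minimal NumPy sketch of the projection-plus-SVGD loop it outlines, not the paper's actual implementation. The details here are illustrative assumptions: the function names (`grad_log_likelihood`, `grad_log_post`), the RBF kernel with a median-heuristic bandwidth, the fixed step size, and the choice to leave the subspace-orthogonal components of each particle unchanged.

```python
import numpy as np

def gradient_information_matrix(samples, grad_log_likelihood):
    """Average outer product of log-likelihood gradients over the particles."""
    grads = np.stack([grad_log_likelihood(x) for x in samples])  # (N, d)
    return grads.T @ grads / len(samples)                        # (d, d)

def projection_basis(H, r):
    """Top-r eigenvectors of H span the (assumed) data-informed subspace."""
    eigvals, eigvecs = np.linalg.eigh(H)              # ascending eigenvalues
    return eigvecs[:, np.argsort(eigvals)[::-1][:r]]  # (d, r), dominant first

def rbf_kernel(W):
    """RBF kernel matrix and repulsion gradients, median-heuristic bandwidth."""
    diff = W[:, None, :] - W[None, :, :]              # (N, N, r)
    sq = np.sum(diff ** 2, axis=-1)
    h = np.median(sq) / np.log(len(W) + 1) + 1e-8
    K = np.exp(-sq / h)
    # grad_{w_j} k(w_j, w_i) = (2/h) (w_i - w_j) k(w_i, w_j)
    gradK = 2.0 / h * diff * K[:, :, None]
    return K, gradK

def psvgd_step(samples, Psi, grad_log_post, step=1e-2):
    """One SVGD update on the projected coefficients w = Psi^T x."""
    W = samples @ Psi                                          # (N, r)
    # Projected posterior gradient for each particle
    G = np.stack([Psi.T @ grad_log_post(x) for x in samples])  # (N, r)
    K, gradK = rbf_kernel(W)
    # Stein variational direction: kernel-weighted attraction plus repulsion
    phi = (K @ G + gradK.sum(axis=1)) / len(W)
    # Lift the coefficient update back to the full parameter space;
    # components orthogonal to the subspace are left unchanged here.
    return samples + step * phi @ Psi.T
```

In the adaptive scheme the abstract alludes to, one would recompute the gradient information matrix and the basis `Psi = projection_basis(H, r)` every few iterations as the particles move, so that the subspace tracks the data-informed directions; this sketch leaves that outer loop to the reader.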