Paper Title

How Good are Low-Rank Approximations in Gaussian Process Regression?

Authors

Constantinos Daskalakis, Petros Dellaportas, Aristeidis Panos

Abstract

We provide guarantees for approximate Gaussian Process (GP) regression resulting from two common low-rank kernel approximations: based on random Fourier features, and based on truncating the kernel's Mercer expansion. In particular, we bound the Kullback-Leibler divergence between an exact GP and one resulting from one of the afore-described low-rank approximations to its kernel, as well as between their corresponding predictive densities, and we also bound the error between predictive mean vectors and between predictive covariance matrices computed using the exact versus using the approximate GP. We provide experiments on both simulated data and standard benchmarks to evaluate the effectiveness of our theoretical bounds.
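To make the first of the two approximations concrete, the sketch below builds a random-Fourier-feature (RFF) approximation to an RBF kernel and compares the resulting GP predictive mean against the exact one on toy data. This is a minimal illustration of the kind of kernel and predictive-mean discrepancies the paper bounds, not the paper's own experiments; the data, lengthscale, and feature count are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (assumption: not from the paper's benchmarks).
n, d, lengthscale, noise = 50, 1, 1.0, 0.1
X = rng.uniform(-3, 3, size=(n, d))
y = np.sin(X).sum(axis=1) + noise * rng.standard_normal(n)

def rbf_kernel(A, B, ell):
    """Exact RBF kernel k(x, y) = exp(-||x - y||^2 / (2 ell^2))."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * ell**2))

def rff_features(A, W, b, D):
    """Random Fourier features: phi(x) = sqrt(2/D) cos(W x + b),
    so that E[phi(x) . phi(y)] converges to k(x, y) as D grows."""
    return np.sqrt(2.0 / D) * np.cos(A @ W.T + b)

D = 500  # number of random features = rank of the kernel approximation
W = rng.standard_normal((D, d)) / lengthscale  # spectral samples for RBF
b = rng.uniform(0, 2 * np.pi, size=D)

K_exact = rbf_kernel(X, X, lengthscale)
Phi = rff_features(X, W, b, D)
K_approx = Phi @ Phi.T  # low-rank (rank <= D) kernel matrix

# GP predictive mean at the training inputs under each kernel:
# mu = K (K + sigma^2 I)^{-1} y
alpha_exact = np.linalg.solve(K_exact + noise**2 * np.eye(n), y)
alpha_approx = np.linalg.solve(K_approx + noise**2 * np.eye(n), y)
mu_exact = K_exact @ alpha_exact
mu_approx = K_approx @ alpha_approx

kernel_err = np.abs(K_exact - K_approx).max()
mean_err = np.abs(mu_exact - mu_approx).max()
print(f"max entrywise kernel error:          {kernel_err:.4f}")
print(f"max predictive-mean discrepancy:     {mean_err:.4f}")
```

Increasing `D` shrinks both discrepancies (the RFF estimator's error decays roughly like `1/sqrt(D)`), which is the qualitative behavior the paper's bounds quantify.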
