Paper Title

On Average-Case Error Bounds for Kernel-Based Bayesian Quadrature

Authors

Xu Cai, Chi Thanh Lam, Jonathan Scarlett

Abstract

In this paper, we study error bounds for {\em Bayesian quadrature} (BQ), with an emphasis on noisy settings, randomized algorithms, and average-case performance measures. We seek to approximate the integral of functions in a {\em Reproducing Kernel Hilbert Space} (RKHS), focusing in particular on the Matérn-$\nu$ and squared exponential (SE) kernels, with samples from the function potentially corrupted by Gaussian noise. We provide a two-step meta-algorithm that serves as a general tool for relating the average-case quadrature error to the $L^2$ function approximation error. When specialized to the Matérn kernel, we recover an existing near-optimal error rate while avoiding the existing method of repeatedly sampling points. When specialized to other settings, we obtain new average-case results, including for the SE kernel with noise and for the Matérn kernel with misspecification. Finally, we present algorithm-independent lower bounds that have greater generality and/or give distinct proofs compared to existing ones.
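
To make the setting concrete, here is a minimal sketch of kernel-based Bayesian quadrature with the SE kernel and Gaussian observation noise. It is not the paper's two-step meta-algorithm; the uniform integration measure on [0, 1], the test integrand, the i.i.d. uniform sampling design, and the hyperparameters (lengthscale, noise variance) are illustrative assumptions, not taken from the paper.

```python
# Minimal Bayesian quadrature sketch: SE kernel, noisy samples, uniform measure on [0, 1].
import numpy as np
from scipy.special import erf

def se_kernel(x, y, ell=0.2):
    """SE kernel k(x, y) = exp(-(x - y)^2 / (2 ell^2)), unit amplitude."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * ell ** 2))

def se_kernel_mean(x, ell=0.2):
    """Closed-form kernel mean z_i = \int_0^1 k(t, x_i) dt for the SE kernel."""
    c = ell * np.sqrt(np.pi / 2)
    return c * (erf((1 - x) / (ell * np.sqrt(2))) + erf(x / (ell * np.sqrt(2))))

def bayesian_quadrature(x, y, ell=0.2, noise_var=1e-2):
    """Posterior mean/variance of \int_0^1 f(t) dt given noisy samples y = f(x) + noise."""
    K = se_kernel(x, x, ell) + noise_var * np.eye(len(x))
    z = se_kernel_mean(x, ell)
    mean = z @ np.linalg.solve(K, y)
    # Prior variance of the integral, \int\int k(t, t') dt dt', computed numerically here.
    t = np.linspace(0, 1, 400)
    prior_var = np.mean(se_kernel_mean(t, ell))
    var = prior_var - z @ np.linalg.solve(K, z)
    return mean, var

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda t: np.sin(2 * np.pi * t) + t ** 2       # hypothetical integrand, true integral = 1/3
    x = rng.uniform(0, 1, size=30)                      # simple i.i.d. sampling design
    y = f(x) + 0.1 * rng.standard_normal(x.shape)       # Gaussian-noise-corrupted samples
    mean, var = bayesian_quadrature(x, y, noise_var=0.01)
    print(f"BQ estimate: {mean:.4f} +/- {np.sqrt(max(var, 0)):.4f} (true value {1/3:.4f})")
```

The estimate is the GP posterior mean of the integral, i.e. a weighted sum of the noisy observations with weights z^T (K + sigma^2 I)^{-1}; the average-case error bounds studied in the paper control how such estimates behave over random draws of the function and the sampling points.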
