Paper Title
Sparse Gaussian Processes via Parametric Families of Compactly-supported Kernels
Paper Authors
Paper Abstract
Gaussian processes are powerful models for probabilistic machine learning, but are limited in application by their $O(N^3)$ inference complexity. We propose a method for deriving parametric families of kernel functions with compact spatial support, which yield naturally sparse kernel matrices and enable fast Gaussian process inference via sparse linear algebra. These families generalize known compactly-supported kernel functions, such as the Wendland polynomials. The parameters of this family of kernels can be learned from data using maximum likelihood estimation. Alternatively, we can quickly compute compact approximations of a target kernel using convex optimization. We demonstrate that these approximations incur minimal error relative to the exact models when modeling data drawn directly from a target GP, and can outperform traditional GP kernels on real-world signal reconstruction tasks, while exhibiting sub-quadratic inference complexity.
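The following is a minimal sketch, not the authors' implementation, of the core mechanism the abstract describes: a compactly-supported kernel (here the standard Wendland $C^2$ polynomial, which the proposed families generalize) makes every kernel entry vanish beyond a support radius, so the kernel matrix is sparse and the GP regression solve can use sparse linear algebra instead of a dense $O(N^3)$ factorization. The support radius `rho` and noise level `sigma_n` are illustrative choices, not values from the paper.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla


def wendland_c2(r, rho):
    """Wendland C^2 kernel in 1D: positive on [0, rho), exactly zero beyond."""
    t = np.clip(r / rho, 0.0, 1.0)
    return (1.0 - t) ** 4 * (4.0 * t + 1.0)


def sparse_kernel_matrix(x, rho):
    """Build the kernel matrix as a sparse matrix; entries with |x_i - x_j| >= rho are dropped."""
    # Brute-force pairwise distances for clarity; a neighbor search (e.g. a KD-tree)
    # would be used at scale to avoid the dense n x n distance matrix.
    d = np.abs(x[:, None] - x[None, :])
    rows, cols = np.nonzero(d < rho)
    vals = wendland_c2(d[rows, cols], rho)
    return sp.csc_matrix((vals, (rows, cols)), shape=(len(x), len(x)))


rng = np.random.default_rng(0)
n = 2000
x = np.sort(rng.uniform(0.0, 10.0, n))
y = np.sin(x) + 0.1 * rng.standard_normal(n)

rho, sigma_n = 0.5, 0.1  # illustrative hyperparameters
K_noisy = sparse_kernel_matrix(x, rho) + sigma_n ** 2 * sp.identity(n, format="csc")

# GP posterior mean at the training inputs: solve (K + sigma_n^2 I) alpha = y
# with a sparse LU factorization rather than a dense Cholesky.
alpha = spla.splu(K_noisy).solve(y)
posterior_mean = sparse_kernel_matrix(x, rho) @ alpha

print(f"nonzero fraction of K: {K_noisy.nnz / n**2:.3f}")
```

With these settings only a small fraction of the kernel matrix is nonzero, which is what allows the sparse solve to run in sub-quadratic time as the abstract claims; the paper's contribution is learning or fitting the compactly-supported kernel itself, which this sketch does not attempt.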