Paper Title
Deterministic error bounds for kernel-based learning techniques under bounded noise
Paper Authors
Paper Abstract
We consider the problem of reconstructing a function from a finite set of noise-corrupted samples. Two kernel algorithms are analyzed, namely kernel ridge regression and $\varepsilon$-support vector regression. By assuming that the ground-truth function belongs to the reproducing kernel Hilbert space of the chosen kernel and that the measurement noise affecting the dataset is bounded, we adopt an approximation-theory viewpoint to establish \textit{deterministic}, finite-sample error bounds for the two models. Finally, we discuss their connection with Gaussian processes and provide two numerical examples. In establishing our inequalities, we hope to help bring the fields of non-parametric kernel learning and system identification for robust control closer to each other.
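To make the setting concrete, below is a minimal sketch of the two estimators the abstract names, fitted to samples of a toy function corrupted by bounded noise. The ground-truth function, the Gaussian RBF kernel, its length-scale, the regularization weight, the noise bound, and the use of scikit-learn's `SVR` are all illustrative assumptions, not quantities or code from the paper; the deterministic error bounds themselves are derived analytically in the paper and are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVR

# Illustrative sketch only (not the paper's code): kernel ridge regression
# and epsilon-SVR on samples corrupted by *bounded* noise. Kernel choice,
# length-scale, regularization weight, and noise bound are assumptions.

def rbf_kernel(X, Z, lengthscale=0.2):
    """Gaussian RBF kernel matrix, K[i, j] = exp(-(x_i - z_j)^2 / (2 l^2))."""
    d2 = (X[:, None] - Z[None, :]) ** 2
    return np.exp(-d2 / (2.0 * lengthscale**2))

rng = np.random.default_rng(0)

# Toy ground truth, standing in for a function in the kernel's RKHS.
f = lambda x: np.sin(2.0 * np.pi * x)

# Finite dataset with bounded measurement noise: |delta_i| <= delta_bar.
n, delta_bar = 30, 0.1
x_train = rng.uniform(0.0, 1.0, n)
y_train = f(x_train) + rng.uniform(-delta_bar, delta_bar, n)
x_test = np.linspace(0.0, 1.0, 200)

# Kernel ridge regression: f_hat(x) = k(x)^T (K + n * lam * I)^{-1} y.
lam = 1e-3
K = rbf_kernel(x_train, x_train)
alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
f_krr = rbf_kernel(x_test, x_train) @ alpha

# epsilon-SVR with the same RBF kernel (gamma = 1 / (2 l^2)); setting
# epsilon to the noise bound is a common heuristic, not a prescription
# from the paper.
svr = SVR(kernel="rbf", gamma=1.0 / (2.0 * 0.2**2), C=10.0, epsilon=delta_bar)
svr.fit(x_train[:, None], y_train)
f_svr = svr.predict(x_test[:, None])

for name, f_hat in [("KRR", f_krr), ("eps-SVR", f_svr)]:
    err = np.max(np.abs(f(x_test) - f_hat))
    print(f"{name}: max |f - f_hat| on grid = {err:.4f}")
```

The printed quantity is the empirical sup-norm error on a test grid; it is the kind of worst-case deviation that the paper's deterministic bounds control a priori, under the RKHS-membership and bounded-noise assumptions.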