Paper Title
Minimax Optimal Kernel Operator Learning via Multilevel Training
Paper Authors
Paper Abstract
Learning mappings between infinite-dimensional function spaces has achieved empirical success in many disciplines of machine learning, including generative modeling, functional data analysis, causal inference, and multi-agent reinforcement learning. In this paper, we study the statistical limit of learning a Hilbert-Schmidt operator between two infinite-dimensional Sobolev reproducing kernel Hilbert spaces. We establish the information-theoretic lower bound in terms of the Sobolev Hilbert-Schmidt norm and show that a regularization that learns the spectral components below the bias contour and ignores the ones that are above the variance contour can achieve the optimal learning rate. At the same time, the spectral components between the bias and variance contours give us flexibility in designing computationally feasible machine learning algorithms. Based on this observation, we develop a multilevel kernel operator learning algorithm that is optimal when learning linear operators between infinite-dimensional function spaces.
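The abstract's core idea — fit the spectral components of the operator where estimation variance is small, and truncate the high-order components where bias is tolerable — can be illustrated with a minimal numerical sketch. This is not the paper's multilevel algorithm; it is a hypothetical finite-dimensional toy in which a diagonal operator with a decaying spectrum is estimated componentwise from noisy data, and all parameters (dimension `d`, sample size `n`, cutoff `k`) are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite truncation of an infinite-dimensional problem:
# a diagonal "operator" with polynomially decaying spectral coefficients.
d, n, noise = 200, 500, 0.1
a_true = np.arange(1, d + 1) ** -1.5              # true spectral coefficients

X = rng.standard_normal((n, d))                    # input coefficients
Y = X * a_true + noise * rng.standard_normal((n, d))  # noisy output coefficients

# Componentwise least-squares estimate of each spectral coefficient.
a_hat = (X * Y).sum(axis=0) / (X ** 2).sum(axis=0)

# Spectral truncation: keep only the first k components (an ad hoc cutoff;
# the paper instead balances the bias and variance contours to choose it).
k = 30
a_trunc = np.where(np.arange(d) < k, a_hat, 0.0)

err_full = np.linalg.norm(a_hat - a_true)
err_trunc = np.linalg.norm(a_trunc - a_true)
print(err_trunc < err_full)  # truncation typically reduces the error here
```

In this toy, the untruncated estimator pays noise variance on all `d` components, while the truncated one trades the small tail of the decaying spectrum for a much lower variance — the same bias–variance trade-off the abstract describes along its spectral contours.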