Paper Title


Sequential construction and dimension reduction of Gaussian processes under constraints

Paper Authors

Bachoc, François; López-Lopera, Andrés F.; Roustant, Olivier

Paper Abstract


Accounting for inequality constraints, such as boundedness, monotonicity or convexity, is challenging when modeling costly-to-evaluate black-box functions. In this regard, finite-dimensional Gaussian process (GP) regression models bring a valuable solution, as they guarantee that the inequality constraints are satisfied everywhere. Nevertheless, these models are currently restricted to low-dimensional settings (up to dimension 5). Addressing this issue, we introduce the MaxMod algorithm, which sequentially inserts one-dimensional knots or adds active variables, thereby performing dimension reduction and efficient knot allocation at the same time. We prove the convergence of this algorithm. As intermediate steps of the proof, we propose the notion of multi-affine extension and study its properties. We also prove the convergence of finite-dimensional GPs when the knots are not dense in the input space, extending the recent literature. With simulated and real data, we demonstrate that the MaxMod algorithm remains efficient in higher dimensions (at least in dimension 20) and needs fewer knots than other state-of-the-art constrained GP models to reach a given approximation error.
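To make the sequential mechanism described in the abstract concrete, below is a minimal Python sketch of a MaxMod-style greedy loop: at each step the model either inserts a one-dimensional knot on an active variable or activates a new input variable, choosing the candidate with the largest score. This is not the authors' implementation; the names `FiniteDimModel`, `modification_criterion`, and `maxmod_loop` are hypothetical, and the scoring function is only a placeholder for the paper's actual criterion, which compares constrained MAP estimates before and after a candidate modification.

```python
# Hedged sketch of a MaxMod-style greedy loop (illustration only, not the paper's code).
# The scoring function `modification_criterion` is a placeholder: in the paper, the
# criterion quantifies how much the constrained MAP estimate changes when a candidate
# modification (new knot or new active variable) is applied. Here we only show the
# "insert a knot OR activate a variable" control flow.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class FiniteDimModel:
    """Minimal stand-in for a finite-dimensional (knot-based) constrained GP model."""
    active_vars: List[int] = field(default_factory=list)   # indices of active input variables
    knots: Dict[int, List[float]] = field(default_factory=dict)  # variable index -> sorted knots in [0, 1]

    def candidate_modifications(self, n_inputs: int):
        """Enumerate candidate moves: insert a knot on an active variable,
        or activate a new variable (initialized with boundary knots)."""
        for j in self.active_vars:
            t = self.knots[j]
            for a, b in zip(t[:-1], t[1:]):
                yield ("knot", j, 0.5 * (a + b))  # candidate knot at a subinterval midpoint
        for j in range(n_inputs):
            if j not in self.active_vars:
                yield ("variable", j, None)

    def apply(self, move: Tuple) -> "FiniteDimModel":
        """Return a copy of the model with the candidate modification applied."""
        kind, j, x = move
        new = FiniteDimModel(list(self.active_vars),
                             {k: list(v) for k, v in self.knots.items()})
        if kind == "knot":
            new.knots[j] = sorted(new.knots[j] + [x])
        else:
            new.active_vars.append(j)
            new.knots[j] = [0.0, 1.0]
        return new


def modification_criterion(old: FiniteDimModel, new: FiniteDimModel) -> float:
    """Placeholder score (dummy): rewards added basis size so the loop runs end to end.
    The paper's MaxMod criterion instead measures the modification of the constrained
    MAP estimate between the old and new finite-dimensional models."""
    return sum(len(v) for v in new.knots.values()) - sum(len(v) for v in old.knots.values())


def maxmod_loop(n_inputs: int, n_steps: int) -> FiniteDimModel:
    """Greedy loop: at each step, apply the candidate modification
    (knot insertion or variable activation) with the largest criterion value."""
    model = FiniteDimModel()
    for _ in range(n_steps):
        candidates = list(model.candidate_modifications(n_inputs))
        if not candidates:
            break
        best = max(candidates, key=lambda m: modification_criterion(model, model.apply(m)))
        model = model.apply(best)
    return model


if __name__ == "__main__":
    final = maxmod_loop(n_inputs=20, n_steps=10)
    print("active variables:", final.active_vars)
    print("knots per variable:", {j: len(t) for j, t in final.knots.items()})
```

Under this toy criterion the loop simply grows the basis greedily; with the paper's criterion, the same control flow trades off exploring new input variables against refining the knot grid of variables already found to be active, which is how MaxMod performs dimension reduction and knot allocation jointly.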
