Paper Title
A Data-Driven Bayesian Nonparametric Approach for Black-Box Optimization
Paper Authors
Paper Abstract
We present a data-driven Bayesian nonparametric approach for global optimization (DaBNO) of stochastic black-box functions. The function value depends on the distribution of a random vector; in practice, this distribution is usually complex, rarely known exactly, and must be inferred from data (realizations of the random vector). DaBNO accounts for the finite-data error that arises when estimating this distribution and relaxes the commonly used parametric assumption, thereby reducing distribution-misspecification error. We show that the DaBNO objective converges asymptotically to the true objective. We further develop a surrogate-assisted algorithm, DaBNO-K, that efficiently optimizes the proposed objective using a carefully designed kernel. Numerical experiments on several synthetic and practical problems demonstrate the algorithm's empirical global convergence and finite-sample performance.
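The abstract does not state the objective explicitly. As a hedged sketch, one common way to write such a data-driven, nested-expectation objective is the following, assuming $f(x,\xi)$ denotes the black-box function, $\xi$ the random vector with unknown true distribution $P_c$, $\mathcal{D}$ the observed realizations, and a Bayesian nonparametric (e.g., Dirichlet process) posterior $\Pi(\cdot \mid \mathcal{D})$ over candidate input distributions:

\[
\min_{x \in \mathcal{X}} \; g(x), \qquad
g(x) \;=\; \mathbb{E}_{P \sim \Pi(\cdot \mid \mathcal{D})}\!\left[ \, \mathbb{E}_{\xi \sim P}\big[ f(x,\xi) \big] \, \right],
\]

which is the sense in which such an objective can converge asymptotically, as the data size grows, to the true objective $\mathbb{E}_{\xi \sim P_c}[f(x,\xi)]$.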
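For illustration only, the following minimal Python sketch estimates such a nested objective by posterior sampling, approximating a Dirichlet-process posterior with the Bayesian bootstrap (Dirichlet weights on the observed realizations). The function f, the synthetic data, and the helper dabno_objective are hypothetical stand-ins rather than the paper's implementation, and the surrogate model and kernel of DaBNO-K are not shown.

import numpy as np

# Hypothetical stochastic black-box function f(x, xi); stands in for the
# simulator whose expectation over xi defines the true objective.
def f(x, xi):
    return (x - xi) ** 2

rng = np.random.default_rng(0)
# Observed realizations of the random vector (synthetic data for illustration).
data = rng.normal(loc=1.0, scale=0.5, size=30)

def dabno_objective(x, data, n_posterior=200, rng=rng):
    """Monte Carlo estimate of a nested-expectation objective: average over
    posterior distributions P (approximated here by the Bayesian bootstrap,
    a limiting case of the Dirichlet process) of E_{xi ~ P}[f(x, xi)]."""
    n = len(data)
    inner_means = []
    for _ in range(n_posterior):
        # Random probability weights on the observed support
        # (Dirichlet(1, ..., 1) draws correspond to the Bayesian bootstrap).
        w = rng.dirichlet(np.ones(n))
        inner_means.append(np.sum(w * f(x, data)))
    return np.mean(inner_means)

# Evaluate the estimated objective on a coarse grid of decision values.
grid = np.linspace(-1.0, 3.0, 41)
values = [dabno_objective(x, data) for x in grid]
print("approximate minimizer:", grid[int(np.argmin(values))])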