Paper Title
A Novel Evolution Strategy with Directional Gaussian Smoothing for Blackbox Optimization
Paper Authors
Paper Abstract
We propose an improved evolution strategy (ES) using a novel nonlocal gradient operator for high-dimensional black-box optimization. Standard ES methods with $d$-dimensional Gaussian smoothing suffer from the curse of dimensionality due to the high variance of Monte Carlo (MC) based gradient estimators. To control the variance, Gaussian smoothing is usually limited to a small region, so existing ES methods lack the nonlocal exploration ability required to escape local minima. We develop a nonlocal gradient operator with directional Gaussian smoothing (DGS) to address this challenge. The DGS conducts 1D nonlocal explorations along $d$ orthogonal directions in $\mathbb{R}^d$, each of which defines a nonlocal directional derivative as a 1D integral. We then use Gauss-Hermite quadrature, instead of MC sampling, to estimate the $d$ 1D integrals to ensure high accuracy (i.e., small variance). Our method enables effective nonlocal exploration to facilitate the global search in high-dimensional optimization. We demonstrate the superior performance of our method in three sets of examples, including benchmark functions for global optimization, and real-world science and engineering applications.
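The DGS gradient described in the abstract can be sketched as follows: along each of $d$ orthonormal directions, the 1D Gaussian-smoothed directional derivative is approximated with Gauss-Hermite quadrature rather than Monte Carlo sampling, and the $d$ directional derivatives are assembled into a gradient estimate. This is a minimal illustrative sketch, not the authors' reference implementation; the function name, defaults (`sigma`, `n_quad`), and the choice of the identity matrix as the direction set are assumptions for demonstration.

```python
import numpy as np

def dgs_gradient(f, x, sigma=0.5, n_quad=7, directions=None):
    """Sketch of a DGS-style nonlocal gradient estimate.

    For each orthonormal direction xi_i, the Gaussian-smoothed
    directional derivative is the 1D integral
        D_i F(x) = (1/sigma) * E_{v~N(0,1)}[ v * F(x + sigma*v*xi_i) ],
    estimated here with Gauss-Hermite quadrature (high accuracy,
    small variance) instead of MC sampling.
    """
    d = len(x)
    if directions is None:
        # Illustrative choice: the standard basis as the d orthogonal directions.
        directions = np.eye(d)
    # Gauss-Hermite nodes t_m and weights w_m for weight exp(-t^2).
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    grad = np.zeros(d)
    for i in range(d):
        xi = directions[i]
        # Change of variables v = sqrt(2)*t turns the Gaussian expectation
        # into a Gauss-Hermite sum:
        # D_i ~ (1/(sigma*sqrt(pi))) * sum_m w_m * sqrt(2)*t_m * F(x + sigma*sqrt(2)*t_m*xi)
        vals = np.array([f(x + sigma * np.sqrt(2.0) * t * xi) for t in nodes])
        D_i = (weights * np.sqrt(2.0) * nodes * vals).sum() / (sigma * np.sqrt(np.pi))
        grad += D_i * xi
    return grad
```

For a quadratic objective such as $F(x)=\|x\|^2$, Gauss-Hermite quadrature with a handful of nodes integrates the smoothed directional derivatives exactly, so the estimate matches the true gradient $2x$ regardless of the smoothing radius `sigma`; this illustrates the "high accuracy, small variance" claim in the abstract.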