Paper Title

Derivative-free global minimization for a class of multiple minima problems

Paper Authors

Luo, Xiaopeng; Xu, Xin; Dong, Daoyi

Paper Abstract

We prove that finite-difference based derivative-free descent (FD-DFD) methods are capable of finding the global minima for a class of multiple minima problems. Our main result shows that, for a class of multiple minima objectives extended from strongly convex functions with Lipschitz-continuous gradients, the iterates of FD-DFD converge to the global minimizer $x_*$ with the linear convergence $\|x_{k+1}-x_*\|_2^2\leqslant\rho^k\|x_1-x_*\|_2^2$ for a fixed $0<\rho<1$ and any initial iterate $x_1\in\mathbb{R}^d$ when the parameters are properly selected. Since the per-iteration cost, i.e., the number of function evaluations, is fixed and almost independent of the dimension $d$, the FD-DFD algorithm has a complexity bound $\mathcal{O}(\log\frac{1}{\varepsilon})$ for finding a point $x$ such that the optimality gap $\|x-x_*\|_2^2$ is less than $\varepsilon>0$. Numerical experiments in various dimensions from $5$ to $500$ demonstrate the benefits of the FD-DFD method.
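To make the setting concrete, the sketch below illustrates the generic finite-difference descent template the abstract refers to: the true gradient in a descent step is replaced by a finite-difference estimate built from random directional differences, so the per-iteration cost ($2m$ function evaluations) is fixed and independent of the dimension $d$. This is only an assumption-laden illustration of the algorithm class; the paper's actual FD-DFD scheme, its sampling rule, and the parameter choices that guarantee global convergence are specified in the paper itself, and the names `fd_dfd_sketch`, `step`, `h`, and `m` here are hypothetical.

```python
import numpy as np

def fd_dfd_sketch(f, x1, step=0.05, h=1e-4, m=8, max_iter=500, seed=0):
    """Illustrative finite-difference based derivative-free descent.

    Each iteration estimates the gradient of f at x from m random
    directional central differences (2*m function evaluations, a cost
    independent of the dimension d), then takes a plain descent step.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x1, dtype=float)
    d = x.size
    for _ in range(max_iter):
        g = np.zeros(d)
        for _ in range(m):
            u = rng.standard_normal(d)
            u /= np.linalg.norm(u)          # random unit direction
            # central-difference estimate of the directional derivative
            g += (f(x + h * u) - f(x - h * u)) / (2 * h) * u
        # d/m rescaling makes the estimator unbiased for the gradient
        x = x - step * (d / m) * g
    return x
```

On a smooth convex test function such as $f(x)=\|x\|_2^2$ in $d=5$, this sketch contracts toward the minimizer at a linear rate, which is the qualitative behavior the abstract's bound $\|x_{k+1}-x_*\|_2^2\leqslant\rho^k\|x_1-x_*\|_2^2$ describes.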
