Paper Title

Rates of superlinear convergence for classical quasi-Newton methods

Paper Authors

Anton Rodomanov, Yurii Nesterov

Paper Abstract

We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was well established a long time ago that asymptotically these methods converge superlinearly, the corresponding rates of convergence still remain unknown. In this paper, we address this problem. We obtain the first explicit non-asymptotic rates of superlinear convergence for the standard quasi-Newton methods, which are based on the updating formulas from the convex Broyden class. In particular, for the well-known DFP and BFGS methods, we obtain the rates of the form $(\frac{n L^2}{\mu^2 k})^{k/2}$ and $(\frac{n L}{\mu k})^{k/2}$ respectively, where $k$ is the iteration counter, $n$ is the dimension of the problem, $\mu$ is the strong convexity parameter, and $L$ is the Lipschitz constant of the gradient.
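To make the setting concrete, below is a minimal sketch of the BFGS method (one member of the convex Broyden class studied in the paper) applied to a strongly convex quadratic. The test problem, matrix sizes, and unit-step scheme are illustrative assumptions, not taken from the paper; the update formula for the inverse-Hessian approximation is the standard BFGS one.

```python
import numpy as np

def bfgs(grad, x0, H0, iters):
    """Sketch of BFGS with an inverse-Hessian approximation H.
    Unit steps are used (no line search); for a well-conditioned
    quadratic near the minimizer this is sufficient for illustration."""
    x, H = x0.copy(), H0.copy()
    g = grad(x)
    errs = []  # gradient norms per iteration, to observe fast convergence
    for _ in range(iters):
        d = -H @ g                      # quasi-Newton direction
        x_new = x + d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g     # displacement and gradient change
        sy = s @ y
        if sy <= 1e-16:                 # safeguard against a degenerate update
            break
        rho = 1.0 / sy
        I = np.eye(len(x))
        # Standard BFGS inverse-Hessian update:
        #   H+ = (I - rho * s y^T) H (I - rho * y s^T) + rho * s s^T
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        x, g = x_new, g_new
        errs.append(np.linalg.norm(g))
    return x, errs

# Illustrative instance: f(x) = 0.5 x^T A x with eigenvalues in [mu, L] = [0.5, 1.5].
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
A = Q @ np.diag(np.linspace(0.5, 1.5, 5)) @ Q.T
grad = lambda x: A @ x

x_star, errs = bfgs(grad, rng.standard_normal(5), np.eye(5), 20)
print(errs)  # gradient norms shrink rapidly once H approximates A^{-1}
```

As H approaches the true inverse Hessian, the per-step contraction factor itself goes to zero, which is the superlinear behavior whose rate the paper quantifies.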
