Paper Title
Descent Properties of an Anderson Accelerated Gradient Method With Restarting
Paper Authors
Paper Abstract
Anderson Acceleration (AA) is a popular acceleration technique for enhancing the convergence of fixed-point iterations. The analysis of AA approaches typically focuses on the convergence behavior of a corresponding fixed-point residual, while the behavior of the underlying objective function values along the accelerated iterates is currently not well understood. In this paper, we investigate local properties of AA with restarting applied to a basic gradient scheme in terms of function values. Specifically, we show that AA with restarting is a local descent method and that it can decrease the objective function faster than the gradient method. These new results theoretically support the good numerical performance of AA when heuristic descent conditions are used for globalization, and they provide a novel perspective on the convergence analysis of AA that is more amenable to nonconvex optimization problems. Numerical experiments are conducted to illustrate our theoretical findings.
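To make the setting concrete, the following is a minimal Python sketch of the kind of scheme the abstract describes: Anderson acceleration applied to the gradient fixed-point map g(x) = x - step * grad_f(x), with the acceleration memory cleared periodically (restarting). The Type-II Anderson mixing formula, the periodic restart rule, and all parameter names (step, mem, restart) are illustrative assumptions, not the paper's exact algorithm or notation.

```python
import numpy as np

def aa_gradient_restart(grad_f, x0, step=0.1, mem=5, restart=5,
                        max_iter=500, tol=1e-10):
    """Sketch: Anderson-accelerated gradient descent with periodic restarting.

    Fixed-point map: g(x) = x - step * grad_f(x); residual: r = g(x) - x.
    Type-II AA extrapolates using the last few residual differences, and the
    history is dropped every `restart` iterations (an assumed restart rule).
    """
    x = np.asarray(x0, dtype=float)
    G, R = [], []                        # histories of g(x_k) and r_k = g(x_k) - x_k
    for k in range(max_iter):
        if k % restart == 0:
            G.clear(); R.clear()         # restart: discard the acceleration memory
        gx = x - step * grad_f(x)        # plain gradient (fixed-point) step
        r = gx - x
        G.append(gx); R.append(r)
        if len(G) > mem + 1:             # keep at most mem + 1 history entries
            G.pop(0); R.pop(0)
        if np.linalg.norm(r) < tol:
            return gx
        if len(R) == 1:
            x = gx                       # no history yet: take the gradient step
        else:
            # Least-squares mixing coefficients from residual differences.
            dR = np.column_stack([R[i + 1] - R[i] for i in range(len(R) - 1)])
            dG = np.column_stack([G[i + 1] - G[i] for i in range(len(G) - 1)])
            gamma, *_ = np.linalg.lstsq(dR, r, rcond=None)
            x = gx - dG @ gamma          # Anderson-extrapolated iterate
    return x

# Usage on a simple ill-conditioned quadratic f(x) = 0.5 * x^T A x:
A = np.diag([1.0, 10.0, 100.0])
x_min = aa_gradient_restart(lambda x: A @ x, np.ones(3), step=1.0 / 100.0)
```

In this sketch, restarting plays the role highlighted in the abstract: between restarts the method extrapolates over the stored residuals, and clearing the memory periodically is what makes the local descent analysis in terms of function values tractable.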