Paper Title

Fast convex optimization via time scale and averaging of the steepest descent

Authors

Hedy Attouch, Radu Ioan Boţ, Dang-Khoa Nguyen

Abstract

In a Hilbert setting, we develop a gradient-based dynamic approach for the fast solution of convex optimization problems. By applying time scaling, averaging, and perturbation techniques to the continuous steepest descent (SD), we obtain high-resolution ODEs of the Nesterov and Ravine methods. These dynamics involve asymptotically vanishing viscous damping and Hessian-driven damping (in either explicit or implicit form). The mathematical analysis does not require developing a Lyapunov analysis for inertial systems: we simply exploit classical convergence results for (SD) and its externally perturbed version, and then use tools of differential and integral calculus, including Jensen's inequality. The method is flexible, and by way of illustration we show how it applies starting from other important dynamics in optimization. We consider the case where the initial dynamic is the regularized Newton method, then the case where the starting dynamic is the differential inclusion associated with a convex lower semicontinuous potential, and finally we show that the technique extends naturally to the case of a monotone cocoercive operator. Our approach leads to parallel algorithmic results, which we study in the case of fast gradient and proximal algorithms. Our averaging technique reveals new links between the Nesterov and Ravine methods.
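
To make the time-scale-and-averaging construction concrete, the following is a minimal LaTeX sketch of how a Nesterov-type ODE can be derived from (SD). The specific time scale τ(t) = t²/(2(α−1)) and the averaging coefficient (α−1)/t are illustrative choices made here so that the algebra closes; they are assumptions, not necessarily the paper's exact normalization.

% Minimal sketch (compilable): time scaling + averaging of the steepest descent.
% Assumption: the scale \tau(t) = t^2/(2(\alpha-1)) and the averaging weight
% (\alpha-1)/t are illustrative choices, not necessarily the paper's.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $z$ solve the continuous steepest descent
\[
  (\mathrm{SD}) \qquad \dot z(s) + \nabla f(z(s)) = 0 .
\]
Time scaling: set $y(t) := z(\tau(t))$ with $\tau(t) = \frac{t^{2}}{2(\alpha-1)}$, $\alpha > 1$.
Averaging: define $x$ by
\[
  \dot x(t) = \frac{\alpha-1}{t}\bigl(y(t) - x(t)\bigr),
  \qquad \text{equivalently} \qquad
  y(t) = x(t) + \frac{t}{\alpha-1}\,\dot x(t).
\]
Differentiating the right-hand identity and using
$\dot y(t) = \dot\tau(t)\,\dot z(\tau(t)) = -\frac{t}{\alpha-1}\,\nabla f(y(t))$
yields
\[
  \ddot x(t) + \frac{\alpha}{t}\,\dot x(t)
  + \nabla f\!\Bigl(x(t) + \frac{t}{\alpha-1}\,\dot x(t)\Bigr) = 0 ,
\]
a dynamic with asymptotically vanishing viscous damping $\alpha/t$ and
Hessian-driven damping in implicit form: the gradient is evaluated at the
extrapolated point $y(t)$, and a first-order Taylor expansion of $\nabla f$
at $x(t)$ makes the term $\frac{t}{\alpha-1}\,\nabla^{2} f(x(t))\,\dot x(t)$ appear.
\end{document}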
