Paper Title

Accelerated first-order methods for convex optimization with locally Lipschitz continuous gradient

Paper Authors

Zhaosong Lu, Sanyou Mei

Paper Abstract

In this paper we develop accelerated first-order methods for convex optimization with locally Lipschitz continuous gradient (LLCG), which is beyond the well-studied class of convex optimization with Lipschitz continuous gradient. In particular, we first consider unconstrained convex optimization with LLCG and propose accelerated proximal gradient (APG) methods for solving it. The proposed APG methods are equipped with a verifiable termination criterion and enjoy an operation complexity of ${\cal O}(\varepsilon^{-1/2}\log \varepsilon^{-1})$ and ${\cal O}(\log \varepsilon^{-1})$ for finding an $\varepsilon$-residual solution of an unconstrained convex and strongly convex optimization problem, respectively. We then consider constrained convex optimization with LLCG and propose a first-order proximal augmented Lagrangian method for solving it by applying one of our proposed APG methods to approximately solve a sequence of proximal augmented Lagrangian subproblems. The resulting method is equipped with a verifiable termination criterion and enjoys an operation complexity of ${\cal O}(\varepsilon^{-1}\log \varepsilon^{-1})$ and ${\cal O}(\varepsilon^{-1/2}\log \varepsilon^{-1})$ for finding an $\varepsilon$-KKT solution of a constrained convex and strongly convex optimization problem, respectively. All the proposed methods in this paper are parameter-free or almost parameter-free, except that knowledge of the convexity parameter is required. In addition, preliminary numerical results are presented to demonstrate the performance of our proposed methods. To the best of our knowledge, no prior studies were conducted to investigate accelerated first-order methods with complexity guarantees for convex optimization with LLCG. All the complexity results obtained in this paper are new.
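To make the two algorithmic ingredients in the abstract concrete, here is a minimal sketch of an accelerated proximal gradient (APG) method with backtracking for minimizing $F(x) = f(x) + h(x)$, where $\nabla f$ is only locally Lipschitz continuous. Since no global Lipschitz constant is available, the stepsize is adapted by a backtracking line search, and the norm of the gradient mapping serves as a verifiable termination test. This is an illustrative FISTA-style sketch under our own assumptions (the names `grad_f`, `prox_h`, and all parameters are ours), not the paper's exact algorithm.

```python
import numpy as np

def apg_backtracking(grad_f, prox_h, f, x0, L0=1.0, eta=2.0,
                     tol=1e-6, max_iter=10000):
    """FISTA-style APG with backtracking; an illustrative sketch only.

    Minimizes F(x) = f(x) + h(x) with f convex and grad f locally
    Lipschitz. prox_h(v, s) must return argmin_z h(z) + ||z - v||^2/(2s).
    """
    x, y, t, L = x0.copy(), x0.copy(), 1.0, L0
    for _ in range(max_iter):
        g = grad_f(y)
        # Backtracking: grow the local Lipschitz estimate L until the
        # quadratic upper bound holds at the prox-gradient point.
        while True:
            x_new = prox_h(y - g / L, 1.0 / L)
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Verifiable stopping test: norm of the gradient mapping,
        # a surrogate for an epsilon-residual solution.
        if L * np.linalg.norm(y - x_new) <= tol:
            return x_new
        # Nesterov momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

For the constrained case, the abstract's first-order proximal augmented Lagrangian method approximately solves a sequence of strongly convex proximal AL subproblems with an APG routine. The sketch below shows such an outer loop for $\min_x f(x)$ subject to $Ax = b$; the penalty `rho`, proximal weight `beta`, inner tolerance, and the equality-constrained problem form are all illustrative assumptions, not the paper's exact setup.

```python
def proximal_al(f, grad_f, A, b, x0, rho=10.0, beta=1.0,
                tol=1e-6, outer_iters=50):
    """Proximal augmented Lagrangian outer loop; an illustrative sketch.

    Each subproblem min_x f(x) + <lam, Ax-b> + (rho/2)||Ax-b||^2
    + (beta/2)||x - x_k||^2 is strongly convex and is solved inexactly
    by apg_backtracking (with h = 0, whose prox is the identity).
    """
    x, lam = x0.copy(), np.zeros(A.shape[0])
    for _ in range(outer_iters):
        xk = x.copy()

        def al_val(z):
            r = A @ z - b
            return (f(z) + lam @ r + 0.5 * rho * (r @ r)
                    + 0.5 * beta * ((z - xk) @ (z - xk)))

        def al_grad(z):
            return (grad_f(z) + A.T @ (lam + rho * (A @ z - b))
                    + beta * (z - xk))

        x = apg_backtracking(al_grad, lambda v, s: v, al_val, x,
                             tol=0.1 * tol)
        lam = lam + rho * (A @ x - b)  # multiplier (dual) update
        # Stop once feasibility and the proximal step are both small.
        if (np.linalg.norm(A @ x - b) <= tol
                and np.linalg.norm(x - xk) <= tol):
            break
    return x, lam
```

In the paper's actual method, the inner tolerance and the stopping test are tied to the $\varepsilon$-KKT criterion that underlies the stated complexity bounds; the fixed `0.1 * tol` and the feasibility check above are placeholders for that mechanism.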
