Paper Title

Level-set Subdifferential Error Bounds and Linear Convergence of Variable Bregman Proximal Gradient Method

Authors

Zhu, Daoli, Deng, Sien, Li, Minghua, Zhao, Lei

Abstract

In this work, we develop a level-set subdifferential error bound condition aimed at the convergence rate analysis of a variable Bregman proximal gradient (VBPG) method for a broad class of nonsmooth and nonconvex optimization problems. It is proved that the aforementioned condition guarantees linear convergence of VBPG and is weaker than the Kurdyka-Lojasiewicz property, weak metric subregularity, and the Bregman proximal error bound. Along the way, we derive a number of verifiable conditions under which level-set subdifferential error bounds hold, as well as necessary conditions and sufficient conditions for linear convergence relative to a level set for nonsmooth and nonconvex optimization problems. The newly established results not only enable us to show that any accumulation point of the sequence generated by VBPG is at least a critical point of the limiting subdifferential, or even a critical point of the proximal subdifferential with a fixed Bregman function in each iteration, but also provide a fresh perspective that allows us to explore interconnections among many known sufficient conditions for linear convergence of various first-order methods.
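To make the method concrete, below is a minimal, hedged sketch of a Bregman proximal gradient iteration for a composite problem F(x) = f(x) + g(x). The abstract does not specify the algorithmic details, so this sketch makes several illustrative assumptions not taken from the paper: the Bregman kernel is fixed to h(x) = ½‖x‖², so the Bregman distance D_h(x, y) = ½‖x − y‖² and the step reduces to the classical proximal gradient update (the paper's VBPG allows the Bregman function to vary per iteration); the nonsmooth term is g(x) = λ‖x‖₁, whose prox is soft thresholding. The function names `bregman_prox_grad` and `soft_threshold` are hypothetical.

```python
import numpy as np

# Assumption (not from the paper): Euclidean Bregman kernel h(x) = 0.5*||x||^2,
# so the Bregman step collapses to the classical proximal gradient update.

def soft_threshold(v, t):
    """Prox of t*||.||_1: componentwise soft thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bregman_prox_grad(grad_f, prox_g, x0, step, n_iter=500):
    """Iterate x_{k+1} = argmin_x <grad f(x_k), x> + g(x) + (1/step)*D_h(x, x_k).

    With D_h(x, y) = 0.5*||x - y||^2 this is
    x_{k+1} = prox_{step*g}(x_k - step*grad_f(x_k)).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)
    return x

if __name__ == "__main__":
    # Toy instance: f(x) = 0.5*||Ax - b||^2 (smooth), g(x) = lam*||x||_1.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    lam = 0.1
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda v, t: soft_threshold(v, lam * t)
    x_star = bregman_prox_grad(grad_f, prox_g, np.zeros(5), 1.0 / L)
    obj = 0.5 * np.linalg.norm(A @ x_star - b) ** 2 + lam * np.abs(x_star).sum()
    print(f"objective at approximate solution: {obj:.4f}")
```

On this toy least-squares instance f is strongly convex, so the iteration converges linearly; the error bound conditions studied in the paper aim to certify the same linear rate without convexity.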
