Paper Title


A globally convergent fast iterative shrinkage-thresholding algorithm with a new momentum factor for single and multi-objective convex optimization

Authors

Tanabe, Hiroki, Fukuda, Ellen H., Yamashita, Nobuo

Abstract


Convex-composite optimization, which minimizes an objective function represented by the sum of a differentiable function and a convex one, is widely used in machine learning and signal/image processing. The fast iterative shrinkage-thresholding algorithm (FISTA) is a typical method for solving this problem and has a global convergence rate of $O(1 / k^2)$. Recently, this has been extended to multi-objective optimization, together with a proof of the $O(1 / k^2)$ global convergence rate. However, its momentum factor is classical, and the convergence of its iterates has not been proven. In this work, introducing additional hyperparameters $(a, b)$, we propose another accelerated proximal gradient method with a general momentum factor, which is new even for the single-objective case. We show that our proposed method also has a global convergence rate of $O(1/k^2)$ for any $(a,b)$, and further that the generated sequence of iterates converges to a weak Pareto solution when $a$ is positive, an essential property for finite-time manifold identification. Moreover, we report numerical results with various $(a,b)$, showing that some of these choices give better results than the classical momentum factors.
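The paper's generalized $(a, b)$-parameterized momentum factor is not spelled out in this abstract, so as a reference point the following is a minimal sketch of the classic single-objective FISTA scheme (Beck–Teboulle momentum) that the work generalizes, applied to a toy one-dimensional composite problem $\min_x \frac{1}{2}(x-3)^2 + |x|$; all function and variable names here are illustrative, not from the paper.

```python
import math

def soft_threshold(v, t):
    # Proximal operator of t*|.|: shrinks v toward zero by t.
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def fista(grad_f, prox_g, L, x0, iters=200):
    """Classic single-objective FISTA for min f(x) + g(x).

    grad_f: gradient of the smooth part f (assumed L-Lipschitz).
    prox_g: proximal map of the convex part g, prox_g(v, step).
    """
    x_prev = x0
    y = x0
    t = 1.0
    for _ in range(iters):
        # Proximal gradient step from the extrapolated point y.
        x = prox_g(y - grad_f(y) / L, 1.0 / L)
        # Classic momentum: t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2 and
        # factor (t_k - 1) / t_{k+1}; the paper replaces this with a
        # generalized factor governed by hyperparameters (a, b).
        t_next = (1.0 + math.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev, t = x, t_next
    return x_prev

# Toy problem: minimize 0.5*(x - 3)^2 + |x|; the minimizer is x = 2.
lam = 1.0
x_star = fista(grad_f=lambda x: x - 3.0,
               prox_g=lambda v, step: soft_threshold(v, lam * step),
               L=1.0, x0=0.0)
```

With $L = 1$ the gradient step lands on the unconstrained minimizer each iteration, so `x_star` reaches the solution $x = 2$ immediately; on general problems the momentum factor is what delivers the $O(1/k^2)$ rate discussed in the abstract.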
