Paper Title

Fast Multiobjective Gradient Methods with Nesterov Acceleration via Inertial Gradient-like Systems

Paper Authors

Konstantin Sonntag, Sebastian Peitz

Paper Abstract

We derive efficient algorithms to compute weakly Pareto optimal solutions for smooth, convex and unconstrained multiobjective optimization problems in general Hilbert spaces. To this end, we define a novel inertial gradient-like dynamical system in the multiobjective setting, whose trajectories converge weakly to Pareto optimal solutions. Discretization of this system yields an inertial multiobjective algorithm which generates sequences that converge weakly to Pareto optimal solutions. We employ Nesterov acceleration to define an algorithm with an improved convergence rate compared to the plain multiobjective steepest descent method (Algorithm 1). A further improvement in terms of efficiency is achieved by avoiding the solution of a quadratic subproblem to compute a common step direction for all objective functions, which is usually required in first order methods. Using a different discretization of our inertial gradient-like dynamical system, we obtain an accelerated multiobjective gradient method that does not require the solution of a subproblem in each step (Algorithm 2). While this algorithm does not converge in general, it yields good results on test problems while being faster than standard steepest descent by two to three orders of magnitude.
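
The abstract mentions two ingredients of the accelerated method: the quadratic subproblem that standard first-order multiobjective methods solve to obtain a common descent direction, and a Nesterov-style momentum step. The minimal Python sketch below illustrates both under common assumptions: the subproblem is the usual Fliege–Svaiter formulation, solved here in its dual (minimum-norm convex combination of the gradients), and the momentum rule is the classical (k-1)/(k+2) extrapolation with a fixed step size. Function names, the test problem, and all parameters are illustrative; this is not the paper's Algorithm 1 or 2.

    import numpy as np
    from scipy.optimize import minimize

    def common_descent_direction(grads):
        # Dual of the quadratic subproblem: find the minimum-norm element
        # of the convex hull of the objective gradients (Fliege-Svaiter).
        m = len(grads)
        G = np.array(grads)                          # shape (m, n)
        dual = lambda lam: 0.5 * np.linalg.norm(lam @ G) ** 2
        cons = ({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},)
        res = minimize(dual, np.full(m, 1.0 / m),
                       bounds=[(0.0, 1.0)] * m, constraints=cons)
        return -(res.x @ G)                          # common descent direction

    def accelerated_mo_gradient(grad_fns, x0, step=1e-2, iters=500):
        # Nesterov-style extrapolation on top of the common descent direction.
        # Illustrative sketch only; step size and momentum rule are assumptions.
        x_prev, x = x0.copy(), x0.copy()
        for k in range(1, iters + 1):
            y = x + (k - 1) / (k + 2) * (x - x_prev)     # extrapolation point
            d = common_descent_direction([g(y) for g in grad_fns])
            x_prev, x = x, y + step * d                  # gradient-like update
        return x

    # Hypothetical test problem: two convex quadratics on R^2 whose Pareto set
    # is the segment between (1, 0) and (0, 1).
    f_grads = [lambda x: x - np.array([1.0, 0.0]),
               lambda x: x - np.array([0.0, 1.0])]
    x_star = accelerated_mo_gradient(f_grads, np.array([5.0, 5.0]))

The second algorithm described in the abstract avoids this subproblem altogether by using a different discretization of the inertial dynamical system; that variant is not reproduced here, since its update rule is specific to the paper.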
