Paper Title

Last-Iterate Convergence of Optimistic Gradient Method for Monotone Variational Inequalities

Authors

Eduard Gorbunov, Adrien Taylor, Gauthier Gidel

Abstract

The Past Extragradient (PEG) method [Popov, 1980], also known as the Optimistic Gradient method, has seen a recent surge of interest in the optimization community with the emergence of variational inequality formulations for machine learning. Recently, in the unconstrained case, Golowich et al. [2020] proved that an $O(1/N)$ last-iterate convergence rate, in terms of the squared norm of the operator, can be achieved for Lipschitz and monotone operators with a Lipschitz Jacobian. In this work, by introducing a novel analysis through potential functions, we show that (i) this $O(1/N)$ last-iterate convergence can be achieved without any assumption on the Jacobian of the operator, and (ii) it can be extended to the constrained case, which had not been derived before even under Lipschitzness of the Jacobian. The proof is significantly different from the one of Golowich et al. [2020], and its discovery was computer-aided. These results close the open question of the last-iterate convergence of PEG for monotone variational inequalities.
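The optimistic gradient update reuses the previous operator evaluation instead of making a second operator call per step. A minimal sketch of this update on a toy bilinear (Lipschitz and monotone) operator, with an illustrative step size and iteration count that are assumptions, not taken from the paper:

```python
import numpy as np

# Toy monotone operator F(x, y) = (y, -x), i.e. the bilinear
# min-max game min_x max_y x*y. It is 1-Lipschitz and monotone.
def F(z):
    x, y = z
    return np.array([y, -x])

gamma = 0.2                  # illustrative step size (assumed)
z = np.array([1.0, 1.0])     # arbitrary starting point
F_prev = F(z)                # "past" operator value used by PEG

for _ in range(2000):
    F_curr = F(z)
    # Optimistic Gradient / PEG step: a gradient step corrected
    # by extrapolating with the previous operator evaluation.
    z = z - 2 * gamma * F_curr + gamma * F_prev
    F_prev = F_curr

print(np.linalg.norm(F(z)))  # operator norm at the last iterate shrinks
```

On this bilinear example the last iterate in fact converges quickly; the abstract's $O(1/N)$ rate on the squared operator norm is the worst-case guarantee over all Lipschitz monotone operators.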
