Paper Title
Adaptive extra-gradient methods for min-max optimization and games
Paper Authors
Paper Abstract
We present a new family of min-max optimization algorithms that automatically exploit the geometry of the gradient data observed at earlier iterations to perform more informative extra-gradient steps in later ones. Thanks to this adaptation mechanism, the proposed method automatically detects whether the problem is smooth or not, without requiring any prior tuning by the optimizer. As a result, the algorithm simultaneously achieves order-optimal convergence rates, i.e., it converges to an $\varepsilon$-optimal solution within $\mathcal{O}(1/\varepsilon)$ iterations in smooth problems, and within $\mathcal{O}(1/\varepsilon^2)$ iterations in non-smooth ones. Importantly, these guarantees do not require any of the standard boundedness or Lipschitz continuity conditions that are typically assumed in the literature; in particular, they apply even to problems with singularities (such as resource allocation problems and the like). This adaptation is achieved through the use of a geometric apparatus based on Finsler metrics and a suitably chosen mirror-prox template that allows us to derive sharp convergence rates for the methods at hand.
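To make the flavor of such an adaptation mechanism concrete, below is a minimal, purely illustrative Python sketch of an extra-gradient loop whose step-size shrinks with the accumulated squared gradient norms (an AdaGrad-style rule assumed here for illustration). It works in a plain Euclidean setting and does not reproduce the paper's actual method, which relies on a mirror-prox template and Finsler metrics; the names `adaptive_extra_gradient`, `F`, and `x0` are hypothetical.

```python
import numpy as np

def adaptive_extra_gradient(F, x0, n_iters=1000, eps=1e-8):
    """Illustrative extra-gradient loop with an AdaGrad-style step-size.

    F  : operator mapping the joint variable x = (u, v) of the min-max
         problem to its gradient field (grad_u f, -grad_v f).
    x0 : initial point.
    Returns the step-size-weighted (ergodic) average of the leading iterates.
    """
    x = np.asarray(x0, dtype=float)
    grad_sq_sum = 0.0                     # running sum of squared gradient norms
    avg, weight = np.zeros_like(x), 0.0

    for _ in range(n_iters):
        g = F(x)
        grad_sq_sum += np.dot(g, g)
        step = 1.0 / np.sqrt(eps + grad_sq_sum)   # adaptive step-size

        x_lead = x - step * g                     # extrapolation (leading) step
        g_lead = F(x_lead)
        x = x - step * g_lead                     # update with leading gradient

        avg += step * x_lead                      # accumulate weighted average
        weight += step

    return avg / weight


# Example: bilinear saddle point f(u, v) = u * v, so F(u, v) = (v, -u).
if __name__ == "__main__":
    F = lambda x: np.array([x[1], -x[0]])
    print(adaptive_extra_gradient(F, x0=[1.0, 1.0]))  # approaches the saddle (0, 0)
```

On this bilinear example, plain gradient descent-ascent cycles or diverges, whereas the extra-gradient update with a diminishing step drives the averaged iterate toward the saddle point, which is the behavior the extrapolation step is meant to buy.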