Paper Title

An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization

Paper Authors

Lesi Chen, Haishan Ye, Luo Luo

Paper Abstract

This paper studies stochastic nonconvex-strongly-concave minimax optimization over a multi-agent network. We propose an efficient algorithm, called the Decentralized Recursive gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding an $ε$-stationary point. Concretely, it requires $\mathcal{O}(\min(κ^3ε^{-3}, κ^2\sqrt{N}ε^{-2}))$ stochastic first-order oracle (SFO) calls and $\tilde{\mathcal{O}}(κ^2ε^{-2})$ communication rounds, where $κ$ is the condition number and $N$ is the total number of individual functions. Our numerical experiments also validate the superiority of DREAM over previous methods.
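The abstract only states the complexity bounds and gives no algorithmic detail. As a rough, non-authoritative illustration of the decentralized minimax setting that DREAM targets, the sketch below implements plain decentralized stochastic gradient descent ascent with gossip averaging over a mixing matrix; this is not the DREAM algorithm itself (which additionally uses recursive variance reduction to reach the stated SFO complexity). All names here (`decentralized_sgda`, `grad_x`, `grad_y`, the mixing matrix `W`) are hypothetical and introduced only for this example.

```python
import numpy as np

def decentralized_sgda(grad_x, grad_y, W, x0, y0, eta_x, eta_y, T, rng):
    """Generic decentralized stochastic GDA sketch (not DREAM).

    Each of the m agents holds local copies (x_i, y_i), takes a local
    stochastic gradient step (descent in x, ascent in y), then averages
    with its neighbors through the doubly stochastic mixing matrix W
    (W[i, j] > 0 iff agents i and j communicate; rows/columns sum to 1).
    """
    m = W.shape[0]
    X = np.tile(x0, (m, 1))  # agents' copies of the minimization variable
    Y = np.tile(y0, (m, 1))  # agents' copies of the maximization variable
    for _ in range(T):
        # one stochastic first-order oracle call per agent per variable
        Gx = np.stack([grad_x(i, X[i], Y[i], rng) for i in range(m)])
        Gy = np.stack([grad_y(i, X[i], Y[i], rng) for i in range(m)])
        X = W @ (X - eta_x * Gx)  # gossip-average after the descent step
        Y = W @ (Y + eta_y * Gy)  # gossip-average after the ascent step
    return X.mean(axis=0), Y.mean(axis=0)

# Toy usage: f_i(x, y) = 0.5*||x||^2 + <x, y> - 0.5*||y||^2, which is
# strongly concave in y; noisy gradients stand in for the SFO calls.
rng = np.random.default_rng(0)
gx = lambda i, x, y, rng: x + y + 0.01 * rng.standard_normal(x.shape)
gy = lambda i, x, y, rng: x - y + 0.01 * rng.standard_normal(y.shape)
W = np.full((4, 4), 0.25)  # 4 fully connected agents, uniform mixing
x_bar, y_bar = decentralized_sgda(gx, gy, W, np.ones(2), np.zeros(2),
                                  0.1, 0.1, 500, rng)
print(x_bar, y_bar)  # both should approach the saddle point at zero
```

In this toy problem the saddle point is at the origin, so both returned averages should be close to zero; swapping in each agent's own data-dependent gradient oracles recovers the general decentralized finite-sum setting the abstract describes.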
