Paper Title

On the Convergence of Nested Decentralized Gradient Methods with Multiple Consensus and Gradient Steps

Paper Authors

Albert S. Berahas, Raghu Bollapragada, Ermin Wei

Paper Abstract

In this paper, we consider minimizing a sum of local convex objective functions in a distributed setting, where the cost of communication and/or computation can be expensive. We extend and generalize the analysis of a class of nested gradient-based distributed algorithms (NEAR-DGD; Berahas, Bollapragada, Keskar, and Wei, 2018) to account for multiple gradient steps at every iteration. We show the effect of performing multiple gradient steps on the rate of convergence and on the size of the neighborhood of convergence, and prove R-linear convergence to the exact solution with a fixed number of gradient steps and an increasing number of consensus steps. We test the performance of the generalized method on quadratic functions and show the effect of multiple consensus and gradient steps in terms of iterations, gradient evaluations, communications, and cost.
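
The nested structure described in the abstract (local gradient steps on each node's objective, followed by consensus averaging over a mixing matrix) can be illustrated with a short simulation. The sketch below is a minimal NEAR-DGD-style loop on synthetic quadratics; the ring-graph mixing matrix, step size `alpha`, and step counts `t_grad` and `t_cons` are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3                      # number of nodes, problem dimension

# Local quadratics f_i(x) = 0.5 x^T A_i x + b_i^T x (A_i PSD, so convex).
A = [np.diag(rng.uniform(1.0, 2.0, d)) for _ in range(n)]
b = [rng.standard_normal(d) for _ in range(n)]

def local_grad(i, x):
    return A[i] @ x + b[i]

# Doubly stochastic mixing matrix for a ring graph (lazy Metropolis weights).
W = np.eye(n) / 2
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.25

X = np.zeros((n, d))             # row i holds node i's local iterate
alpha, t_grad, t_cons = 0.05, 2, 3   # assumed step size and step counts

for k in range(200):
    # Multiple local gradient steps (computation phase).
    for _ in range(t_grad):
        X = X - alpha * np.vstack([local_grad(i, X[i]) for i in range(n)])
    # Multiple consensus steps (communication phase): repeated neighbor averaging.
    for _ in range(t_cons):
        X = W @ X

# Compare against the centralized minimizer of sum_i f_i.
x_star = np.linalg.solve(sum(A), -sum(b))
print("max node error:", np.abs(X - x_star).max())
```

With a fixed `t_cons`, the iterates settle in a neighborhood of the minimizer; per the abstract, increasing the number of consensus steps across iterations shrinks this neighborhood and yields R-linear convergence to the exact solution.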
