Paper Title

A Unified Linear Speedup Analysis of Federated Averaging and Nesterov FedAvg

Paper Authors

Zhaonan Qu, Kaixiang Lin, Zhaojian Li, Jiayu Zhou, Zhengyuan Zhou

Paper Abstract

Federated learning (FL) learns a model jointly from a set of participating devices without sharing each other's privately held data. The characteristics of non-i.i.d. data across the network, low device participation, high communication costs, and the mandate that data remain private bring challenges in understanding the convergence of FL algorithms, particularly regarding how convergence scales with the number of participating devices. In this paper, we focus on Federated Averaging (FedAvg), one of the most popular and effective FL algorithms in use today, as well as its Nesterov accelerated variant, and conduct a systematic study of how their convergence scales with the number of participating devices under non-i.i.d. data and partial participation in convex settings. We provide a unified analysis that establishes convergence guarantees for FedAvg under strongly convex, convex, and overparameterized strongly convex problems. We show that FedAvg enjoys linear speedup in each case, although with different convergence rates and communication efficiencies. For strongly convex and convex problems, we also characterize the corresponding convergence rates for the Nesterov accelerated FedAvg algorithm, which are the first linear speedup guarantees for momentum variants of FedAvg in convex settings. Empirical studies of the algorithms in various settings support our theoretical results.
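
The FedAvg procedure analyzed in the abstract alternates a few local gradient steps on each sampled device with a server-side average of the sampled devices' models. Below is a minimal illustrative sketch in Python/NumPy, assuming a synthetic non-i.i.d. least-squares problem; the constants (N_DEVICES, S_SAMPLED, E_LOCAL, ROUNDS, LR) and the quadratic objective are illustrative choices, not the paper's experimental setup. The Nesterov accelerated variant would additionally carry a momentum term in the local updates, which is omitted here.

# Minimal FedAvg sketch with partial device participation (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

N_DEVICES = 20      # total devices
S_SAMPLED = 5       # devices sampled each round (partial participation)
E_LOCAL = 10        # local gradient steps between communications
ROUNDS = 50         # communication rounds
LR = 0.05           # local learning rate
DIM = 10

# Non-i.i.d. data: each device holds its own random least-squares problem.
devices = []
for _ in range(N_DEVICES):
    A = rng.normal(size=(30, DIM))
    x_star = rng.normal(size=DIM)
    b = A @ x_star + 0.1 * rng.normal(size=30)
    devices.append((A, b))

def local_grad(w, A, b):
    # Gradient of the device's least-squares loss (1/2m) * ||A w - b||^2.
    return A.T @ (A @ w - b) / len(b)

w_global = np.zeros(DIM)
for t in range(ROUNDS):
    sampled = rng.choice(N_DEVICES, size=S_SAMPLED, replace=False)
    local_models = []
    for i in sampled:
        A, b = devices[i]
        w = w_global.copy()
        for _ in range(E_LOCAL):       # E local gradient steps
            w -= LR * local_grad(w, A, b)
        local_models.append(w)
    # Server step: average the sampled devices' local models.
    w_global = np.mean(local_models, axis=0)

loss = np.mean([np.mean((A @ w_global - b) ** 2) for A, b in devices])
print(f"average loss after {ROUNDS} rounds: {loss:.4f}")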
