Paper Title

FedSKETCH: Communication-Efficient and Private Federated Learning via Sketching

Paper Authors

Farzin Haddadpour, Belhal Karimi, Ping Li, Xiaoyun Li

Paper Abstract

Communication complexity and privacy are the two key challenges in Federated Learning, where the goal is to perform distributed learning across a large number of devices. In this work, we introduce the FedSKETCH and FedSKETCHGATE algorithms to address both challenges in Federated Learning jointly; these algorithms are intended for the homogeneous and heterogeneous data distribution settings, respectively. The key idea is to compress the accumulation of local gradients using a count sketch, so the server does not have access to the gradients themselves, which provides privacy. Furthermore, due to the low dimension of the sketch used, our method also exhibits communication efficiency. We provide sharp convergence guarantees for the aforementioned schemes. Finally, we back up our theory with a variety of experiments.
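To illustrate the compression mechanism the abstract refers to, below is a minimal, hedged sketch (in Python/NumPy) of how a generic count sketch compresses a high-dimensional gradient-like vector into a small table and approximately recovers it via median estimation. This is not the paper's FedSKETCH or FedSKETCHGATE implementation; the class name `CountSketch`, its methods, and all parameter choices (rows, cols, seed) are hypothetical and for illustration only.

```python
# A minimal, illustrative count sketch of a gradient vector (NumPy only).
# Generic count-sketch technique under assumed parameters, not the paper's
# FedSKETCH implementation; CountSketch/compress/decompress are hypothetical names.
import numpy as np


class CountSketch:
    def __init__(self, rows, cols, dim, seed=0):
        rng = np.random.default_rng(seed)
        # One hash bucket and one random sign per (row, coordinate) pair.
        self.buckets = rng.integers(0, cols, size=(rows, dim))
        self.signs = rng.choice([-1.0, 1.0], size=(rows, dim))
        self.table = np.zeros((rows, cols))

    def compress(self, vec):
        # Accumulate each coordinate into its hashed bucket with its sign.
        self.table[:] = 0.0
        for r in range(self.table.shape[0]):
            np.add.at(self.table[r], self.buckets[r], self.signs[r] * vec)
        return self.table  # rows * cols numbers instead of dim numbers

    def decompress(self):
        # Estimate each coordinate as the median of its signed bucket values.
        rows, _ = self.buckets.shape
        est = np.stack([self.signs[r] * self.table[r, self.buckets[r]]
                        for r in range(rows)])
        return np.median(est, axis=0)


# Toy usage: a sparse 10,000-dimensional "gradient" with a few heavy coordinates
# is compressed into a 5 x 500 table and then approximately recovered.
rng = np.random.default_rng(1)
d = 10_000
grad = np.zeros(d)
grad[:20] = 5.0 * rng.standard_normal(20)
cs = CountSketch(rows=5, cols=500, dim=d)
table = cs.compress(grad)        # the small table a device would transmit
recovered = cs.decompress()      # approximate recovery on the receiver side
# Typically prints 0.0 at this sketch size, since hash collisions among the
# heavy coordinates are rare and the median filters out the rest.
print(np.max(np.abs(recovered[:20] - grad[:20])))
```

The sketch table is what gets communicated: its size depends on the number of rows and buckets rather than the model dimension, which is the source of the communication savings described above.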
