Paper Title

APPFL: Open-Source Software Framework for Privacy-Preserving Federated Learning

Authors

Minseok Ryu, Youngdae Kim, Kibaek Kim, Ravi K. Madduri

Abstract

Federated learning (FL) enables training models at different sites and updating the weights from the training instead of transferring data to a central location and training as in classical machine learning. The FL capability is especially important to domains such as biomedicine and smart grid, where data may not be shared freely or stored at a central location because of policy challenges. Thanks to the capability of learning from decentralized datasets, FL is now a rapidly growing research field, and numerous FL frameworks have been developed. In this work, we introduce APPFL, the Argonne Privacy-Preserving Federated Learning framework. APPFL allows users to leverage implemented privacy-preserving algorithms, implement new algorithms, and simulate and deploy various FL algorithms with privacy-preserving techniques. The modular framework enables users to customize the components for algorithms, privacy, communication protocols, neural network models, and user data. We also present a new communication-efficient algorithm based on an inexact alternating direction method of multipliers. The algorithm requires significantly less communication between the server and the clients than does the current state of the art. We demonstrate the computational capabilities of APPFL, including differentially private FL on various test datasets and its scalability, by using multiple algorithms and datasets on different computing environments.
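The abstract's communication-efficient algorithm is built on an inexact alternating direction method of multipliers (ADMM). As a rough illustration of the general idea (this is a minimal sketch of standard consensus ADMM for federated learning, not APPFL's actual algorithm; all variable names, the quadratic local losses, and the hyperparameters are assumptions for the example), each client updates a local model against the server's global model plus a dual term, and the server aggregates:

```python
import numpy as np

# Sketch of consensus ADMM for FL with quadratic local losses
# f_i(w) = 0.5 * ||A_i w - b_i||^2, so the client step is closed-form.
# Real FL clients would instead run a few (inexact) SGD steps, which is
# where communication savings over per-step exchange come from.

rng = np.random.default_rng(0)
d, rho, n_clients = 5, 1.0, 3
A = [rng.standard_normal((20, d)) for _ in range(n_clients)]
b = [Ai @ rng.standard_normal(d) for Ai in A]  # synthetic local targets

z = np.zeros(d)                               # global (server) model
u = [np.zeros(d) for _ in range(n_clients)]   # per-client dual variables

for _ in range(200):
    # Client update: argmin_w f_i(w) + u_i^T (w - z) + (rho/2)||w - z||^2
    w = [np.linalg.solve(Ai.T @ Ai + rho * np.eye(d),
                         Ai.T @ bi + rho * z - ui)
         for Ai, bi, ui in zip(A, b, u)]
    # Server update: average of client models plus scaled duals
    z = sum(wi + ui / rho for wi, ui in zip(w, u)) / n_clients
    # Dual update pushes each client toward consensus with z
    u = [ui + rho * (wi - z) for ui, wi in zip(u, w)]
```

Only the model and dual vectors cross the network each round; the raw data `A_i`, `b_i` never leaves the client, which is the core FL property the abstract describes.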
