Paper Title

Towards Non-I.I.D. and Invisible Data with FedNAS: Federated Deep Learning via Neural Architecture Search

Paper Authors

Chaoyang He, Murali Annavaram, Salman Avestimehr

Paper Abstract

Federated Learning (FL) has been proven to be an effective learning framework when data cannot be centralized due to privacy, communication costs, and regulatory restrictions. When training deep learning models under an FL setting, people typically employ a predefined model architecture discovered in a centralized environment. However, this predefined architecture may not be the optimal choice, because it may not fit data that are not identically and independently distributed (non-IID). Thus, we advocate automating federated learning (AutoFL) to improve model accuracy and reduce the manual design effort. We specifically study AutoFL via Neural Architecture Search (NAS), which can automate the design process. We propose a Federated NAS (FedNAS) algorithm that helps scattered workers collaboratively search for a better architecture with higher accuracy. We also build a system based on FedNAS. Our experiments on a non-IID dataset show that the architecture searched by FedNAS can outperform the manually predefined architecture.
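The abstract does not spell out the search and aggregation procedure. The following is a minimal sketch of one plausible FedNAS-style communication round, assuming a DARTS-style differentiable search in which each worker holds model weights and architecture parameters, and the server performs FedAvg-style weighted averaging of both. The function names (`local_search_step`, `server_aggregate`) and the placeholder gradients are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a federated NAS round: local search on non-IID data, then
# FedAvg-style aggregation of both weights (w) and architecture params (alpha).
import numpy as np

def local_search_step(w, alpha, data, lr=0.01):
    """Hypothetical local update: one gradient-like step on the weights and the
    architecture parameters using the worker's private (possibly non-IID) data.
    The 'gradients' below are placeholders derived from local data statistics."""
    grad_w = data.mean() * np.sign(w)          # stand-in for dL/dw
    grad_alpha = data.std() * np.sign(alpha)   # stand-in for dL/dalpha
    return w - lr * grad_w, alpha - lr * grad_alpha

def server_aggregate(updates, sizes):
    """Average worker updates weighted by local sample counts (FedAvg-style)."""
    total = sum(sizes)
    w_avg = sum(n / total * w for (w, _), n in zip(updates, sizes))
    alpha_avg = sum(n / total * a for (_, a), n in zip(updates, sizes))
    return w_avg, alpha_avg

rng = np.random.default_rng(0)
w_global, alpha_global = rng.normal(size=8), rng.normal(size=4)
# Simulated non-IID local datasets: each worker samples a different distribution.
workers = [rng.normal(loc=k, scale=1.0, size=100 * (k + 1)) for k in range(3)]

for round_idx in range(5):                     # communication rounds
    updates, sizes = [], []
    for data in workers:
        w, alpha = w_global.copy(), alpha_global.copy()
        for _ in range(2):                     # local search epochs
            w, alpha = local_search_step(w, alpha, data)
        updates.append((w, alpha))
        sizes.append(len(data))
    w_global, alpha_global = server_aggregate(updates, sizes)

print("searched architecture parameters:", np.round(alpha_global, 3))
```

In a real system, the final `alpha_global` would be discretized into a concrete architecture and the weights retrained; the sketch only illustrates how architecture search and federated averaging can be interleaved across workers holding non-IID data.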
