Paper Title

Client Selection for Federated Bayesian Learning

Paper Authors

Jiarong Yang, Yuan Liu, Rahif Kassab

Abstract

Distributed Stein Variational Gradient Descent (DSVGD) is a non-parametric distributed learning framework for federated Bayesian learning, where multiple clients jointly train a machine learning model by communicating a number of non-random and interacting particles with the server. Since communication resources are limited, selecting the clients with the most informative local learning updates can improve the model convergence and communication efficiency. In this paper, we propose two selection schemes for DSVGD based on Kernelized Stein Discrepancy (KSD) and Hilbert Inner Product (HIP). We derive the upper bound on the decrease of the global free energy per iteration for both schemes, which is then minimized to speed up the model convergence. We evaluate and compare our schemes with conventional schemes in terms of model accuracy, convergence speed, and stability using various learning tasks and datasets.
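
To make the KSD-based selection idea concrete, below is a minimal sketch, not the paper's DSVGD implementation: each client's particle set is scored by its Kernelized Stein Discrepancy to a target score function, and the client with the largest discrepancy is selected. The RBF kernel, the `score_fn` interface (the gradient of the log target density), and the greedy argmax selection rule are all illustrative assumptions.

```python
# Minimal sketch of KSD-based client scoring (illustrative only, not the
# paper's DSVGD algorithm). Assumes an RBF kernel and a caller-supplied
# score_fn(x) = grad log p(x) for the target distribution.
import numpy as np

def ksd_squared(particles, score_fn, h=1.0):
    """V-statistic estimate of the squared Kernelized Stein Discrepancy
    between a particle set and the target whose score is score_fn,
    using an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2))."""
    x = np.asarray(particles, dtype=float)      # (n, d) particles
    n, d = x.shape
    s = score_fn(x)                             # (n, d) scores at the particles
    diff = x[:, None, :] - x[None, :, :]        # (n, n, d) pairwise x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)             # (n, n) squared distances
    k = np.exp(-sq / (2.0 * h ** 2))            # RBF kernel matrix

    # Stein kernel u_p(x_i, x_j), term by term.
    term1 = (s @ s.T) * k                                    # s(x)^T s(y) k
    term2 = np.einsum('id,ijd->ij', s, diff) / h ** 2 * k    # s(x)^T grad_y k
    term3 = -np.einsum('ijd,jd->ij', diff, s) / h ** 2 * k   # grad_x k^T s(y)
    term4 = (d / h ** 2 - sq / h ** 4) * k                    # tr(grad_x grad_y k)
    return float(np.mean(term1 + term2 + term3 + term4))

def select_client(client_particles, score_fn, h=1.0):
    """Return the index of the client whose particles are furthest from the
    target in KSD, i.e. (heuristically) the most informative update."""
    scores = [ksd_squared(p, score_fn, h) for p in client_particles]
    return int(np.argmax(scores))

# Toy usage: two clients, standard normal target (score = -x).
rng = np.random.default_rng(0)
clients = [rng.normal(0.0, 1.0, size=(50, 2)),   # already close to the target
           rng.normal(3.0, 1.0, size=(50, 2))]   # biased, should be selected
print(select_client(clients, score_fn=lambda x: -x))  # -> 1
```

In this toy example the second client's particles are biased away from the target, so its KSD is larger and it is picked; the paper instead chooses clients by minimizing an upper bound on the per-iteration decrease of the global free energy, which this greedy argmax only loosely mimics.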
