Paper Title
Federated Learning with Only Positive Labels
Paper Authors
Paper Abstract
We consider learning a multi-class classification model in the federated setting, where each user has access to the positive data associated with only a single class. As a result, during each federated learning round, the users need to locally update the classifier without having access to the features and the model parameters for the negative classes. Thus, naively employing conventional decentralized learning such as distributed SGD or Federated Averaging may lead to trivial or extremely poor classifiers. In particular, for embedding-based classifiers, all the class embeddings might collapse to a single point. To address this problem, we propose a generic framework for training with only positive labels, namely Federated Averaging with Spreadout (FedAwS), where the server imposes a geometric regularizer after each round to encourage classes to be spread out in the embedding space. We show, both theoretically and empirically, that FedAwS can almost match the performance of conventional learning where users have access to negative labels. We further extend the proposed method to settings with large output spaces.
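To make the server-side step concrete, below is a minimal sketch (not the authors' code) of a geometric spreadout regularizer of the kind the abstract describes: a hinge-style penalty that pushes every pair of class embeddings at least some margin apart, with the server taking one gradient step on it after averaging the clients' updates. The matrix `W` (one row per class embedding), the margin `nu`, and the step size `lr` are assumed names and knobs for illustration.

```python
import torch

def spreadout_regularizer(W: torch.Tensor, nu: float = 1.0) -> torch.Tensor:
    """Penalize pairs of class embeddings (rows of W) that are closer
    than the margin `nu` in Euclidean distance."""
    dists = torch.cdist(W, W)  # (C, C) pairwise distances
    offdiag = ~torch.eye(W.size(0), dtype=torch.bool)  # exclude self-pairs
    return torch.clamp(nu - dists[offdiag], min=0.0).pow(2).sum()

def server_spreadout_step(W: torch.Tensor, nu: float = 1.0, lr: float = 0.1) -> torch.Tensor:
    """One server-side gradient step on the spreadout penalty,
    applied after averaging the users' locally updated parameters."""
    W = W.clone().requires_grad_(True)
    loss = spreadout_regularizer(W, nu)
    loss.backward()
    with torch.no_grad():
        return W - lr * W.grad
```

Under this sketch, the spreadout step is the only place negative-class information enters: each user optimizes only its own positive class locally, and the server's regularizer keeps the class embeddings from collapsing to a single point.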