Paper Title

Model Fusion with Kullback--Leibler Divergence

Authors

Sebastian Claici, Mikhail Yurochkin, Soumya Ghosh, Justin Solomon

Abstract

We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors and proceeds using a simple assign-and-average approach. The components of the dataset posteriors are assigned to the proposed global model components by solving a regularized variant of the assignment problem. The global components are then updated based on these assignments by their mean under a KL divergence. For exponential family variational distributions, our formulation leads to an efficient non-parametric algorithm for computing the fused model. Our algorithm is easy to describe and implement, efficient, and competitive with state-of-the-art on motion capture analysis, topic modeling, and federated learning of Bayesian neural networks.
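As an illustration of the assign-and-average scheme the abstract describes, here is a minimal sketch for mean-field posteriors with 1-D Gaussian components. It is not the paper's implementation: it uses the unregularized assignment problem (via `scipy.optimize.linear_sum_assignment`) in place of the regularized variant, and updates each global component by moment matching, which minimizes the sum of KL divergences from the assigned components to the global one within the Gaussian family. All names (`kl_gauss`, `fuse`) are our own.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def kl_gauss(m1, v1, m2, v2):
    """KL(N(m1, v1) || N(m2, v2)) for 1-D Gaussians."""
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)


def fuse(local_means, local_vars, global_means, global_vars):
    """One assign-and-average pass over per-dataset posterior components.

    Each local posterior's components are matched to global components
    by solving an assignment problem with KL costs; each global
    component is then replaced by the KL-mean (moment match) of the
    components assigned to it.
    """
    assigned_m = [[] for _ in global_means]
    assigned_v = [[] for _ in global_means]
    for means, varis in zip(local_means, local_vars):
        # cost[i, k] = KL(local component i || global component k)
        cost = np.array([[kl_gauss(m, v, gm, gv)
                          for gm, gv in zip(global_means, global_vars)]
                         for m, v in zip(means, varis)])
        rows, cols = linear_sum_assignment(cost)
        for i, k in zip(rows, cols):
            assigned_m[k].append(means[i])
            assigned_v[k].append(varis[i])
    new_means, new_vars = [], []
    for k, (gm, gv) in enumerate(zip(global_means, global_vars)):
        if assigned_m[k]:
            ms = np.array(assigned_m[k])
            vs = np.array(assigned_v[k])
            mu = ms.mean()
            # Moment matching: the Gaussian p minimizing
            # sum_j KL(q_j || p) matches the averaged first and
            # second moments of the assigned components.
            var = (vs + ms ** 2).mean() - mu ** 2
        else:
            mu, var = gm, gv  # keep unmatched global components as-is
        new_means.append(mu)
        new_vars.append(var)
    return new_means, new_vars
```

Iterating this pass until the assignments stabilize gives the fused mean-field model; the paper's regularized assignment additionally lets the number of global components grow non-parametrically, which this sketch omits.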
