Paper Title

Gradient-EM Bayesian Meta-learning

Authors

Yayi Zou, Xiaoqi Lu

Abstract

Bayesian meta-learning enables robust and fast adaptation to new tasks with uncertainty assessment. The key idea behind Bayesian meta-learning is empirical Bayes inference of hierarchical model. In this work, we extend this framework to include a variety of existing methods, before proposing our variant based on gradient-EM algorithm. Our method improves computational efficiency by avoiding back-propagation computation in the meta-update step, which is exhausting for deep neural networks. Furthermore, it provides flexibility to the inner-update optimization procedure by decoupling it from meta-update. Experiments on sinusoidal regression, few-shot image classification, and policy-based reinforcement learning show that our method not only achieves better accuracy with less computation cost, but is also more robust to uncertainty.
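The abstract's key computational claim is that the meta-update does not back-propagate through the inner-update optimization. A minimal sketch of that idea on toy 1-D tasks is below; the `inner_update`/`meta_update` names, the plain-SGD inner loop, and the move-toward-the-average meta-step are illustrative assumptions, not the paper's exact algorithm (which operates on distributions over parameters with an EM-style derivation).

```python
import numpy as np

def inner_update(theta, grad_fn, lr=0.1, steps=5):
    """Inner update (E-step sketch): adapt meta-parameters to one task
    by plain SGD. Any optimizer could be swapped in here, because the
    meta-update below never differentiates through this loop."""
    phi = theta.copy()
    for _ in range(steps):
        phi -= lr * grad_fn(phi)
    return phi

def meta_update(theta, task_grads, meta_lr=0.5):
    """Meta-update (M-step sketch): move the meta-parameters toward the
    task-adapted solutions using only the adapted parameters themselves,
    so no back-propagation through the inner loop is required."""
    adapted = [inner_update(theta, g) for g in task_grads]
    return theta + meta_lr * (np.mean(adapted, axis=0) - theta)

# Toy tasks: loss_i(x) = (x - c_i)^2, so the gradient is 2 * (x - c_i).
task_grads = [lambda x, c=c: 2.0 * (x - c) for c in (-1.0, 0.0, 1.0)]

theta = np.array([5.0])
for _ in range(50):
    theta = meta_update(theta, task_grads)
print(theta)  # converges toward the task-center mean, 0.0
```

Decoupling is visible in the structure: `meta_update` treats `inner_update` as a black box, which is why the abstract can claim flexibility in the inner-update optimization procedure.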
