Paper Title

Addressing Catastrophic Forgetting in Few-Shot Problems

Paper Authors

Pauching Yap, Hippolyt Ritter, David Barber

Paper Abstract

Neural networks are known to suffer from catastrophic forgetting when trained on sequential datasets. While there have been numerous attempts to solve this problem in large-scale supervised classification, little has been done to overcome catastrophic forgetting in few-shot classification problems. We demonstrate that the popular gradient-based model-agnostic meta-learning algorithm (MAML) indeed suffers from catastrophic forgetting and introduce a Bayesian online meta-learning framework that tackles this problem. Our framework utilises Bayesian online learning and meta-learning along with Laplace approximation and variational inference to overcome catastrophic forgetting in few-shot classification problems. The experimental evaluations demonstrate that our framework can effectively achieve this goal in comparison with various baselines. As an additional utility, we also demonstrate empirically that our framework is capable of meta-learning on sequentially arriving few-shot tasks from a stationary task distribution.
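To make the abstract's method description concrete, below is a minimal, hypothetical sketch of the generic recipe it names: Bayesian online learning with a diagonal Laplace approximation, where the approximate posterior from previously seen tasks becomes the Gaussian prior when learning the next task. It is written in PyTorch; the names `laplace_penalty` and `learn_task`, and the squared-gradient Fisher proxy for the curvature, are illustrative assumptions, not the authors' actual algorithm (which additionally combines MAML-style meta-learning with variational inference).

```python
import torch
import torch.nn as nn


def laplace_penalty(params, prior_means, prior_precisions):
    """Negative log-density of a diagonal Gaussian prior (up to a constant)."""
    return 0.5 * sum(
        (prec * (p - mu) ** 2).sum()
        for p, mu, prec in zip(params, prior_means, prior_precisions)
    )


def learn_task(model, loss_fn, batches, prior_means, prior_precisions, lr=1e-3):
    """MAP-estimate the weights for one task under the running Gaussian prior,
    then return the updated prior (mean = MAP weights, precision += curvature)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for x, y in batches:
        opt.zero_grad()
        loss = loss_fn(model(x), y) + laplace_penalty(
            model.parameters(), prior_means, prior_precisions
        )
        loss.backward()
        opt.step()
    # Diagonal curvature estimate at the MAP: squared gradients of the data
    # loss on the last batch serve as a crude Fisher-information proxy.
    model.zero_grad()
    loss_fn(model(x), y).backward()
    new_means = [p.detach().clone() for p in model.parameters()]
    new_precisions = [
        prec + p.grad.detach() ** 2
        for p, prec in zip(model.parameters(), prior_precisions)
    ]
    return new_means, new_precisions


# Toy usage: two sequential "tasks" on random data.
model = nn.Linear(4, 3)
loss_fn = nn.CrossEntropyLoss()
means = [torch.zeros_like(p) for p in model.parameters()]
precisions = [torch.full_like(p, 1e-2) for p in model.parameters()]
for _ in range(2):
    batches = [(torch.randn(8, 4), torch.randint(0, 3, (8,))) for _ in range(20)]
    means, precisions = learn_task(model, loss_fn, batches, means, precisions)
```

The quadratic penalty is what discourages the weights from drifting far from solutions for earlier tasks; in the paper's setting the analogous regularisation presumably acts at the meta-level, so that meta-learning on sequentially arriving few-shot tasks does not overwrite what was meta-learned before.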
