Paper Title

FoCL: Feature-Oriented Continual Learning for Generative Models

Paper Authors

Qicheng Lao, Mehrzad Mortazavi, Marzieh Tahaei, Francis Dutil, Thomas Fevens, Mohammad Havaei

Paper Abstract

In this paper, we propose a general framework in continual learning for generative models: Feature-oriented Continual Learning (FoCL). Unlike previous works that aim to solve the catastrophic forgetting problem by introducing regularization in the parameter space or image space, FoCL imposes regularization in the feature space. We show in our experiments that FoCL has faster adaptation to distributional changes in sequentially arriving tasks, and achieves the state-of-the-art performance for generative models in task incremental learning. We discuss choices of combined regularization spaces towards different use case scenarios for boosted performance, e.g., tasks that have high variability in the background. Finally, we introduce a forgetfulness measure that fairly evaluates the degree to which a model suffers from forgetting. Interestingly, the analysis of our proposed forgetfulness score also implies that FoCL tends to have a mitigated forgetting for future tasks.
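To make the core idea concrete, below is a minimal, hypothetical sketch of what feature-space regularization could look like for a generative model: a frozen copy of the generator from previous tasks provides targets, and the penalty compares the two generators' outputs through a fixed feature extractor rather than in parameter or image space. The network shapes, the MSE penalty, and the weighting are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of feature-space regularization for continual generative
# learning (illustrative only; not the paper's exact loss or architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F


class Generator(nn.Module):
    def __init__(self, z_dim=64, out_dim=784):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(), nn.Linear(256, out_dim))

    def forward(self, z):
        return self.net(z)


class FeatureExtractor(nn.Module):
    def __init__(self, in_dim=784, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, feat_dim))

    def forward(self, x):
        return self.net(x)


def feature_space_regularizer(generator, old_generator, feat_net, z, weight=1.0):
    """Penalize drift of the current generator from the frozen previous-task
    generator, measured in feature space (MSE on features is an assumed choice)."""
    with torch.no_grad():
        old_feats = feat_net(old_generator(z))
    new_feats = feat_net(generator(z))
    return weight * F.mse_loss(new_feats, old_feats)


# Usage: add the feature-space penalty to the current task's generative loss.
gen, old_gen, feat_net = Generator(), Generator(), FeatureExtractor()
old_gen.load_state_dict(gen.state_dict())  # stand-in for the model after previous tasks
for p in old_gen.parameters():
    p.requires_grad_(False)

z = torch.randn(32, 64)
task_loss = torch.tensor(0.0)  # placeholder for the current task's generative loss
loss = task_loss + feature_space_regularizer(gen, old_gen, feat_net, z, weight=0.1)
loss.backward()
```

The sketch only illustrates where the regularization is applied; in practice the task loss would come from the actual generative objective (e.g., an adversarial or reconstruction loss), and the feature extractor and penalty weight would follow the paper's setup.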
