Paper Title
Cognitively-Inspired Model for Incremental Learning Using a Few Examples
Paper Authors
Paper Abstract
Incremental learning attempts to develop a classifier that learns continuously from a stream of data segregated into different classes. Deep learning approaches suffer from catastrophic forgetting when learning classes incrementally, while most incremental learning approaches require a large amount of training data per class. We examine the problem of incremental learning using only a few training examples, referred to as Few-Shot Incremental Learning (FSIL). To solve this problem, we propose a novel approach inspired by the concept learning model of the hippocampus and the neocortex that represents each image class as centroids and does not suffer from catastrophic forgetting. We evaluate our approach on three class-incremental learning benchmarks, Caltech-101, CUBS-200-2011, and CIFAR-100, for both incremental and few-shot incremental learning, and show that it achieves state-of-the-art results in terms of classification accuracy over all learned classes.
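To make the centroid idea concrete, below is a minimal sketch of a nearest-centroid classifier that learns classes incrementally without revisiting old data: each new class is summarized by the mean of its (few) feature vectors, and existing centroids are never overwritten, so earlier classes are not forgotten. This is an illustration under simplifying assumptions (a single centroid per class, Euclidean distance, and a fixed pre-trained feature extractor producing the feature vectors), not the paper's exact algorithm or implementation.

```python
import numpy as np


class NearestCentroidIncremental:
    """Toy incremental classifier that stores one centroid per class.

    A new class is added by averaging the feature vectors of its few
    training examples; previously stored centroids are never modified,
    so earlier classes are not forgotten when new ones arrive.
    """

    def __init__(self):
        self.centroids = {}  # class label -> centroid feature vector

    def learn_class(self, label, features):
        # features: (n_examples, feature_dim) array, assumed to come from
        # a fixed, pre-trained feature extractor (not shown here).
        self.centroids[label] = np.mean(features, axis=0)

    def predict(self, feature):
        # Assign the label of the closest centroid (Euclidean distance).
        labels = list(self.centroids.keys())
        dists = [np.linalg.norm(feature - self.centroids[l]) for l in labels]
        return labels[int(np.argmin(dists))]


# Usage: learn two classes incrementally from five examples each,
# then classify a query feature vector.
rng = np.random.default_rng(0)
clf = NearestCentroidIncremental()
clf.learn_class("cat", rng.normal(0.0, 1.0, size=(5, 64)))
clf.learn_class("dog", rng.normal(3.0, 1.0, size=(5, 64)))
print(clf.predict(rng.normal(3.0, 1.0, size=64)))  # most likely "dog"
```

Because classes are stored as fixed summary vectors rather than weights shared across classes, adding a class never alters what was learned before, which is the property the abstract credits with avoiding catastrophic forgetting.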