Paper Title

Few-Shot Class-Incremental Learning

Paper Authors

Xiaoyu Tao, Xiaopeng Hong, Xinyuan Chang, Songlin Dong, Xing Wei, Yihong Gong

Paper Abstract

The ability to incrementally learn new classes is crucial to the development of real-world artificial intelligence systems. In this paper, we focus on a challenging but practical few-shot class-incremental learning (FSCIL) problem. FSCIL requires CNN models to incrementally learn new classes from very few labelled samples, without forgetting the previously learned ones. To address this problem, we represent the knowledge using a neural gas (NG) network, which can learn and preserve the topology of the feature manifold formed by different classes. On this basis, we propose the TOpology-Preserving knowledge InCrementer (TOPIC) framework. TOPIC mitigates the forgetting of the old classes by stabilizing NG's topology and improves the representation learning for few-shot new classes by growing and adapting NG to new training samples. Comprehensive experimental results demonstrate that our proposed method significantly outperforms other state-of-the-art class-incremental learning methods on CIFAR100, miniImageNet, and CUB200 datasets.
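The neural gas (NG) network mentioned in the abstract adapts a set of reference vectors to the feature manifold by moving every unit toward each sample, with a step size that decays with the unit's distance rank. Below is a minimal sketch of one such adaptation step in the classic Martinetz–Schulten style; it illustrates the general NG update rule, not the paper's exact FSCIL variant, and all names and hyperparameter values are illustrative.

```python
import math

def ng_step(units, x, eps=0.5, lam=1.0):
    """One neural gas adaptation step: move each unit toward sample x,
    weighted by its distance rank (closer units move more)."""
    # Squared Euclidean distance from a unit to the sample.
    dist = lambda w: sum((wi - xi) ** 2 for wi, xi in zip(w, x))
    # Rank unit indices from nearest to farthest.
    ranked = sorted(range(len(units)), key=lambda i: dist(units[i]))
    new_units = [list(w) for w in units]
    for rank, i in enumerate(ranked):
        # Rank-based neighborhood function: eps * exp(-rank / lambda).
        h = eps * math.exp(-rank / lam)
        for d in range(len(x)):
            new_units[i][d] += h * (x[d] - units[i][d])
    return new_units

units = [[0.0, 0.0], [1.0, 1.0]]
updated = ng_step(units, [0.2, 0.0])
```

Repeating this step over a stream of feature vectors (with `eps` and `lam` annealed over time) lets the units settle onto the data manifold, which is the topology the TOPIC framework then stabilizes to resist forgetting.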
