Paper Title


Novelty-Prepared Few-Shot Classification

Authors

Wang, Chao, Liu, Ruo-Ze, Ye, Han-Jia, Yu, Yang

Abstract


Few-shot classification algorithms can alleviate the data scarcity issue, which is vital in many real-world problems, by adopting models pre-trained from abundant data in other domains. However, the pre-training process is commonly unaware of the future adaptation to other concept classes. We disclose that a classically fully trained feature extractor can leave little embedding space for unseen classes, which keeps the model from fitting the new classes well. In this work, we propose a novelty-prepared loss function, called self-compacting softmax loss (SSL), for few-shot classification. SSL prevents the full occupancy of the embedding space, so the model is better prepared to learn new classes. In experiments on the CUB-200-2011 and mini-ImageNet datasets, we show that SSL leads to a significant improvement over state-of-the-art performance. This work may shed some light on considering the model capacity for few-shot classification tasks.
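The abstract does not reproduce the SSL formula, but the baseline it critiques — a feature extractor classically trained with plain softmax cross-entropy over class weight vectors, which tends to spread the seen classes across the whole embedding space — can be sketched as follows. This is a minimal illustration of the standard loss only, not of the paper's SSL; all names and shapes are illustrative.

```python
import numpy as np

def softmax_cross_entropy(embeddings, weights, labels):
    """Plain softmax cross-entropy, the classical pre-training loss
    the paper contrasts SSL against (illustrative sketch, not SSL).

    embeddings: (n, d) feature vectors from the extractor
    weights:    (d, c) one weight vector per seen base class
    labels:     (n,)   integer class ids in [0, c)
    """
    logits = embeddings @ weights                     # (n, c) class scores
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    n = len(labels)
    # negative log-likelihood of the true class, averaged over the batch
    return -np.mean(np.log(probs[np.arange(n), labels]))
```

Minimizing this loss over all base classes drives each embedding toward its own class weight vector and away from all others, which is how the extractor can end up occupying the full embedding space with the seen classes, leaving little room for novel ones.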
