Paper Title

Instance Credibility Inference for Few-Shot Learning

Authors

Yikai Wang, Chengming Xu, Chen Liu, Li Zhang, Yanwei Fu

Abstract

Few-shot learning (FSL) aims to recognize new objects with extremely limited training data for each category. Previous efforts alleviate this data-scarce problem by either leveraging the meta-learning paradigm or devising novel principles for data augmentation. In contrast, this paper presents a simple statistical approach, dubbed Instance Credibility Inference (ICI), to exploit the distribution support of unlabeled instances for few-shot learning. Specifically, we first train a linear classifier on the labeled few-shot examples and use it to infer pseudo-labels for the unlabeled data. To measure the credibility of each pseudo-labeled instance, we then solve another linear regression hypothesis by increasing the sparsity of the incidental parameters, and rank the pseudo-labeled instances by their degree of sparsity. We select the most trustworthy pseudo-labeled instances, together with the labeled examples, to re-train the linear classifier. This process is iterated until all the unlabeled samples are included in the expanded training set, i.e., the pseudo-labels for the unlabeled data pool have converged. Extensive experiments under two few-shot settings show that our simple approach establishes new state-of-the-art results on four widely used few-shot learning benchmark datasets: miniImageNet, tieredImageNet, CIFAR-FS, and CUB. Our code is available at: https://github.com/Yikai-Wang/ICI-FSL
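
To make the loop described in the abstract concrete, here is a minimal sketch in Python with NumPy and scikit-learn. It assumes features have already been extracted by a pretrained backbone, uses logistic regression as the linear classifier, and approximates the paper's regularization-path ranking with the magnitude of the L1-penalized incidental parameters at a single fixed alpha. The function names (`credibility_scores`, `ici_loop`) and the fixed-alpha proxy are illustrative assumptions, not the authors' released implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso, LogisticRegression

def credibility_scores(X, Y_onehot, alpha=0.1):
    """Smaller incidental parameters => more credible pseudo-labels.

    For the regression Y = X @ beta + gamma + eps, solving beta in closed
    form (beta = pinv(X) @ (Y - gamma)) reduces the problem to a Lasso in
    gamma alone, with the annihilator M = I - X @ pinv(X) as the design.
    """
    n = X.shape[0]
    M = np.eye(n) - X @ np.linalg.pinv(X)         # annihilator of X
    scores = np.zeros(n)
    for c in range(Y_onehot.shape[1]):            # one Lasso per class column
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(M, M @ Y_onehot[:, c])
        scores += np.abs(lasso.coef_)             # accumulate |gamma_{i,c}|
    return scores                                 # lower = more trustworthy

def ici_loop(X_l, y_l, X_u, n_classes, step=5, alpha=0.1):
    """Iterative ICI: pseudo-label, rank by credibility, grow the training set."""
    X_train, y_train = X_l.copy(), y_l.copy()
    pool = np.arange(len(X_u))                    # indices of unlabeled data
    while len(pool) > 0:
        clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        pseudo = clf.predict(X_u[pool])           # infer pseudo-labels
        feats = np.vstack([X_train, X_u[pool]])
        labels = np.concatenate([y_train, pseudo])
        Y = np.eye(n_classes)[labels]             # one-hot targets
        scores = credibility_scores(feats, Y, alpha)[len(y_train):]
        idx = np.argsort(scores)[:step]           # most credible instances
        X_train = np.vstack([X_train, X_u[pool[idx]]])
        y_train = np.concatenate([y_train, pseudo[idx]])
        mask = np.ones(len(pool), dtype=bool)
        mask[idx] = False
        pool = pool[mask]                         # shrink the unlabeled pool
    return LogisticRegression(max_iter=1000).fit(X_train, y_train)
```

The key design point is the reduction inside `credibility_scores`: once beta is solved in closed form, an instance whose incidental parameter gamma is driven to zero by the L1 penalty is well explained by the features alone, so its pseudo-label is admitted into the training set first. The paper ranks instances along the full regularization path; the single-alpha score above is a simplified stand-in for that ranking.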
