Paper Title

RNNP: A Robust Few-Shot Learning Approach

Paper Authors

Mazumder, Pratik; Singh, Pravendra; Namboodiri, Vinay P.

Paper Abstract

Learning from a few examples is an important practical aspect of training classifiers. Various works have examined this aspect quite well. However, all existing approaches assume that the few examples provided are always correctly labeled. This is a strong assumption, especially if one considers the current techniques for labeling using crowd-based labeling services. We address this issue by proposing a novel robust few-shot learning approach. Our method relies on generating robust prototypes from a set of few examples. Specifically, our method refines the class prototypes by producing hybrid features from the support examples of each class. The refined prototypes help to classify the query images better. Our method can replace the evaluation phase of any few-shot learning method that uses a nearest-neighbor prototype-based evaluation procedure, making it robust. We evaluate our method on the standard mini-ImageNet and tiered-ImageNet datasets. We perform experiments with various label corruption rates in the support examples of the few-shot classes. We obtain significant improvements over widely used few-shot learning methods, which suffer substantial performance degradation in the presence of label noise. Finally, we provide extensive ablation experiments to validate our method.
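To make the evaluation procedure described in the abstract more concrete, below is a minimal NumPy sketch of the general idea: hybrid features are produced by mixing support examples of each class, refined prototypes are formed from the original and hybrid features, and queries are assigned to the nearest prototype. The convex-mixing scheme, the number of hybrid features, and the Euclidean nearest-prototype rule are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def hybrid_features(support, n_mix=20, rng=None):
    """Mix random pairs of one class's support embeddings into hybrid
    features (assumed convex mixing; the paper's exact scheme may differ)."""
    rng = np.random.default_rng() if rng is None else rng
    k, _ = support.shape
    i = rng.integers(0, k, size=n_mix)
    j = rng.integers(0, k, size=n_mix)
    lam = rng.uniform(0.0, 1.0, size=(n_mix, 1))
    return lam * support[i] + (1.0 - lam) * support[j]

def refined_prototypes(support_by_class, n_mix=20, seed=0):
    """Refine each class prototype as the mean of its support embeddings
    together with the generated hybrid features."""
    rng = np.random.default_rng(seed)
    protos = []
    for support in support_by_class:              # support: (k_shot, dim)
        hybrids = hybrid_features(support, n_mix, rng)
        protos.append(np.vstack([support, hybrids]).mean(axis=0))
    return np.stack(protos)                       # (n_way, dim)

def classify(queries, prototypes):
    """Nearest-prototype (Euclidean) classification of query embeddings."""
    d2 = ((queries[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1)

# Toy 5-way 5-shot episode with 64-d embeddings (synthetic data).
rng = np.random.default_rng(1)
support_sets = [rng.normal(c, 1.0, size=(5, 64)) for c in range(5)]
queries = np.vstack([rng.normal(c, 1.0, size=(3, 64)) for c in range(5)])
protos = refined_prototypes(support_sets)
print(classify(queries, protos))
```

In this sketch the refinement step can be dropped into any nearest-neighbor prototype-based evaluation pipeline by replacing the plain per-class mean with refined_prototypes.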
