Paper Title
Powering Finetuning in Few-Shot Learning: Domain-Agnostic Bias Reduction with Selected Sampling
Paper Authors
Paper Abstract
In recent works, a deep network trained on the meta-training set has served as a strong baseline for few-shot learning. In this paper, we go further and refine novel-class features by finetuning the trained deep network. Finetuning is designed to reduce biases in novel-class feature distributions, which we decompose into two aspects: class-agnostic and class-specific biases. Class-agnostic bias is the distribution shift introduced by domain difference, for which we propose a Distribution Calibration Module (DCM). DCM has the desirable properties of eliminating domain difference and enabling fast feature adaptation during optimization. Class-specific bias is the biased estimation that results from using only a few samples per novel class, for which we propose Selected Sampling (SS). Without inferring the actual class distribution, SS samples from proposal distributions centered around support-set samples. By powering finetuning with DCM and SS, we achieve state-of-the-art results on Meta-Dataset, with consistent performance boosts across ten datasets from different domains. We believe our simple yet effective method demonstrates its potential for practical few-shot applications.
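To make the two ideas named in the abstract concrete, below is a minimal NumPy sketch: a support-statistics normalization standing in for DCM, and Gaussian proposals around support samples with a distance-based filter standing in for SS. The abstract does not specify either module's implementation, so the normalization scheme, the Gaussian proposal, the `scale` parameter, and the selection rule are all our assumptions for illustration, not the paper's method.

```python
import numpy as np

def calibrate_features(feats, support_feats, eps=1e-6):
    """Recenter and rescale features with support-set statistics.

    Illustrative stand-in for the paper's Distribution Calibration
    Module (DCM); the real module and its training procedure are not
    described in the abstract, so this normalization is an assumption.
    """
    mu = support_feats.mean(axis=0, keepdims=True)
    sigma = support_feats.std(axis=0, keepdims=True) + eps
    return (feats - mu) / sigma

def selected_sampling(support_feats, support_labels, n_draws=10,
                      scale=0.1, keep_ratio=0.5, rng=None):
    """Draw virtual features from Gaussian proposals centered at each
    support sample, then keep only the draws nearest to their center.

    The Gaussian proposal, the isotropic `scale`, and the distance-based
    selection rule are hypothetical: the abstract only states that SS
    samples from proposal distributions around support-set samples.
    """
    rng = np.random.default_rng() if rng is None else rng
    kept_feats, kept_labels = [], []
    for x, y in zip(support_feats, support_labels):
        draws = rng.normal(loc=x, scale=scale, size=(n_draws, x.shape[0]))
        dist = np.linalg.norm(draws - x, axis=1)
        keep = np.argsort(dist)[: max(1, int(keep_ratio * n_draws))]
        kept_feats.append(draws[keep])
        kept_labels.extend([y] * len(keep))
    return np.concatenate(kept_feats), np.array(kept_labels)

# Toy usage: a 5-way 1-shot episode with 64-dim features.
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 64))
labels = np.arange(5)
support = calibrate_features(support, support)
aug_x, aug_y = selected_sampling(support, labels, rng=rng)
print(aug_x.shape, aug_y.shape)  # (25, 64) (25,)
```

In the paper's setting, such virtual features would be fed back into finetuning so that the classifier sees a less biased estimate of each novel-class distribution than the few support samples alone provide.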