Paper Title

Class-Specific Channel Attention for Few-Shot Learning

Paper Authors

Ying-Yu Chen, Jun-Wei Hsieh, Ming-Ching Chang

Paper Abstract

Few-Shot Learning (FSL) has attracted growing attention in computer vision due to its capability to train models without requiring excessive data. FSL is challenging because the training and testing categories (the base vs. novel sets) can be largely diversified. Conventional transfer-based solutions, which aim to transfer knowledge learned from large labeled training sets to target testing sets, are limited because the critical adverse impact of the shift in task distribution is not adequately addressed. In this paper, we extend transfer-based methods by incorporating the concepts of metric learning and channel attention. To better exploit the feature representations extracted by the feature backbone, we propose the Class-Specific Channel Attention (CSCA) module, which learns to highlight the discriminative channels of each class by assigning each class its own CSCA weight vector. Unlike general attention modules designed to learn global class features, the CSCA module learns local, class-specific features with highly efficient computation. We evaluate the performance of the CSCA module on standard benchmarks including miniImageNet, Tiered-ImageNet, CIFAR-FS, and CUB-200-2011. Experiments are performed in inductive and in-domain/cross-domain settings, and we achieve new state-of-the-art results.
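To make the mechanism described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of the core idea as we read it: each class owns a learnable channel-weight vector that rescales pooled backbone features before a metric comparison against class prototypes. The module name, tensor shapes, sigmoid squashing, and prototype scoring are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class CSCASketch(nn.Module):
    """Toy class-specific channel attention: each class owns one
    learnable weight vector that rescales the feature channels.
    Names, shapes, and the sigmoid squashing are illustrative
    assumptions, not the paper's implementation."""

    def __init__(self, num_classes: int, num_channels: int):
        super().__init__()
        # One channel-attention vector per class.
        self.class_weights = nn.Parameter(torch.ones(num_classes, num_channels))

    def attention(self) -> torch.Tensor:
        # Squash to (0, 1) so each class softly selects its channels.
        return torch.sigmoid(self.class_weights)  # (N, C)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (B, C) pooled backbone embeddings.
        # Broadcast: every embedding is reweighted once per class.
        return features.unsqueeze(1) * self.attention().unsqueeze(0)  # (B, N, C)


# Usage sketch: score queries against prototypes in each class's
# reweighted channel space (a plausible metric-learning pairing).
csca = CSCASketch(num_classes=5, num_channels=640)  # e.g., a 5-way episode
queries = torch.randn(8, 640)                       # 8 query embeddings
prototypes = torch.randn(5, 640)                    # one prototype per class
reweighted = csca(queries)                          # (8, 5, 640)
protos_w = prototypes * csca.attention()            # (5, 640)
logits = -((reweighted - protos_w.unsqueeze(0)) ** 2).sum(dim=-1)  # (8, 5)
```

Note that each class's vector adds only C parameters applied by elementwise multiplication, so the per-class reweighting costs almost nothing on top of the backbone, which is consistent with the abstract's claim of highly efficient computation.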
