Paper Title

Class Interference Regularization

Authors

Bharti Munjal, Sikandar Amin, Fabio Galasso

Abstract


Contrastive losses yield state-of-the-art performance for person re-identification, face verification and few-shot learning. They have recently outperformed the cross-entropy loss on classification at the ImageNet scale and outperformed all self-supervision prior results by a large margin (SimCLR). Simple and effective regularization techniques such as label smoothing and self-distillation do not apply anymore, because they act on multinomial label distributions, adopted in cross-entropy losses, and not on tuple comparative terms, which characterize the contrastive losses. Here we propose a novel, simple and effective regularization technique, the Class Interference Regularization (CIR), which applies to cross-entropy losses but is especially effective on contrastive losses. CIR perturbs the output features by randomly moving them towards the average embeddings of the negative classes. To the best of our knowledge, CIR is the first regularization technique to act on the output features. In experimental evaluation, the combination of CIR and a plain Siamese net with triplet loss yields the best few-shot learning performance on the challenging tieredImageNet. CIR also improves the state-of-the-art technique in person re-identification on the Market-1501 dataset, based on triplet loss, and the state-of-the-art technique in person search on the CUHK-SYSU dataset, based on a cross-entropy loss. Finally, on the task of classification CIR performs on par with the popular label smoothing, as demonstrated for CIFAR-10 and -100.
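The abstract's core idea, perturbing output features towards the average embeddings of negative classes, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the perturbation probability `p`, the mixing coefficient `lam`, and the uniform choice of negative class are assumptions; the paper's exact perturbation schedule may differ.

```python
import numpy as np

def class_interference_regularization(features, labels, class_means,
                                      p=0.5, lam=0.2, rng=None):
    """Sketch of CIR: with probability p, shift each output feature
    towards the mean embedding of a randomly chosen negative class.

    features    : (N, D) array of output features
    labels      : length-N sequence of integer class labels
    class_means : (C, D) array of per-class average embeddings
    p, lam      : hypothetical hyperparameters (not from the paper)
    """
    rng = np.random.default_rng() if rng is None else rng
    out = features.copy()
    n_classes = class_means.shape[0]
    for i, y in enumerate(labels):
        if rng.random() < p:
            # Pick a negative class (any class other than y) uniformly.
            neg = rng.integers(n_classes - 1)
            if neg >= y:
                neg += 1
            # Convex move of the feature towards the negative class mean.
            out[i] = (1 - lam) * out[i] + lam * class_means[neg]
    return out
```

The perturbed features would then be fed to the contrastive (e.g. triplet) or cross-entropy loss in place of the originals during training; at test time no perturbation is applied.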
