Paper Title

Beyond Triplet Loss: Person Re-identification with Fine-grained Difference-aware Pairwise Loss

Paper Authors

Cheng Yan, Guansong Pang, Xiao Bai, Jun Zhou, Lin Gu

Paper Abstract

Person Re-IDentification (ReID) aims at re-identifying persons from different viewpoints across multiple cameras. Capturing the fine-grained appearance differences is often the key to accurate person ReID, because many identities can be differentiated only when looking into these fine-grained differences. However, most state-of-the-art person ReID approaches, typically driven by a triplet loss, fail to effectively learn the fine-grained features as they are focused more on differentiating large appearance differences. To address this issue, we introduce a novel pairwise loss function that enables ReID models to learn the fine-grained features by adaptively enforcing an exponential penalization on the images of small differences and a bounded penalization on the images of large differences. The proposed loss is generic and can be used as a plugin to replace the triplet loss to significantly enhance different types of state-of-the-art approaches. Experimental results on four benchmark datasets show that the proposed loss substantially outperforms a number of popular loss functions by large margins; and it also enables significantly improved data efficiency.
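The abstract does not give the exact form of the proposed loss, but the idea it describes can be sketched: an exponential penalty that grows sharply for hard pairs (small appearance differences) and a bounded penalty for easy pairs (large differences). The PyTorch snippet below is a minimal illustrative sketch under that reading, not the paper's actual formulation; the function name `fine_grained_pairwise_loss` and the hyperparameters `alpha` and `margin` are hypothetical.

```python
import torch
import torch.nn.functional as F

def fine_grained_pairwise_loss(embeddings, labels, alpha=2.0, margin=0.5):
    """Minimal sketch of a difference-aware pairwise loss.

    NOT the paper's exact formulation (the abstract does not specify one).
    Negative pairs closer than `margin` (small-difference, hard pairs) are
    penalized exponentially; negatives beyond the margin (large-difference,
    easy pairs) receive a bounded (here, zero) penalty.
    """
    # Pairwise Euclidean distances between L2-normalized embeddings.
    embeddings = F.normalize(embeddings, dim=1)
    dist = torch.cdist(embeddings, embeddings)

    # Same-identity mask, excluding trivial self-pairs on the diagonal.
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    pos_mask = same & ~eye
    neg_mask = ~same

    # Positive pairs: penalty grows exponentially with distance, so
    # positives that still differ in fine-grained details dominate the loss.
    pos_loss = torch.exp(alpha * dist[pos_mask]) - 1.0

    # Negative pairs: exponential penalty while closer than the margin
    # (hard, small-difference negatives), clamped to zero beyond it
    # (easy, large-difference negatives contribute no gradient).
    neg_loss = torch.exp(alpha * F.relu(margin - dist[neg_mask])) - 1.0

    # Assumes a PK-style batch in which every identity appears at least twice.
    return pos_loss.mean() + neg_loss.mean()
```

With the usual PK sampling (P identities, K images each), such a function could stand in for the triplet-loss term of a standard ReID training loop, e.g. `loss = fine_grained_pairwise_loss(model(images), pids)`.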
