Paper Title

One for More: Selecting Generalizable Samples for Generalizable ReID Model

Paper Authors

Enwei Zhang, Xinyang Jiang, Hao Cheng, Ancong Wu, Fufu Yu, Ke Li, Xiaowei Guo, Feng Zheng, Wei-Shi Zheng, Xing Sun

Paper Abstract

Current training objectives of existing person Re-IDentification (ReID) models only ensure that the loss decreases on the selected training batch, with no regard to the model's performance on samples outside the batch. This inevitably causes the model to over-fit the data in the dominant position (e.g., head data in imbalanced classes, easy samples, or noisy samples). We call a sample that updates the model towards generalizing on more data a generalizable sample. The latest resampling methods address the issue by designing specific criteria to select the samples that train the model to generalize better on certain types of data (e.g., hard samples, tail data), which is not adaptive to the inconsistent data distributions of real-world ReID. Therefore, instead of simply presuming which samples are generalizable, this paper proposes a one-for-more training objective that directly takes the generalization ability of the selected samples as a loss function and learns a sampler to automatically select generalizable samples. More importantly, the proposed one-for-more sampler can be seamlessly integrated into the ReID training framework, making it possible to train the ReID model and the sampler simultaneously in an end-to-end fashion. The experimental results show that our method effectively improves ReID model training and boosts the performance of ReID models.
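
Since the abstract only describes the idea at a high level, the following is a minimal, hypothetical sketch of a learnable sampler trained jointly with a model; it is not the authors' published algorithm, and all names and shapes (`sample_scores`, the toy classifier, the REINFORCE-style update) are assumptions. The sketch illustrates the one-for-more intuition: each training sample owns a learnable selection score, the model is updated on the sampled batch, and the sampler is then rewarded when that update also lowers the loss on samples outside the batch.

```python
# A minimal sketch of the idea only, NOT the authors' published algorithm.
# Assumptions (hypothetical names/shapes): every training sample owns a
# learnable selection score; the "generalization" reward is the drop in loss
# on samples *outside* the selected batch after one model update.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy stand-ins for ReID features and identity labels.
num_samples, feat_dim, num_ids, batch_size = 512, 128, 32, 64
features = torch.randn(num_samples, feat_dim)
labels = torch.randint(0, num_ids, (num_samples,))

model = torch.nn.Linear(feat_dim, num_ids)                    # toy ReID classifier
sample_scores = torch.zeros(num_samples, requires_grad=True)  # sampler parameters
model_opt = torch.optim.SGD(model.parameters(), lr=0.1)
sampler_opt = torch.optim.SGD([sample_scores], lr=0.5)

for step in range(100):
    # 1) The sampler draws a batch according to its current scores.
    probs = torch.softmax(sample_scores, dim=0)
    batch_idx = torch.multinomial(probs, batch_size, replacement=False)
    selected = set(batch_idx.tolist())
    outside_idx = torch.tensor([i for i in range(num_samples) if i not in selected])

    # Loss on the unselected samples BEFORE updating the model.
    with torch.no_grad():
        loss_out_before = F.cross_entropy(model(features[outside_idx]), labels[outside_idx])

    # 2) Update the model on the selected batch as usual.
    model_opt.zero_grad()
    F.cross_entropy(model(features[batch_idx]), labels[batch_idx]).backward()
    model_opt.step()

    # 3) Generalization reward: how much the update helped samples outside the batch.
    with torch.no_grad():
        loss_out_after = F.cross_entropy(model(features[outside_idx]), labels[outside_idx])
        reward = (loss_out_before - loss_out_after).item()

    # 4) REINFORCE-style sampler update: raise the scores of the selected
    #    samples in proportion to the reward they produced.
    sampler_opt.zero_grad()
    log_prob = torch.log(probs[batch_idx] + 1e-12).sum()
    (-reward * log_prob).backward()
    sampler_opt.step()
```

In this sketch the sampler is updated with a policy-gradient signal rather than the end-to-end integration described in the paper; it is meant only to make the "select samples that help data outside the batch" objective concrete.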
