Paper Title

Temporal Continuity Based Unsupervised Learning for Person Re-Identification

Authors

Usman Ali, Bayram Bayramli, Hongtao Lu

Abstract

Person re-identification (re-id) aims to match images of the same person taken across multiple cameras. Most existing person re-id methods require a large amount of identity-labeled data to act as a discriminative guideline for representation learning. The difficulty of manually collecting identity labels leads to poor adaptability in practical scenarios. To overcome this problem, we propose an unsupervised center-based clustering approach capable of progressively learning and exploiting the underlying re-id discriminative information from temporal continuity within a camera. We call our framework Temporal Continuity based Unsupervised Learning (TCUL). Specifically, TCUL simultaneously performs center-based clustering of the unlabeled (target) dataset and fine-tunes a convolutional neural network (CNN) pre-trained on an irrelevant labeled (source) dataset to enhance the CNN's discriminative capability on the target dataset. Furthermore, it exploits the temporally continuous nature of images within a camera jointly with the spatial similarity of feature maps across cameras to generate reliable pseudo-labels for training a re-identification model. As training progresses, the number of reliable samples keeps growing adaptively, which in turn boosts the representation ability of the CNN. Extensive experiments on three large-scale person re-id benchmark datasets compare our framework with state-of-the-art techniques and demonstrate the superiority of TCUL over existing methods.
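The abstract only sketches how the pseudo-labels are built, so the toy Python sketch below illustrates one plausible reading of the two cues it names: within-camera temporal continuity (detections that arrive in an unbroken burst are grouped into one tracklet) and across-camera feature similarity (tracklet centers are merged only when they are mutual nearest neighbours). The function names `group_tracklets` and `merge_across_cameras`, and the `gap` and `sim_thresh` parameters, are assumptions for illustration, not the authors' implementation.

```python
import numpy as np


def group_tracklets(frame_times, gap=2.0):
    """Assign a tracklet id to every detection from ONE camera.

    Temporal-continuity assumption: detections of the same person inside a
    camera arrive in an unbroken burst, so consecutive detections separated
    by more than `gap` seconds start a new tracklet (pseudo-identity).
    """
    order = np.argsort(frame_times)
    labels = np.empty(len(frame_times), dtype=int)
    tracklet, prev_t = 0, None
    for idx in order:
        t = frame_times[idx]
        if prev_t is not None and t - prev_t > gap:
            tracklet += 1          # temporal gap -> new pseudo-identity
        labels[idx] = tracklet
        prev_t = t
    return labels


def merge_across_cameras(centers_a, centers_b, sim_thresh=0.6):
    """Link tracklet feature centers of camera A to camera B.

    A pair is accepted only when the two centers are mutual nearest
    neighbours in cosine similarity and exceed `sim_thresh`, mirroring the
    idea of using feature-map similarity across cameras to form reliable
    cross-camera pseudo-labels.
    """
    a = centers_a / np.linalg.norm(centers_a, axis=1, keepdims=True)
    b = centers_b / np.linalg.norm(centers_b, axis=1, keepdims=True)
    sim = a @ b.T                                  # (n_a, n_b) cosine scores
    pairs = []
    for i in range(sim.shape[0]):
        j = int(sim[i].argmax())
        if int(sim[:, j].argmax()) == i and sim[i, j] >= sim_thresh:
            pairs.append((i, j))                   # reliable merged identity
    return pairs


if __name__ == "__main__":
    # Toy demo: six detection timestamps from one camera, then synthetic
    # 128-D tracklet centers for two cameras.
    times = np.array([0.1, 0.4, 0.9, 5.0, 5.3, 11.0])
    print(group_tracklets(times))                  # -> [0 0 0 1 1 2]

    rng = np.random.default_rng(0)
    cam_a = rng.normal(size=(3, 128))
    cam_b = np.vstack([cam_a[1] + 0.05 * rng.normal(size=128),
                       rng.normal(size=128)])
    print(merge_across_cameras(cam_a, cam_b))      # -> [(1, 0)]
```

In the full framework these merged identities would then serve as targets for center-based clustering while the source-pretrained CNN is fine-tuned; that training loop is omitted here.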
