Paper Title

Error-Corrected Margin-Based Deep Cross-Modal Hashing for Facial Image Retrieval

Authors

Fariborz Taherkhani, Veeru Talreja, Matthew C. Valenti, Nasser M. Nasrabadi

Abstract

Cross-modal hashing facilitates mapping of heterogeneous multimedia data into a common Hamming space, which can be utilized for fast and flexible retrieval across different modalities. In this paper, we propose a novel cross-modal hashing architecture, deep neural decoder cross-modal hashing (DNDCMH), which uses a binary vector specifying the presence of certain facial attributes as an input query to retrieve relevant face images from a database. The DNDCMH network consists of two separate components: an attribute-based deep cross-modal hashing (ADCMH) module, which uses a margin (m)-based loss function to efficiently learn compact binary codes that preserve similarity between modalities in the Hamming space, and a neural error-correcting decoder (NECD), which is an error-correcting decoder implemented with a neural network. The goal of the NECD network in DNDCMH is to error-correct the hash codes generated by ADCMH in order to improve retrieval efficiency. The NECD network is trained such that its error-correcting capability is greater than or equal to the margin (m) of the margin-based loss function. As a result, NECD can correct corrupted hash codes generated by ADCMH up to a Hamming distance of m. We have evaluated and compared DNDCMH with state-of-the-art cross-modal hashing methods on standard datasets to demonstrate the superiority of our method.
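The key intuition above is that a decoder whose error-correcting capability t is at least the loss margin m can snap a noisy hash code back onto a valid codeword whenever the code lies within Hamming distance m of it. The following toy sketch (not the paper's neural decoder; the codebook and decoding routine are illustrative assumptions) shows this with brute-force minimum-distance decoding:

```python
# Illustrative sketch only: a minimum-distance decoder over a tiny codebook,
# standing in for the idea that NECD corrects codes within Hamming distance m.

def hamming(a, b):
    """Number of bit positions in which two codes differ."""
    return sum(x != y for x, y in zip(a, b))

def nearest_codeword(code, codebook):
    """Decode by snapping to the closest codeword (minimum Hamming distance)."""
    return min(codebook, key=lambda c: hamming(code, c))

# Toy codebook: two 7-bit codewords at Hamming distance 7, so any pattern of
# up to t = (7 - 1) // 2 = 3 bit errors is corrected by this decoder.
codebook = [
    (0, 0, 0, 0, 0, 0, 0),
    (1, 1, 1, 1, 1, 1, 1),
]

corrupted = (1, 0, 1, 1, 0, 1, 1)  # 2 bit flips from the all-ones codeword
assert hamming(corrupted, codebook[1]) <= 3
print(nearest_codeword(corrupted, codebook))  # -> (1, 1, 1, 1, 1, 1, 1)
```

In the paper's setting, training NECD so that t >= m means any ADCMH output within the margin of its target codeword is decodable, which is what improves retrieval efficiency.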
