Paper Title

Privacy-Preserving Model Upgrades with Bidirectional Compatible Training in Image Retrieval

Authors

Shupeng Su, Binjie Zhang, Yixiao Ge, Xuyuan Xu, Yexin Wang, Chun Yuan, Ying Shan

Abstract

The task of privacy-preserving model upgrades in image retrieval aims to reap the benefits of rapidly evolving new models without accessing the raw gallery images. A pioneering work introduced backward-compatible training, where the new model can be deployed directly in a backfill-free manner, i.e., new queries can be compared directly against the old gallery features. Although this is a possible solution, its improvement in sequential model upgrades is gradually limited by the fixed and low-quality old gallery embeddings. To this end, we propose a new model upgrade paradigm, termed Bidirectional Compatible Training (BiCT), which upgrades the old gallery embeddings via forward-compatible training toward the embedding space of the backward-compatible new model. We conduct comprehensive experiments to verify the prominent improvement brought by BiCT and, interestingly, observe that the seemingly inconspicuous loss weight of backward compatibility actually plays an essential role in both backward and forward retrieval performance. In summary, we introduce a new and valuable problem, privacy-preserving model upgrades, together with a proper solution, BiCT. Several intriguing insights are further provided to get the most out of our method.
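
To make the paradigm described in the abstract more concrete, below is a minimal PyTorch-style sketch of how a BiCT-like objective could be assembled. This is not the authors' released code: the names (ForwardProjector, bict_style_loss, lambda_bc), the particular cross-entropy and cosine terms, and the assumption of equal old/new feature dimensions are illustrative choices. The idea it shows is the one in the abstract: a backward-compatibility term keeps new queries comparable to the old gallery, while a lightweight forward-compatible projector upgrades the stored old embeddings toward the new embedding space without touching raw images.

```python
# Minimal sketch of a BiCT-style objective (assumptions, not official code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ForwardProjector(nn.Module):
    """Hypothetical lightweight projector that maps stored old gallery
    embeddings into the new model's embedding space, so the gallery can
    be refreshed from saved features only (no raw images needed)."""

    def __init__(self, dim_old: int, dim_new: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_old, dim_new),
            nn.ReLU(inplace=True),
            nn.Linear(dim_new, dim_new),
        )

    def forward(self, z_old: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.net(z_old), dim=-1)


def bict_style_loss(z_new, z_old, labels, new_classifier, old_classifier,
                    projector, lambda_bc=0.1):
    """Illustrative combined objective.
    z_new: features from the new, trainable model, shape (B, D)
    z_old: frozen old-model features of the same images, shape (B, D)
    (Equal dimensions are assumed here for simplicity; otherwise a mapping
    head would be needed before the backward-compatibility term.)"""
    # (1) Standard supervised loss that trains the new embedding model.
    cls_loss = F.cross_entropy(new_classifier(z_new), labels)

    # (2) Backward-compatibility term: score new features with the frozen
    #     old classifier so new queries stay comparable to old gallery
    #     features. The abstract notes this weight (lambda_bc) matters a lot.
    bc_loss = F.cross_entropy(old_classifier(z_new), labels)

    # (3) Forward-compatibility term: pull projected old gallery embeddings
    #     toward the (detached) new embedding space, enabling a backfill-free
    #     gallery upgrade from stored features only.
    fc_loss = (1.0 - F.cosine_similarity(projector(z_old),
                                         z_new.detach(), dim=-1)).mean()

    return cls_loss + lambda_bc * bc_loss + fc_loss


if __name__ == "__main__":
    # Toy shapes only; real training would use an image backbone and dataset.
    B, D, num_classes = 8, 256, 10
    z_new = torch.randn(B, D, requires_grad=True)
    z_old = torch.randn(B, D)  # loaded from the stored old-model gallery
    labels = torch.randint(0, num_classes, (B,))
    new_classifier = nn.Linear(D, num_classes)
    old_classifier = nn.Linear(D, num_classes)
    for p in old_classifier.parameters():   # the old classifier stays frozen
        p.requires_grad_(False)
    projector = ForwardProjector(D, D)
    loss = bict_style_loss(z_new, z_old, labels, new_classifier,
                           old_classifier, projector)
    loss.backward()
```

In a real deployment, the old classifier and old gallery embeddings would be frozen artifacts from the previous model, and lambda_bc would be tuned, echoing the abstract's observation that this seemingly minor weight strongly affects both backward and forward retrieval.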
