Paper Title

About contrastive unsupervised representation learning for classification and its convergence

Authors

Ibrahim Merad, Yiyang Yu, Emmanuel Bacry, Stéphane Gaïffas

Abstract

Contrastive representation learning has recently been shown to be very effective for self-supervised training. These methods have been used successfully to train encoders that perform comparably to supervised training on downstream classification tasks. A few works have begun to build a theoretical framework around contrastive learning in which guarantees on its performance can be proven. We extend these results to training with multiple negative samples and to multiway classification. Furthermore, we provide convergence guarantees for the minimization of the contrastive training error by gradient descent on an overparametrized deep neural encoder, and present numerical experiments that complement our theoretical findings.
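To make the "multiple negative samples" setting concrete, here is a minimal sketch of an InfoNCE-style contrastive loss for one anchor, its positive, and several negatives. This is an illustration only, not the paper's exact objective: the dot-product similarity, the unit-normalized representations, and the variable names are assumptions.

```python
import numpy as np

def contrastive_loss(anchor, positive, negatives):
    """InfoNCE-style loss for one anchor with multiple negatives:
    -log softmax score of the positive among {positive} + negatives,
    using dot-product similarity between encoded representations."""
    pos_score = anchor @ positive          # similarity to the positive sample
    neg_scores = negatives @ anchor        # similarity to each negative sample
    logits = np.concatenate([[pos_score], neg_scores])
    # log-sum-exp trick for numerical stability
    m = logits.max()
    log_denom = m + np.log(np.exp(logits - m).sum())
    return -(pos_score - log_denom)

# Toy example with unit-normalized representations (hypothetical encoder output)
rng = np.random.default_rng(0)
z = rng.normal(size=8); z /= np.linalg.norm(z)                     # f(x)
zp = z + 0.1 * rng.normal(size=8); zp /= np.linalg.norm(zp)        # f(x+)
zn = rng.normal(size=(4, 8))
zn /= np.linalg.norm(zn, axis=1, keepdims=True)                    # f(x_i^-)
loss = contrastive_loss(z, zp, zn)
print(float(loss))
```

Minimizing this quantity over a batch pushes the anchor's representation toward the positive and away from the negatives; the paper's guarantees concern training an encoder under objectives of this general form.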
