Paper Title
Synergistic Self-supervised and Quantization Learning
Paper Authors
Paper Abstract
With the success of self-supervised learning (SSL), it has become a mainstream paradigm to fine-tune from self-supervised pretrained models to boost the performance on downstream tasks. However, we find that current SSL models suffer severe accuracy drops when performing low-bit quantization, prohibiting their deployment in resource-constrained applications. In this paper, we propose a method called synergistic self-supervised and quantization learning (SSQL) to pretrain quantization-friendly self-supervised models, facilitating downstream deployment. SSQL contrasts the features of the quantized and full-precision models in a self-supervised fashion, where the bit-width for the quantized model is randomly selected in each step. SSQL not only significantly improves the accuracy when quantized to lower bit-widths, but also boosts the accuracy of full-precision models in most cases. By training only once, SSQL can then benefit various downstream tasks at different bit-widths simultaneously. Moreover, the bit-width flexibility is achieved without additional storage overhead, requiring only one copy of weights during training and inference. We theoretically analyze the optimization process of SSQL, and conduct exhaustive experiments on various benchmarks to further demonstrate the effectiveness of our method. Our code is available at https://github.com/megvii-research/SSQL-ECCV2022.
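The abstract describes the core training step: at each iteration a bit-width is sampled at random, the backbone is fake-quantized at that bit-width, and the features of the quantized and full-precision branches are contrasted with a self-supervised objective. Below is a minimal sketch of such a step; it is not the authors' released implementation. The helper names (`fake_quantize_weights`, `neg_cosine`, `ssql_step`), the simple uniform weight quantizer, and the SimSiam-style negative cosine loss are all illustrative assumptions made for clarity.

```python
# Minimal sketch of an SSQL-style training step (illustrative, not the released code).
# Assumes: an nn.Module encoder producing feature vectors, a predictor head,
# a naive uniform weight fake-quantizer, and a SimSiam-style contrastive loss.
import copy
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


def fake_quantize_weights(model: nn.Module, bits: int) -> nn.Module:
    """Return a copy of `model` with weights uniformly fake-quantized to `bits` bits."""
    q_model = copy.deepcopy(model)
    with torch.no_grad():
        for p in q_model.parameters():
            max_val = p.abs().max()
            if max_val == 0:
                continue
            scale = max_val / (2 ** (bits - 1) - 1)
            q = torch.round(p / scale).clamp(-(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
            p.copy_(q * scale)
    return q_model


def neg_cosine(p: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
    """Negative cosine similarity with gradient stopped on the target z (SimSiam-style)."""
    return -F.cosine_similarity(p, z.detach(), dim=-1).mean()


def ssql_step(encoder, predictor, x1, x2, bit_choices=(2, 3, 4, 8)):
    """One illustrative SSQL step: contrast full-precision and quantized features.

    x1, x2 are two augmented views of the same batch of images;
    the bit-width of the quantized branch is re-sampled every step.
    """
    bits = random.choice(bit_choices)
    q_encoder = fake_quantize_weights(encoder, bits)

    z1_fp, z2_fp = encoder(x1), encoder(x2)    # full-precision features
    z1_q, z2_q = q_encoder(x1), q_encoder(x2)  # quantized-branch features (targets)

    p1, p2 = predictor(z1_fp), predictor(z2_fp)
    # Cross-view, cross-precision contrast: predictions from the full-precision
    # branch are pulled toward features of the quantized branch.
    loss = 0.5 * (neg_cosine(p1, z2_q) + neg_cosine(p2, z1_q))
    return loss
```

In this sketch the quantized branch is rebuilt from the current full-precision weights each step and used only as a target, so one copy of weights suffices, mirroring the storage claim in the abstract; the exact loss form and quantizer in the paper may differ.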