Paper Title


Learning Knowledge Representation with Meta Knowledge Distillation for Single Image Super-Resolution

Authors

Han Zhu, Zhenzhong Chen, Shan Liu

Abstract


Knowledge distillation (KD), which can efficiently transfer knowledge from a cumbersome network (the teacher) to a compact network (the student), has demonstrated its advantages in several computer vision applications. The representation of knowledge is vital for knowledge transfer and student learning, yet it is generally defined in hand-crafted manners or taken directly from intermediate features. In this paper, we propose a model-agnostic meta knowledge distillation method under the teacher-student architecture for the single image super-resolution task. It provides a more flexible and accurate way to help the teacher transmit knowledge in accordance with the abilities of the student via knowledge representation networks (KRNets) with learnable parameters. To improve the ability of the knowledge representation to perceive the student's requirements, we propose to solve the transformation from intermediate outputs to transferred knowledge by employing the student features and the teacher-student correlation in the KRNets. Specifically, texture-aware dynamic kernels are generated to extract the texture features to be improved and the corresponding teacher guidance, decomposing the distillation problem into texture-wise supervision that further promotes the recovery of high-frequency details. In addition, the KRNets are optimized in a meta-learning manner to ensure that knowledge transfer and student learning benefit the reconstruction quality of the student. Experiments on various single image super-resolution datasets demonstrate that the proposed method outperforms existing distillation methods with hand-defined knowledge representations, and helps super-resolution algorithms achieve better reconstruction quality without introducing any inference complexity.
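To make the core idea concrete, the sketch below illustrates a student-aware feature distillation loss in the spirit of the abstract: instead of matching raw teacher features, the transferred knowledge is produced by a small transform conditioned on the teacher-student correlation. This is a minimal NumPy illustration, not the authors' KRNet; the shapes, the 1x1 projection, and the sigmoid gating are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed intermediate feature maps of shape (batch, channels, height, width).
B, C, H, W = 2, 8, 16, 16
teacher_feat = rng.standard_normal((B, C, H, W))
student_feat = rng.standard_normal((B, C, H, W))

# Stand-in for a learnable knowledge representation transform:
# a 1x1 channel projection (hypothetical parameter, not from the paper).
W_proj = rng.standard_normal((C, C)) * 0.1

def transfer_knowledge(t, s, w):
    """Map teacher features to transferred knowledge, conditioned on the student.

    The per-channel teacher-student correlation gates the projected teacher
    features, so the 'knowledge' adapts to the student's current state.
    """
    corr = np.einsum('bchw,bchw->bc', t, s) / (H * W)  # (B, C) correlation
    gate = 1.0 / (1.0 + np.exp(-corr))                 # sigmoid gate in (0, 1)
    mixed = np.einsum('cd,bdhw->bchw', w, t)           # 1x1 channel projection
    return mixed * gate[:, :, None, None]              # student-aware knowledge

def distill_loss(t, s, w):
    """L2 distillation loss between student features and transferred knowledge."""
    k = transfer_knowledge(t, s, w)
    return float(np.mean((s - k) ** 2))

loss = distill_loss(teacher_feat, student_feat, W_proj)
```

In the paper, the analogous transform is a KRNet with learnable parameters optimized in a meta-learning loop (so that minimizing this loss actually improves the student's reconstruction), rather than the fixed random projection used here for illustration.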
