Paper Title

Unified and Effective Ensemble Knowledge Distillation

Paper Authors

Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

Paper Abstract

Ensemble knowledge distillation can extract knowledge from multiple teacher models and encode it into a single student model. Many existing methods learn and distill the student model on labeled data only. However, the teacher models are usually learned on the same labeled data, and their predictions have high correlations with groundtruth labels. Thus, they cannot provide sufficient knowledge complementary to task labels for student teaching. Distilling on unseen unlabeled data has the potential to enhance the knowledge transfer from the teachers to the student. In this paper, we propose a unified and effective ensemble knowledge distillation method that distills a single student model from an ensemble of teacher models on both labeled and unlabeled data. Since different teachers may have diverse prediction correctness on the same sample, on labeled data we weight the predictions of different teachers according to their correctness. In addition, we weight the distillation loss based on the overall prediction correctness of the teacher ensemble to distill high-quality knowledge. On unlabeled data, there is no groundtruth to evaluate prediction correctness. Fortunately, the disagreement among teachers is an indication of sample hardness, so we weight the distillation loss based on teachers' disagreement to emphasize knowledge distillation on important samples. Extensive experiments on four datasets show the effectiveness of our proposed ensemble distillation method.
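
To make the weighting scheme described in the abstract concrete, below is a minimal PyTorch-style sketch of one possible reading of it. It is not the paper's implementation: the specific choices (the probability a teacher assigns to the true label as per-teacher correctness, a softmax over correctness as teacher weights, mean correctness and cross-teacher prediction variance as per-sample weights, and the function names labeled_distill_loss / unlabeled_distill_loss) are illustrative assumptions only.

import torch
import torch.nn.functional as F


def labeled_distill_loss(student_logits, teacher_logits, labels, tau=2.0):
    # teacher_logits: [n_teachers, batch, n_classes]; labels: [batch] (long)
    teacher_probs = F.softmax(teacher_logits / tau, dim=-1)
    # Per-teacher "correctness": probability each teacher assigns to the true label.
    idx = labels.view(1, -1, 1).expand(teacher_probs.size(0), -1, 1)
    correctness = teacher_probs.gather(-1, idx).squeeze(-1)        # [n_teachers, batch]
    # Assumption: weight teachers by their correctness when building the soft target.
    teacher_w = F.softmax(correctness, dim=0).unsqueeze(-1)        # [n_teachers, batch, 1]
    soft_target = (teacher_w * teacher_probs).sum(dim=0)           # [batch, n_classes]
    # Assumption: weight each sample's loss by the ensemble's average correctness.
    sample_w = correctness.mean(dim=0)                             # [batch]
    kl = F.kl_div(F.log_softmax(student_logits / tau, dim=-1),
                  soft_target, reduction='none').sum(-1)
    return (sample_w * kl).mean() * tau ** 2


def unlabeled_distill_loss(student_logits, teacher_logits, tau=2.0):
    # No labels available, so teacher disagreement serves as the sample weight.
    teacher_probs = F.softmax(teacher_logits / tau, dim=-1)
    soft_target = teacher_probs.mean(dim=0)
    # Assumption: variance across teachers as a proxy for disagreement/hardness.
    disagreement = teacher_probs.var(dim=0).sum(-1)                # [batch]
    sample_w = disagreement / (disagreement.mean() + 1e-8)
    kl = F.kl_div(F.log_softmax(student_logits / tau, dim=-1),
                  soft_target, reduction='none').sum(-1)
    return (sample_w * kl).mean() * tau ** 2

In a typical distillation setup the student would be trained on a combination of the standard task loss on labeled data plus these two weighted distillation terms; the exact combination and weighting functions used in the paper are not specified in the abstract.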
