Paper Title

Expert Training: Task Hardness Aware Meta-Learning for Few-Shot Classification

Authors

Yucan Zhou, Yu Wang, Jianfei Cai, Yu Zhou, Qinghua Hu, Weiping Wang

Abstract

Deep neural networks are highly effective when a large number of labeled samples are available, but they fail on few-shot classification tasks. Recently, meta-learning methods have received much attention; they train a meta-learner on a large set of auxiliary tasks to gain knowledge that guides few-shot classification. Usually, the training tasks are randomly sampled and treated indiscriminately, which often traps the meta-learner in a bad local optimum. Work on the optimization of deep neural networks has shown that a better arrangement of training data can make a classifier converge faster and perform better. Inspired by this idea, we propose an easy-to-hard expert meta-training strategy that arranges the training tasks properly: easy tasks are preferred in the first phase, and hard tasks are emphasized in the second phase. A task-hardness-aware module is designed and integrated into the training procedure to estimate the hardness of a task based on the distinguishability of its categories. In addition, we explore multiple hardness measurements, including the semantic relation, the pairwise Euclidean distance, the Hausdorff distance, and the Hilbert-Schmidt independence criterion. Experimental results on the miniImageNet and tieredImageNetSketch datasets show that meta-learners obtain better results with our expert training strategy.
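The abstract's pairwise-Euclidean-distance hardness measure could be sketched as follows. This is only an illustrative interpretation, not the paper's actual implementation: it assumes each class in a task is represented by per-sample feature vectors, scores hardness as the inverse of the mean pairwise distance between class prototypes (closer prototypes mean less distinguishable categories, hence a harder task), and the function name and the feature representation are our own assumptions.

```python
import numpy as np

def task_hardness_euclidean(class_features):
    """Illustrative sketch (not the paper's code): estimate task hardness
    from the pairwise Euclidean distances between class prototypes.

    class_features: list of arrays, one per class, shape (n_samples, dim).
    Returns a scalar; larger means harder (classes less distinguishable).
    """
    # Prototype of each class = mean of its feature vectors.
    prototypes = np.stack([f.mean(axis=0) for f in class_features])
    n = len(prototypes)
    # All pairwise prototype distances within the task.
    dists = [np.linalg.norm(prototypes[i] - prototypes[j])
             for i in range(n) for j in range(i + 1, n)]
    # Inverse mean distance: well-separated classes -> low hardness.
    return 1.0 / (np.mean(dists) + 1e-8)
```

Under the easy-to-hard strategy described above, sampled tasks could then be sorted by this score so that low-hardness tasks dominate the first training phase and high-hardness tasks the second.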
