Paper Title

Learning Kernel for Conditional Moment-Matching Discrepancy-based Image Classification

Authors

Chuan-Xian Ren, Pengfei Ge, Dao-Qing Dai, Hong Yan

Abstract

Conditional Maximum Mean Discrepancy (CMMD) can capture the discrepancy between conditional distributions by drawing support from nonlinear kernel functions, and it has thus been successfully used for pattern classification. However, CMMD does not work well on complex distributions, especially when the kernel function fails to correctly characterize the difference between intra-class similarity and inter-class similarity. In this paper, a new kernel learning method is proposed to improve the discrimination performance of CMMD. It operates iteratively on deep network features and is abbreviated as KLN. The CMMD loss and an auto-encoder (AE) are used to learn an injective function. By considering the compound kernel, i.e., the injective function composed with a characteristic kernel, the effectiveness of CMMD for data category description is enhanced. KLN can simultaneously learn a more expressive kernel and the label prediction distribution, so it can improve classification performance in both supervised and semi-supervised learning scenarios. In particular, the kernel-based similarities are learned iteratively on the deep network features, and the algorithm can be implemented in an end-to-end manner. Extensive experiments are conducted on four benchmark datasets, including MNIST, SVHN, CIFAR-10, and CIFAR-100. The results indicate that KLN achieves state-of-the-art classification performance.
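
As a rough illustration of the kernel moment-matching idea behind CMMD (not the paper's exact estimator, which involves conditional kernel embedding operators), the sketch below computes a class-conditional MMD between two sets of deep features using a Gaussian RBF kernel. The function names and the `sigma` bandwidth are illustrative assumptions, not from the paper.

```python
# Minimal sketch, assuming PyTorch: class-conditional MMD with a Gaussian RBF kernel
# on deep features. This is an illustrative stand-in for matching class-conditional
# distributions, not the CMMD estimator used in the paper.
import torch


def gaussian_kernel(x, y, sigma=1.0):
    """RBF kernel matrix k(x_i, y_j) = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    sq_dist = torch.cdist(x, y) ** 2
    return torch.exp(-sq_dist / (2.0 * sigma ** 2))


def mmd2(x, y, sigma=1.0):
    """Biased empirical estimate of the squared MMD between samples x and y."""
    k_xx = gaussian_kernel(x, x, sigma).mean()
    k_yy = gaussian_kernel(y, y, sigma).mean()
    k_xy = gaussian_kernel(x, y, sigma).mean()
    return k_xx + k_yy - 2.0 * k_xy


def class_conditional_mmd(feat_a, labels_a, feat_b, labels_b, num_classes, sigma=1.0):
    """Average per-class MMD between two feature sets; classes missing from
    either set are skipped."""
    total, matched = feat_a.new_zeros(()), 0
    for c in range(num_classes):
        xa, xb = feat_a[labels_a == c], feat_b[labels_b == c]
        if len(xa) > 0 and len(xb) > 0:
            total = total + mmd2(xa, xb, sigma)
            matched += 1
    return total / max(matched, 1)
```

A term of this form could, for instance, be added to a cross-entropy loss so that features of the same class from two batches (or two views) are pulled toward the same conditional distribution; the paper's KLN additionally learns the kernel itself via the AE-based injective mapping.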
