Paper Title
Power Normalizations in Fine-grained Image, Few-shot Image and Graph Classification
Paper Authors
Paper Abstract
Power Normalizations (PN) are useful non-linear operators which tackle feature imbalances in classification problems. We study PNs in the deep learning setting via a novel PN layer that pools feature maps. Our layer combines the feature vectors and their respective spatial locations from the feature maps produced by the last convolutional layer of a CNN into a positive definite matrix of second-order statistics, to which PN operators are applied, forming so-called Second-order Pooling (SOP). As the main goal of this paper is to study Power Normalizations, we investigate the role and meaning of MaxExp and Gamma, two popular PN functions. To this end, we provide probabilistic interpretations of these element-wise operators and discover surrogates with well-behaved derivatives for end-to-end training. Furthermore, we examine the spectral applicability of MaxExp and Gamma by studying Spectral Power Normalizations (SPN). We show that SPN on the autocorrelation/covariance matrix and the Heat Diffusion Process (HDP) on a graph Laplacian matrix are closely related, and thus share their properties. This finding leads to the culmination of our work: a fast spectral MaxExp, a variant of HDP for covariance/autocorrelation matrices. We evaluate our ideas on fine-grained recognition, scene recognition, and material classification, as well as on few-shot learning and graph classification.
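Below is a minimal sketch (not the authors' reference implementation) of the pipeline the abstract describes: a CNN feature map is pooled into a second-order matrix, to which the element-wise Gamma and MaxExp Power Normalizations, p -> sign(p)|p|^gamma and p -> 1 - (1 - p)^eta, are applied. The map shape, the gamma and eta values, and the rescaling of the matrix into [0, 1] before MaxExp are illustrative assumptions.

```python
import numpy as np

def second_order_pooling(feature_map):
    """Pool a CNN feature map (H, W, D) into a D x D autocorrelation matrix."""
    h, w, d = feature_map.shape
    x = feature_map.reshape(h * w, d)   # one D-dim feature vector per spatial location
    return x.T @ x / (h * w)            # second-order statistic (positive semi-definite)

def gamma_pn(m, gamma=0.5):
    """Gamma PN: element-wise signed power, p -> sign(p) * |p|^gamma."""
    return np.sign(m) * np.abs(m) ** gamma

def maxexp_pn(m, eta=10.0):
    """MaxExp PN: element-wise p -> 1 - (1 - p)^eta for values in [0, 1]."""
    p = np.clip(m, 0.0, 1.0)
    return 1.0 - (1.0 - p) ** eta

# Usage: a random non-negative map stands in for the last conv layer's output.
fmap = np.random.rand(7, 7, 64)
sop = second_order_pooling(fmap)
pooled = maxexp_pn(sop / sop.max())     # rescale entries into [0, 1] first
print(pooled.shape)                      # (64, 64)
```

MaxExp is usually read probabilistically as the probability of at least one co-occurrence of two features in eta trials, which is why it expects inputs in [0, 1]; Gamma acts directly on the raw statistics.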
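The spectral variant applies the same function to eigenvalues rather than entries. A hedged sketch, assuming a trace normalization that brings the eigenvalues into [0, 1] and an integer eta: the fast spectral MaxExp evaluates the matrix function I - (I - M/tr(M))^eta with plain matrix products, avoiding the eigendecomposition entirely.

```python
import numpy as np

def spectral_maxexp(M, eta=10):
    """Spectral MaxExp via eigendecomposition: U diag(1 - (1 - s)^eta) U^T."""
    M = M / (np.trace(M) + 1e-12)                 # eigenvalues now lie in [0, 1]
    s, U = np.linalg.eigh(M)
    s = 1.0 - (1.0 - np.clip(s, 0.0, 1.0)) ** eta
    return (U * s) @ U.T

def fast_spectral_maxexp(M, eta=10):
    """Eigendecomposition-free route: I - (I - M/tr(M))^eta via matrix powers."""
    d = M.shape[0]
    A = np.eye(d) - M / (np.trace(M) + 1e-12)
    return np.eye(d) - np.linalg.matrix_power(A, eta)

# For integer eta the two routes agree up to numerical error.
X = np.random.randn(50, 8)
C = X.T @ X / 50                                  # covariance-like PSD matrix
print(np.allclose(spectral_maxexp(C), fast_spectral_maxexp(C)))
```

For integer eta, np.linalg.matrix_power needs only O(log eta) matrix multiplications, which is the source of the speed-up over an SVD or eigendecomposition.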