Paper Title

Feature Statistics Guided Efficient Filter Pruning

Authors

Hang Li, Chen Ma, Wei Xu, Xue Liu

Abstract

Building compact convolutional neural networks (CNNs) with reliable performance is a critical but challenging task, especially when deploying them in real-world applications. As a common approach to reducing the size of CNNs, pruning methods delete part of the CNN filters according to some metric such as the $\ell_1$-norm. However, previous methods hardly leverage the information variance within a single feature map or the similarity characteristics among feature maps. In this paper, we propose a novel filter pruning method that incorporates two kinds of feature map selection: diversity-aware selection (DFS) and similarity-aware selection (SFS). DFS aims to discover features with low information diversity, while SFS removes features that have high similarities with others. We conduct extensive empirical experiments with various CNN architectures on publicly available datasets. The experimental results demonstrate that our model obtains up to a 91.6% parameter decrease and an 83.7% FLOPs reduction with almost no accuracy loss.
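To make the two selection ideas in the abstract concrete, the sketch below scores a layer's output channels in the two ways the abstract describes: by how little spatial variation a feature map carries (a stand-in for "low information diversity") and by how similar it is to other feature maps. The specific metrics here (per-channel variance, pairwise cosine similarity) and the combination rule are assumptions made for illustration only; the paper defines its own feature-statistics criteria.

```python
# Minimal illustrative sketch, NOT the paper's exact algorithm.
# Assumed proxies: spatial variance for "information diversity",
# cosine similarity between flattened maps for "similarity".
import torch
import torch.nn.functional as F

def diversity_scores(feature_maps: torch.Tensor) -> torch.Tensor:
    """Score each channel by the spatial variance of its feature map,
    averaged over the batch (low score ~ low information diversity)."""
    n, c, h, w = feature_maps.shape            # (N, C, H, W) activations
    flat = feature_maps.reshape(n, c, h * w)
    return flat.var(dim=2).mean(dim=0)         # (C,)

def similarity_scores(feature_maps: torch.Tensor) -> torch.Tensor:
    """Score each channel by its highest cosine similarity to any other
    channel (high score ~ redundant, a pruning candidate)."""
    n, c, h, w = feature_maps.shape
    flat = feature_maps.reshape(n, c, h * w).mean(dim=0)   # (C, H*W)
    flat = F.normalize(flat, dim=1)
    sim = flat @ flat.t()                                  # (C, C)
    sim.fill_diagonal_(-1.0)                               # ignore self-similarity
    return sim.max(dim=1).values                           # (C,)

def select_filters_to_prune(feature_maps: torch.Tensor, num_prune: int) -> torch.Tensor:
    """Toy combination rule (hypothetical): prune channels that are either
    least diverse or most similar to another channel."""
    low_diversity = diversity_scores(feature_maps).argsort()[:num_prune]
    high_similarity = similarity_scores(feature_maps).argsort(descending=True)[:num_prune]
    return torch.unique(torch.cat([low_diversity, high_similarity]))

if __name__ == "__main__":
    # Random activations stand in for one conv layer's output: batch 8, 64 channels.
    acts = torch.randn(8, 64, 16, 16)
    print(select_filters_to_prune(acts, num_prune=6))
```

In practice, the selected channel indices would be used to remove the corresponding filters from the layer (and the matching input channels of the next layer) before fine-tuning; how the paper weighs and combines the two criteria is not specified in the abstract.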
