Paper Title
CPOT: Channel Pruning via Optimal Transport
Paper Authors
Paper Abstract
Recent advances in deep neural networks (DNNs) have led to tremendous growth in network parameters, making the deployment of DNNs on platforms with limited resources extremely difficult. Therefore, various pruning methods have been developed to compress deep network architectures and accelerate the inference process. Most existing channel pruning methods discard the less important filters according to well-designed filter ranking criteria. However, due to the limited interpretability of deep learning models, it is difficult to design an appropriate ranking criterion that distinguishes redundant filters. To address this challenging issue, we propose a new technique, Channel Pruning via Optimal Transport, dubbed CPOT. Specifically, we locate the Wasserstein barycenter for the channels of each layer in the deep model, i.e., the mean of a set of probability distributions under the optimal transport metric. We then prune the redundant information located by the Wasserstein barycenters. Finally, we empirically demonstrate that, for classification tasks, CPOT outperforms state-of-the-art methods on pruning ResNet-20, ResNet-32, ResNet-56, and ResNet-110. Furthermore, we show that the proposed CPOT technique compresses StarGAN models effectively in the more difficult setting of image-to-image translation tasks.
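For reference, the Wasserstein barycenter of probability distributions $\mu_1, \dots, \mu_N$ with weights $\lambda_i \ge 0$, $\sum_{i=1}^{N} \lambda_i = 1$, is the minimizer $\bar{\mu} = \arg\min_{\mu} \sum_{i=1}^{N} \lambda_i W_2^2(\mu, \mu_i)$, where $W_2$ denotes the 2-Wasserstein (optimal transport) distance. The sketch below illustrates this idea on a single convolutional layer: each filter's weights are binned into a histogram, an entropic-regularized barycenter is computed with the POT library, and filters are ranked by their distance to it. The histogram representation, the use of POT's `ot.bregman.barycenter`, and the closest-to-barycenter ranking are illustrative assumptions, not details of the authors' implementation.

```python
# Minimal sketch (not the paper's implementation): compute a Wasserstein
# barycenter over per-filter weight histograms of one conv layer with POT,
# then rank filters by their distance to that barycenter. The binning and
# the "closest-to-barycenter = most redundant" rule are assumptions.
import numpy as np
import torch
import ot  # POT: Python Optimal Transport


def rank_filters_by_barycenter(conv_weight: torch.Tensor, n_bins: int = 64, reg: float = 1e-2):
    w = conv_weight.detach().cpu().numpy()           # (out_ch, in_ch, k, k)
    flat = w.reshape(w.shape[0], -1)                 # one weight vector per filter

    # Shared bin grid so all filters become histograms over the same support.
    edges = np.linspace(flat.min(), flat.max(), n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])

    hists = np.stack([np.histogram(f, bins=edges)[0] for f in flat]).astype(np.float64)
    hists += 1e-12                                   # avoid empty bins
    hists /= hists.sum(axis=1, keepdims=True)        # normalize to probability distributions

    # Ground cost between bins and entropic-regularized Wasserstein barycenter.
    M = ot.dist(centers.reshape(-1, 1), centers.reshape(-1, 1))  # squared Euclidean cost
    M /= M.max()
    bary = ot.bregman.barycenter(hists.T, M, reg)    # POT expects histograms as columns

    # Distance of each filter's histogram to the barycenter (smaller = more "average").
    dists = np.array([ot.sinkhorn2(h, bary, M, reg) for h in hists])
    return np.argsort(dists)                         # candidate pruning order under the assumption above
```

As a usage example, `rank_filters_by_barycenter(layer.weight)` on a `torch.nn.Conv2d` layer returns filter indices ordered by proximity to the layer's barycenter; how many of them to prune, and whether proximity indeed signals redundancy, would follow the paper's actual criterion rather than this sketch.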