Paper Title


Structured Sparsification with Joint Optimization of Group Convolution and Channel Shuffle

Authors

Zhang, Xin-Yu, Zhao, Kai, Xiao, Taihong, Cheng, Ming-Ming, Yang, Ming-Hsuan

Abstract


Recent advances in convolutional neural networks (CNNs) usually come at the expense of excessive computational overhead and memory footprint. Network compression aims to alleviate this issue by training compact models with comparable performance. However, existing compression techniques either entail dedicated expert design or suffer a moderate performance drop. In this paper, we propose a novel structured sparsification method for efficient network compression. The proposed method automatically induces structured sparsity on the convolutional weights, thereby facilitating the implementation of the compressed model with highly optimized group convolution. We further address the problem of inter-group communication with a learnable channel shuffle mechanism. The proposed approach can be easily applied to compress many network architectures with a negligible performance drop. Extensive experimental results and analysis demonstrate that our approach achieves competitive performance against recent network compression counterparts with a sound accuracy-complexity trade-off.
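The abstract mentions restoring inter-group communication after group convolution via channel shuffle. The paper's shuffle is learnable, but the classic fixed variant (from ShuffleNet) conveys the idea: reshape the channel dimension to (groups, channels_per_group), transpose, and flatten, so each output group receives one channel from every input group. A minimal index-level sketch (the function name and list-based representation are illustrative, not from the paper):

```python
def channel_shuffle(channels, groups):
    """Fixed ShuffleNet-style channel shuffle on a flat list of
    channel indices: reshape to (groups, channels_per_group),
    transpose, and flatten."""
    n = len(channels) // groups  # channels per group
    # One sub-list of channel indices per group.
    rows = [channels[i * n:(i + 1) * n] for i in range(groups)]
    # Transpose: each output group mixes one channel from every input group.
    return [rows[g][i] for i in range(n) for g in range(groups)]

print(channel_shuffle(list(range(8)), 2))  # -> [0, 4, 1, 5, 2, 6, 3, 7]
```

In a CNN this permutation is applied to the channel axis of the feature map between two group convolutions; the paper replaces this fixed permutation with a learnable one optimized jointly with the group structure.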
