Paper Title

Pruning Early Exit Networks

Paper Authors

Alperen Görmez, Erdem Koyuncu

Paper Abstract

Deep learning models that perform well often have high computational costs. In this paper, we combine two approaches that try to reduce the computational cost while keeping the model performance high: pruning and early exit networks. We evaluate two approaches to pruning early exit networks: (1) pruning the entire network at once, (2) pruning the base network and additional linear classifiers in an ordered fashion. Experimental results show that pruning the entire network at once is a better strategy in general. However, at high accuracy rates, the two approaches have similar performance, which implies that the processes of pruning and early exit can be separated without loss of optimality.
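
The two pruning strategies described in the abstract can be sketched in code. The following is a minimal, hypothetical illustration rather than the authors' implementation: the small convolutional backbone with two exits, the 50% sparsity target, and the use of PyTorch's torch.nn.utils.prune with L1-magnitude pruning are all assumptions made for the sake of the example.

```python
# A minimal sketch of the two pruning strategies, assuming PyTorch's
# torch.nn.utils.prune API. The architecture below is a hypothetical
# stand-in for the paper's early exit networks.
import torch.nn as nn
import torch.nn.utils.prune as prune


class EarlyExitNet(nn.Module):
    """Small backbone split into two stages, with a linear classifier after each."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.exit1 = nn.Linear(16 * 32 * 32, num_classes)  # early exit
        self.stage2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.exit2 = nn.Linear(32 * 32 * 32, num_classes)  # final exit

    def forward(self, x):
        h1 = self.stage1(x)
        out1 = self.exit1(h1.flatten(1))
        h2 = self.stage2(h1)
        out2 = self.exit2(h2.flatten(1))
        return out1, out2


def prune_all_at_once(model: nn.Module, amount: float = 0.5) -> None:
    """Strategy (1): prune every conv/linear layer of the network in one pass."""
    params = [(m, "weight") for m in model.modules()
              if isinstance(m, (nn.Conv2d, nn.Linear))]
    prune.global_unstructured(params, pruning_method=prune.L1Unstructured,
                              amount=amount)


def prune_in_order(model: EarlyExitNet, amount: float = 0.5) -> None:
    """Strategy (2): prune the base network first, then the exit classifiers."""
    for m in [*model.stage1, *model.stage2]:          # base network layers
        if isinstance(m, nn.Conv2d):
            prune.l1_unstructured(m, name="weight", amount=amount)
    for head in [model.exit1, model.exit2]:           # then the linear exits
        prune.l1_unstructured(head, name="weight", amount=amount)


if __name__ == "__main__":
    model = EarlyExitNet()
    prune_all_at_once(model, amount=0.5)  # or: prune_in_order(model, 0.5)
    # In practice each pruning step would be followed by fine-tuning.
```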
