Paper Title

Finet: Using Fine-grained Batch Normalization to Train Light-weight Neural Networks

Paper Authors

Chunjie Luo, Jianfeng Zhan, Lei Wang, Wanling Gao

Paper Abstract

To build light-weight networks, we propose a new normalization, Fine-grained Batch Normalization (FBN). Unlike Batch Normalization (BN), which normalizes the final summation of the weighted inputs, FBN normalizes the intermediate state of the summation. We propose a novel light-weight network based on FBN, called Finet. At training time, a convolutional layer with FBN can be seen as an inverted bottleneck mechanism. At inference time, FBN can be fused into the convolution; after fusion, Finet uses standard convolutions with equal channel width, making inference more efficient. On the ImageNet classification dataset, Finet achieves state-of-the-art performance (65.706% accuracy with 43M FLOPs, and 73.786% accuracy with 303M FLOPs). Moreover, experiments show that Finet is more efficient than other state-of-the-art light-weight networks.
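The mechanism described in the abstract can be made concrete with a short sketch. Below is a minimal PyTorch illustration of one plausible reading of FBN: the input channels are split into several groups, each group's partial convolution output (an intermediate state of the channel-wise summation) is batch-normalized separately, and the normalized partial sums are added; a companion function folds the per-group BN statistics into the weights and merges the groups into one standard convolution for inference. The names (`FBNConv2d`, `fuse`), the grouping scheme, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class FBNConv2d(nn.Module):
    """Sketch of a convolution with Fine-grained Batch Normalization.

    Input channels are split into `groups` chunks. Each chunk's partial
    convolution output (an intermediate state of the channel summation)
    is batch-normalized separately, and the normalized partial sums are
    added. The parallel branches at training time resemble an expanded,
    inverted-bottleneck-like representation.
    """

    def __init__(self, in_channels, out_channels, kernel_size,
                 stride=1, padding=0, groups=4):
        super().__init__()
        assert in_channels % groups == 0
        self.groups = groups
        self.convs = nn.ModuleList(
            nn.Conv2d(in_channels // groups, out_channels, kernel_size,
                      stride=stride, padding=padding, bias=False)
            for _ in range(groups))
        self.bns = nn.ModuleList(
            nn.BatchNorm2d(out_channels) for _ in range(groups))

    def forward(self, x):
        # Normalize each partial sum before adding, instead of
        # normalizing the final sum as ordinary BN would.
        chunks = torch.chunk(x, self.groups, dim=1)
        return sum(bn(conv(c))
                   for conv, bn, c in zip(self.convs, self.bns, chunks))


@torch.no_grad()
def fuse(fbn: FBNConv2d) -> nn.Conv2d:
    """Fold each branch's BN statistics into its weights and merge the
    branches into one standard equal-width convolution for inference."""
    ref = fbn.convs[0]
    weights, bias = [], None
    for conv, bn in zip(fbn.convs, fbn.bns):
        # In eval mode, bn(y) = scale * y + (beta - mu * scale).
        scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)
        weights.append(conv.weight * scale.view(-1, 1, 1, 1))
        b = bn.bias - bn.running_mean * scale
        bias = b if bias is None else bias + b
    fused = nn.Conv2d(ref.in_channels * fbn.groups, ref.out_channels,
                      ref.kernel_size, stride=ref.stride,
                      padding=ref.padding, bias=True)
    # Summing partial convolutions over channel chunks equals a single
    # convolution whose weight is the concatenation along dim=1.
    fused.weight.copy_(torch.cat(weights, dim=1))
    fused.bias.copy_(bias)
    return fused


# Quick equivalence check in eval mode (BN uses running statistics):
m = FBNConv2d(32, 64, 3, padding=1).eval()
x = torch.randn(1, 32, 56, 56)
torch.testing.assert_close(m(x), fuse(m)(x), rtol=1e-4, atol=1e-4)
```

Because convolution is linear in its input channels, the sum of per-group partial convolutions equals one convolution over all channels, which is why the fused network runs as a plain standard convolution with no extra normalization cost at inference.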
