Title
DBQ: A Differentiable Branch Quantizer for Lightweight Deep Neural Networks
Authors
Abstract
Deep neural networks have achieved state-of-the-art performance on various computer vision tasks. However, their deployment on resource-constrained devices has been hindered by their high computational and storage complexity. While various complexity-reduction techniques, such as lightweight network architecture design and parameter quantization, have been successful in reducing the cost of implementing these networks, the two approaches have often been considered orthogonal. In reality, existing quantization techniques fail to replicate their success on lightweight architectures such as MobileNet. To this end, we present a novel, fully differentiable non-uniform quantizer that can be seamlessly mapped onto efficient ternary-based dot product engines. We conduct comprehensive experiments on the CIFAR-10, ImageNet, and Visual Wake Words datasets. The proposed quantizer (DBQ) successfully tackles the daunting task of aggressively quantizing lightweight networks such as MobileNetV1, MobileNetV2, and ShuffleNetV2. DBQ achieves state-of-the-art results with minimal training overhead and provides the best (Pareto-optimal) accuracy-complexity trade-off.
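To make the "ternary branch" idea concrete: a non-uniform quantizer of this kind approximates a weight tensor as a sum of scaled ternary tensors, w ≈ Σᵢ αᵢ·tᵢ with tᵢ ∈ {-1, 0, +1}, so each dot product decomposes into cheap ternary dot products. The sketch below is a minimal illustration using greedy residual ternarization with a TWN-style threshold heuristic (the 0.7·mean|w| factor); it is an assumption for illustration only and is not DBQ's actual end-to-end differentiable training procedure described in the paper.

```python
import numpy as np

def ternarize(w, delta):
    """Map each entry of w to {-1, 0, +1} using threshold delta."""
    t = np.zeros_like(w)
    t[w > delta] = 1.0
    t[w < -delta] = -1.0
    return t

def branch_quantize(w, num_branches=2):
    """Greedily approximate w as a sum of scaled ternary branches:
    w ~= sum_i alpha_i * t_i, with each t_i in {-1, 0, +1}^n.
    (Illustrative heuristic, not the paper's learned quantizer.)"""
    residual = w.astype(np.float64).copy()
    branches, scales = [], []
    for _ in range(num_branches):
        # TWN-style threshold heuristic (assumed, for illustration)
        delta = 0.7 * np.mean(np.abs(residual))
        t = ternarize(residual, delta)
        mask = t != 0
        # least-squares optimal scale for the chosen support
        alpha = np.abs(residual[mask]).mean() if mask.any() else 0.0
        branches.append(t)
        scales.append(alpha)
        residual -= alpha * t  # quantize the remaining residual next
    return scales, branches

# Usage: quantize a toy weight vector; adding a second branch
# shrinks the reconstruction error relative to a single branch.
w = np.array([0.9, -0.45, 0.05, -1.2, 0.3])
scales, branches = branch_quantize(w, num_branches=2)
w_hat = sum(a * t for a, t in zip(scales, branches))
```

Because each branch is ternary, the inner products at inference time reduce to sign-selected additions followed by one multiply per branch, which is what makes mapping onto ternary dot product engines attractive.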