Title
Reduced Softmax Unit for Deep Neural Network Accelerators
Authors
Abstract
The Softmax activation layer is a very popular Deep Neural Network (DNN) component when dealing with multi-class prediction problems. However, in DNN accelerator implementations it introduces additional complexity, since the exponential of each of its inputs must be computed. In this brief, we propose a simplified version of the activation unit for accelerators, in which a single comparator unit produces the classification result by selecting the maximum among its inputs. Due to the nature of the activation function, we show that this result is always identical to the classification produced by the Softmax layer.
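The equivalence claimed in the abstract follows from the fact that the exponential is strictly increasing, so Softmax preserves the ordering of its inputs and the largest logit always yields the largest probability. A minimal sketch illustrating this (plain Python, not the paper's hardware implementation):

```python
import math

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def argmax(values):
    # Index of the maximum value -- all a comparator unit needs to compute.
    return max(range(len(values)), key=lambda i: values[i])

# Because exp() is strictly increasing, the ordering of the logits is
# preserved by Softmax, so comparing the raw logits gives the same
# predicted class as applying the full Softmax layer first.
logits = [1.3, -0.7, 4.2, 0.5]
assert argmax(logits) == argmax(softmax(logits))
```

A hardware comparator over the raw logits therefore suffices whenever only the predicted class (and not the probability distribution itself) is needed.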