Paper Title

Memory Organization for Energy-Efficient Learning and Inference in Digital Neuromorphic Accelerators

Authors

Clemens J. S. Schaefer, Patrick Faley, Emre O. Neftci, Siddharth Joshi

Abstract


The energy efficiency of neuromorphic hardware is strongly affected by the energy of storing, accessing, and updating synaptic parameters. Various methods of memory organization targeting energy-efficient digital accelerators have been investigated in the past; however, they do not completely encapsulate the energy costs at a system level. To address this shortcoming and to account for various overheads, we synthesize the controller and memory for different encoding schemes and extract the energy costs from these synthesized blocks. Additionally, we introduce functional encoding for structured connectivity, such as the connectivity in convolutional layers. Functional encoding offers a 58% reduction in the energy to implement a backward pass and weight update in such layers compared to existing index-based solutions. We show that for a two-layer spiking neural network trained to retain a spatio-temporal pattern, a bitmap-based (PB-BMP) organization can encode the sparser networks more efficiently. This form of encoding delivers a 1.37x improvement in energy efficiency at the cost of a 4% degradation in network retention accuracy, as measured by the van Rossum distance.
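The core idea behind functional encoding is that in structured layers such as convolutions, the address of a synaptic weight can be computed arithmetically from the pre- and post-synaptic coordinates, rather than stored per connection as in index-based schemes. The sketch below is illustrative only (the function names and the stride-1, "valid" convolution assumption are ours, not the paper's hardware design):

```python
def functional_weight_address(out_ch, in_ch, ky, kx, kernel_size, in_channels):
    """Flat address of weight w[out_ch][in_ch][ky][kx] in a dense weight array.

    Because the address is a pure function of the connection's coordinates,
    no per-synapse pointer needs to be stored, unlike index-based encodings.
    """
    return ((out_ch * in_channels + in_ch) * kernel_size + ky) * kernel_size + kx


def conv_targets(ix, iy, in_dim, kernel_size):
    """Enumerate output positions (ox, oy) and kernel taps (ky, kx) reached by
    an input spike at (ix, iy), for a stride-1 convolution with no padding.

    On a spike event, a controller can iterate this small loop instead of
    walking a stored connectivity list.
    """
    out_dim = in_dim - kernel_size + 1
    for ky in range(kernel_size):
        for kx in range(kernel_size):
            ox, oy = ix - kx, iy - ky
            if 0 <= ox < out_dim and 0 <= oy < out_dim:
                yield ox, oy, ky, kx
```

For a 5x5 input and a 3x3 kernel, a central input pixel fans out to all nine kernel taps, while a corner pixel reaches only one; the address generator replaces a connection table with a few multiplies and adds, which is where the energy saving over index-based storage comes from.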
