Paper Title
Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks
Paper Authors
Paper Abstract
Spiking neural networks (SNNs) are well suited for spatio-temporal learning and implementations on energy-efficient event-driven neuromorphic processors. However, existing SNN error backpropagation (BP) methods lack proper handling of spiking discontinuities and suffer from low performance compared with the BP methods for traditional artificial neural networks. In addition, a large number of time steps are typically required to achieve decent performance, leading to high latency and rendering spike-based computation unscalable to deep architectures. We present a novel Temporal Spike Sequence Learning Backpropagation (TSSL-BP) method for training deep SNNs, which breaks down error backpropagation across two types of inter-neuron and intra-neuron dependencies and leads to improved temporal learning precision. It captures inter-neuron dependencies through presynaptic firing times by considering the all-or-none characteristics of firing activities and captures intra-neuron dependencies by handling the internal evolution of each neuronal state in time. TSSL-BP efficiently trains deep SNNs within a much shortened temporal window of a few steps while improving the accuracy for various image classification datasets including CIFAR10.
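The decomposition named in the abstract, error backpropagation split across inter-neuron and intra-neuron dependencies anchored at firing times, can be sketched as a chain-rule split. The snippet below uses illustrative notation (u_j[t] for the membrane potential of neuron j, a_j[t] for the postsynaptic current produced by its output spikes, t_m and t_p for its firing times) that follows common spiking-neuron formulations rather than the paper's exact equations, so it should be read as an interpretation of the abstract, not the method itself.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Hedged sketch of the dependency split described in the abstract.
% Assumed (illustrative) notation, not necessarily the paper's own:
%   u_j[t]    membrane potential of neuron j at time step t
%   a_j[t]    postsynaptic current generated by neuron j's output spikes
%   t_m, t_p  firing times of neuron j, with t_m < t_p <= t
\[
\frac{\partial a_j[t]}{\partial u_j[t_m]}
  = \underbrace{\frac{\partial a_j[t]}{\partial t_m}\,
                \frac{\partial t_m}{\partial u_j[t_m]}}_{\text{inter-neuron: through the emitted spike}}
  + \underbrace{\sum_{t_m < t_p \le t}
                \frac{\partial a_j[t]}{\partial t_p}\,
                \frac{\partial t_p}{\partial u_j[t_p]}\,
                \frac{\partial u_j[t_p]}{\partial u_j[t_m]}}_{\text{intra-neuron: through the neuron's own state}}
\]
\end{document}

In this reading, error reaches a neuron only through the time steps at which it actually fired, which respects the all-or-none character of spiking, while the remaining credit flows through the evolution of the neuron's own state between those firings, matching the two dependency types described in the abstract.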