Paper Title

BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning

Paper Authors

Saeed Reza Kheradpisheh, Maryam Mirsadeghi, Timothée Masquelier

Paper Abstract

We recently proposed the S4NN algorithm, essentially an adaptation of backpropagation to multilayer spiking neural networks that use simple non-leaky integrate-and-fire neurons and a form of temporal coding known as time-to-first-spike coding. With this coding scheme, neurons fire at most once per stimulus, but the firing order carries information. Here, we introduce BS4NN, a modification of S4NN in which the synaptic weights are constrained to be binary (+1 or -1), in order to decrease memory (ideally, one bit per synapse) and computation footprints. This was done using two sets of weights: firstly, real-valued weights, updated by gradient descent, and used in the backward pass of backpropagation, and secondly, their signs, used in the forward pass. Similar strategies have been used to train (non-spiking) binarized neural networks. The main difference is that BS4NN operates in the time domain: spikes are propagated sequentially, and different neurons may reach their threshold at different times, which increases computational power. We validated BS4NN on two popular benchmarks, MNIST and Fashion-MNIST, and obtained reasonable accuracies for this sort of network (97.0% and 87.3% respectively) with a negligible accuracy drop with respect to real-valued weights (0.4% and 0.7%, respectively). We also demonstrated that BS4NN outperforms a simple BNN with the same architectures on those two datasets (by 0.2% and 0.9% respectively), presumably because it leverages the temporal dimension. The source codes of the proposed BS4NN are publicly available at https://github.com/SRKH/BS4NN.
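To make the two core ideas in the abstract concrete, below is a minimal NumPy sketch of (a) binarizing real-valued weights to their signs for the forward pass and (b) a layer of non-leaky integrate-and-fire neurons using time-to-first-spike coding, where each neuron fires at most once. This is an illustration of the described mechanism, not the authors' implementation; the function names (`binarize`, `if_layer_forward`), the threshold value, and the discrete time grid `t_max` are assumptions chosen for the example.

```python
import numpy as np

def binarize(w_real):
    # Forward-pass weights are just the signs (+1 / -1) of the real-valued
    # weights that gradient descent keeps updating (illustrative sketch).
    return np.where(w_real >= 0, 1.0, -1.0)

def if_layer_forward(in_spike_times, w_bin, threshold, t_max):
    # One layer of non-leaky integrate-and-fire neurons with
    # time-to-first-spike coding: each neuron fires at most once, at the
    # first discrete step where its accumulated potential crosses `threshold`.
    #   in_spike_times : (n_in,) spike time of each input (t_max = never fired)
    #   w_bin          : (n_out, n_in) binarized weights, entries +1 / -1
    # Returns (n_out,) output spike times.
    n_out, _ = w_bin.shape
    potentials = np.zeros(n_out)
    out_times = np.full(n_out, t_max)
    for t in range(t_max):
        arrived = in_spike_times == t                  # inputs spiking at step t
        potentials += w_bin[:, arrived].sum(axis=1)    # integrate, no leak
        fired = (potentials >= threshold) & (out_times == t_max)
        out_times[fired] = t                           # keep first spike only
    return out_times

# Toy usage: 4 inputs spiking at different times, 3 output neurons.
rng = np.random.default_rng(0)
w_real = rng.normal(size=(3, 4))   # real-valued weights (used in the backward pass)
w_bin = binarize(w_real)           # their signs (used in the forward pass)
out = if_layer_forward(np.array([0, 2, 1, 5]), w_bin, threshold=1.0, t_max=5)
print(out)                         # earlier spike time = stronger response
```

Consistent with the abstract, training would compute gradients with the real-valued weights in the backward pass, update `w_real`, and re-binarize it to `w_bin` before the next forward pass.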
