Paper Title

Self-organization of multi-layer spiking neural networks

Authors

Guruprasad Raghavan, Cong Lin, Matt Thomson

Abstract

Living neural networks in our brains autonomously self-organize into large, complex architectures during early development to result in an organized and functional organic computational device. A key mechanism that enables the formation of complex architecture in the developing brain is the emergence of traveling spatio-temporal waves of neuronal activity across the growing brain. Inspired by this strategy, we attempt to efficiently self-organize large neural networks with an arbitrary number of layers into a wide variety of architectures. To achieve this, we propose a modular tool-kit in the form of a dynamical system that can be seamlessly stacked to assemble multi-layer neural networks. The dynamical system encapsulates the dynamics of spiking units, their inter/intra layer interactions as well as the plasticity rules that control the flow of information between layers. The key features of our tool-kit are (1) autonomous spatio-temporal waves across multiple layers triggered by activity in the preceding layer and (2) Spike-timing dependent plasticity (STDP) learning rules that update the inter-layer connectivity based on wave activity in the connecting layers. Our framework leads to the self-organization of a wide variety of architectures, ranging from multi-layer perceptrons to autoencoders. We also demonstrate that emergent waves can self-organize spiking network architecture to perform unsupervised learning, and networks can be coupled with a linear classifier to perform classification on classic image datasets like MNIST. Broadly, our work shows that a dynamical systems framework for learning can be used to self-organize large computational devices.
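The spike-timing dependent plasticity (STDP) rule mentioned in the abstract can be illustrated with a minimal pairwise sketch: a synaptic weight is strengthened when a presynaptic spike precedes the postsynaptic spike and weakened otherwise, with an exponential dependence on the timing difference. This is a generic textbook form of STDP, not the paper's actual implementation; the parameter values (`a_plus`, `a_minus`, `tau`) are illustrative.

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pairwise STDP update (illustrative, not the paper's exact rule).

    Potentiates the weight when the presynaptic spike precedes the
    postsynaptic spike (causal pairing), depresses it otherwise,
    with an exponential kernel of time constant `tau` (ms).
    """
    dt = t_post - t_pre  # spike-timing difference in ms
    if dt > 0:
        dw = a_plus * np.exp(-dt / tau)   # pre before post -> strengthen
    else:
        dw = -a_minus * np.exp(dt / tau)  # post before pre -> weaken
    return float(np.clip(w + dw, 0.0, 1.0))

# Causal pairing: presynaptic spike at 10 ms, postsynaptic at 15 ms
w_up = stdp_update(0.5, t_pre=10.0, t_post=15.0)    # weight increases
# Anti-causal pairing: postsynaptic spike first
w_down = stdp_update(0.5, t_pre=15.0, t_post=10.0)  # weight decreases
```

In the paper's setting, updates of this flavor are driven by the relative timing of spikes carried by traveling waves in adjacent layers, which is what shapes the inter-layer connectivity during self-organization.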
