Paper Title

Searching for long time scales without fine tuning

Paper Authors

Xiaowen Chen, William Bialek

Paper Abstract

Most of animal and human behavior occurs on time scales much longer than the response times of individual neurons. In many cases, it is plausible that these long time scales emerge from the recurrent dynamics of electrical activity in networks of neurons. In linear models, time scales are set by the eigenvalues of a dynamical matrix whose elements measure the strengths of synaptic connections between neurons. It is not clear to what extent these matrix elements need to be tuned in order to generate long time scales; in some cases, one needs not just a single long time scale but a whole range. Starting from the simplest case of random symmetric connections, we combine maximum entropy and random matrix theory methods to construct ensembles of networks, exploring the constraints required for long time scales to become generic. We argue that a single long time scale can emerge generically from realistic constraints, but a full spectrum of slow modes requires more tuning. Langevin dynamics that generates patterns of synaptic connections drawn from these ensembles involves a combination of Hebbian learning and activity-dependent synaptic scaling.
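To make the eigenvalue-to-time-scale connection in the abstract concrete, here is a minimal numerical sketch. It assumes a standard linear rate model, tau0 * dx/dt = -x + J x with symmetric connections J, which is my own illustrative choice rather than the paper's exact construction; the matrix normalization and the `gain` parameter below are likewise assumptions for illustration. Each eigenmode of J relaxes with time constant tau_a = tau0 / (1 - lambda_a), so a long time scale requires an eigenvalue tuned close to 1.

import numpy as np

# Illustrative sketch only (assumed linear rate model, not necessarily the
# paper's exact equations):  tau0 * dx/dt = -x + J x  with symmetric J.
# Each eigenmode relaxes with time constant tau_a = tau0 / (1 - lambda_a),
# so long time scales need eigenvalues of J tuned close to 1.

def random_symmetric_J(n, gain, rng):
    """Random symmetric connectivity, rescaled so that its largest
    eigenvalue magnitude equals `gain` (an assumed tuning knob)."""
    A = rng.normal(size=(n, n))
    J = 0.5 * (A + A.T)
    return J * (gain / np.max(np.abs(np.linalg.eigvalsh(J))))

def mode_time_constants(J, tau0=1.0):
    """Relaxation times of the stable linear dynamics tau0*dx/dt = -x + Jx."""
    lam = np.linalg.eigvalsh(J)      # real spectrum, since J is symmetric
    return tau0 / (1.0 - lam)        # finite as long as all lam < 1

rng = np.random.default_rng(0)
n, tau0 = 500, 1.0
for gain in (0.9, 0.99, 0.999):
    taus = mode_time_constants(random_symmetric_J(n, gain, rng), tau0)
    print(f"gain={gain}: slowest tau = {taus.max():9.1f} * tau0, "
          f"median tau = {np.median(taus):.2f} * tau0")

# The slowest mode diverges as the leading eigenvalue approaches 1, while the
# bulk of modes stays fast: one long time scale needs one eigenvalue near 1,
# and a whole range of slow modes needs many eigenvalues crowded near 1.

In this toy setting, the paper's maximum entropy and random matrix ensembles would correspond to constraining statistics of J rather than hand-setting `gain`; the sketch is only meant to show why the spectral edge controls the slow time scales and where the tuning question arises.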
