Paper Title

Large Deviations Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions

Authors

Alexander van Meegen, Tobias Kühn, Moritz Helias

Abstract

We here unify the field theoretical approach to neuronal networks with large deviations theory. For a prototypical random recurrent network model with continuous-valued units, we show that the effective action is identical to the rate function and derive the latter using field theory. This rate function takes the form of a Kullback-Leibler divergence which enables data-driven inference of model parameters and calculation of fluctuations beyond mean-field theory. Lastly, we expose a regime with fluctuation-induced transitions between mean-field solutions.
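The prototypical model alluded to in the abstract can be illustrated with a minimal simulation sketch. The sketch below assumes a standard continuous-valued rate network with Gaussian random couplings of variance g²/N and tanh nonlinearity, integrated with the Euler-Maruyama scheme; the specific parameter values and the noise amplitude `sigma` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_network(N=200, g=1.5, sigma=0.1, T=50.0, dt=0.01, seed=0):
    """Simulate a random recurrent rate network (illustrative sketch):
        dx_i/dt = -x_i + sum_j J_ij * tanh(x_j) + noise,
    with couplings J_ij ~ N(0, g^2/N). Model details are assumptions
    based on the common prototypical setup, not the paper's exact spec.
    Returns the trajectory array of shape (steps, N)."""
    rng = np.random.default_rng(seed)
    # Random coupling matrix with variance g^2 / N per entry
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    steps = int(T / dt)
    x = rng.normal(0.0, 1.0, size=N)  # random initial condition
    traj = np.empty((steps, N))
    for t in range(steps):
        drift = -x + J @ np.tanh(x)
        # Euler-Maruyama step: deterministic drift plus white noise
        x = x + dt * drift + np.sqrt(dt) * sigma * rng.normal(size=N)
        traj[t] = x
    return traj
```

Empirical statistics of such trajectories (e.g. autocorrelations across realizations of J) are the kind of data to which the rate function, in its Kullback-Leibler form, could in principle be fitted for parameter inference.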
