Paper Title

Advantages of biologically-inspired adaptive neural activation in RNNs during learning

Authors

Victor Geadah, Giancarlo Kerg, Stefan Horoi, Guy Wolf, Guillaume Lajoie

Abstract

Dynamic adaptation in single-neuron response plays a fundamental role in neural coding in biological neural networks. Yet, most neural activation functions used in artificial networks are fixed and mostly considered as an inconsequential architecture choice. In this paper, we investigate nonlinear activation function adaptation over the large time scale of learning, and outline its impact on sequential processing in recurrent neural networks. We introduce a novel parametric family of nonlinear activation functions, inspired by input-frequency response curves of biological neurons, which allows interpolation between well-known activation functions such as ReLU and sigmoid. Using simple numerical experiments and tools from dynamical systems and information theory, we study the role of neural activation features in learning dynamics. We find that activation adaptation provides distinct task-specific solutions and in some cases, improves both learning speed and performance. Importantly, we find that optimal activation features emerging from our parametric family are considerably different from typical functions used in the literature, suggesting that exploiting the gap between these usual configurations can help learning. Finally, we outline situations where neural activation adaptation alone may help mitigate changes in input statistics in a given task, suggesting mechanisms for transfer learning optimization.
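To make the idea of a learnable activation family concrete, here is a minimal PyTorch sketch of a two-parameter activation that interpolates between a ReLU-like and a sigmoid-like shape, with the shape parameters trained alongside the network weights. The parameter names `n` (gain) and `s` (saturation mix) and the softplus-sigmoid parametrization below are illustrative assumptions in the spirit of the abstract, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveActivation(nn.Module):
    """Sketch of a two-parameter activation family interpolating between
    ReLU-like and sigmoid-like shapes. `n` and `s` are hypothetical
    parameter names; the paper's exact parametrization may differ."""

    def __init__(self, n_init: float = 1.0, s_init: float = 0.0):
        super().__init__()
        # The activation shape itself is trainable, alongside the weights.
        self.n = nn.Parameter(torch.tensor(n_init))
        self.s = nn.Parameter(torch.tensor(s_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n = F.softplus(self.n) + 1e-3      # keep the gain strictly positive
        s = torch.sigmoid(self.s)          # mixing weight squashed to (0, 1)
        gain = F.softplus(n * x) / n       # tends to ReLU as n grows large
        saturation = torch.sigmoid(n * x)  # gained, saturating sigmoid
        return (1.0 - s) * gain + s * saturation
```

Used as a drop-in replacement for a fixed `tanh` in a vanilla RNN cell (e.g. `h = act(W_h @ h + W_x @ x)`), such a module lets gradient descent adjust each layer's activation shape over the course of training, which is the kind of activation adaptation the abstract studies.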
