Paper Title

Learn to Synchronize, Synchronize to Learn

Authors

Verzelli, Pietro, Alippi, Cesare, Livi, Lorenzo

Abstract

In recent years, the machine learning community has seen a continuously growing interest in research aimed at investigating dynamical aspects of both training procedures and machine learning models. Among recurrent neural networks, the Reservoir Computing (RC) paradigm is of particular interest, characterized by conceptual simplicity and a fast training scheme. Yet, the guiding principles under which RC operates are only partially understood. In this work, we analyze the role played by Generalized Synchronization (GS) when training an RC to solve a generic task. In particular, we show how GS allows the reservoir to correctly encode the system generating the input signal into its dynamics. We also discuss necessary and sufficient conditions for learning to be feasible in this approach. Moreover, we explore the role that ergodicity plays in this process, showing how its presence allows the learning outcome to apply to multiple input trajectories. Finally, we show that satisfaction of GS can be measured by means of the Mutual False Nearest Neighbors index, which makes the theoretical derivations effective for practitioners.
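The RC setting the abstract describes can be illustrated with a minimal echo state network sketch (not the authors' implementation; reservoir size, spectral scaling, and all other constants are illustrative assumptions): a fixed random reservoir is driven by an input signal, generalized synchronization makes the driven state independent of the reservoir's initial condition, and only a linear readout is trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Input signal: a sine wave stands in for the trajectory of the (unknown)
# drive system discussed in the abstract.
T = 1000
u = np.sin(0.1 * np.arange(T + 1))

# Fixed random reservoir: in RC only the readout below is trained.
N = 100                                # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, size=N)  # input weights
W = rng.normal(0.0, 1.0, size=(N, N))
W *= 0.9 / np.linalg.norm(W, 2)        # operator norm 0.9: tanh is 1-Lipschitz,
                                       # so the driven map is a contraction
                                       # (a sufficient echo state condition)

# Drive the reservoir from the zero state and collect states.
x = np.zeros(N)
states = []
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states.append(x.copy())

# Generalized synchronization: a second copy started from a random initial
# state converges to the same driven trajectory.
x2 = rng.normal(0.0, 1.0, size=N)
for t in range(T):
    x2 = np.tanh(W @ x2 + W_in * u[t])
mismatch = np.linalg.norm(x - x2)      # ~0 after the transient

# Train the linear readout by ridge regression for one-step-ahead prediction,
# discarding an initial washout transient.
X = np.array(states[100:])             # (900, N) state matrix
y = u[101:T + 1]                       # next-step targets
lam = 1e-6                             # ridge regularization (illustrative)
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
rmse = np.sqrt(np.mean((X @ W_out - y) ** 2))

print(f"state mismatch: {mismatch:.2e}, prediction RMSE: {rmse:.2e}")
```

Scaling the recurrent matrix to operator norm below one is a deliberately conservative choice here: it guarantees the contraction that makes the two differently initialized copies converge, which is the state-independence property the abstract attributes to GS.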
