Paper Title

Supervised perceptron learning vs unsupervised Hebbian unlearning: Approaching optimal memory retrieval in Hopfield-like networks

Authors

Marco Benedetti, Enrico Ventura, Enzo Marinari, Giancarlo Ruocco, Francesco Zamponi

Abstract

The Hebbian unlearning algorithm, an unsupervised local procedure used to improve the retrieval properties of Hopfield-like neural networks, is numerically compared to a supervised algorithm that trains a linear symmetric perceptron. We analyze the stability of the stored memories: the basins of attraction obtained by the Hebbian unlearning technique are found to be comparable in size to those obtained with the symmetric perceptron, and the two algorithms are found to converge to the same region of Gardner's space of interactions, having followed similar learning paths. A geometric interpretation of Hebbian unlearning is proposed to explain its optimal performance. Because the Hopfield model is also a prototypical model of disordered magnetic systems, it might be possible to translate our results to other models of interest for memory storage in materials.
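To make the procedure concrete, below is a minimal NumPy sketch of the standard Hebbian unlearning loop on a Hopfield network: build the Hebbian couplings, relax from a random configuration to a spurious fixed point, and subtract that fixed point from the couplings. This is a generic illustration of the textbook procedure, not the paper's implementation; the learning rate `eps`, the number of unlearning steps, and the function names are illustrative assumptions.

```python
import numpy as np

def hebbian_couplings(patterns):
    """Hebbian rule J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero diagonal."""
    N = patterns.shape[1]
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def relax(J, s, max_sweeps=100):
    """Asynchronous zero-temperature dynamics until a fixed point is reached."""
    N = len(s)
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(N):
            new = 1 if J[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

def hebbian_unlearning(J, n_steps, eps=0.01, seed=None):
    """Repeatedly relax from random states and subtract the reached (typically
    spurious) attractor from the couplings: J_ij <- J_ij - (eps/N) s_i s_j."""
    rng = np.random.default_rng(seed)
    N = J.shape[0]
    for _ in range(n_steps):
        s = rng.choice([-1, 1], size=N)
        s = relax(J, s)
        J -= eps * np.outer(s, s) / N
        np.fill_diagonal(J, 0.0)
    return J

# Toy usage (illustrative sizes): store random patterns, then apply unlearning.
rng = np.random.default_rng(0)
N, P = 100, 10
patterns = rng.choice([-1, 1], size=(P, N))
J = hebbian_couplings(patterns)
J = hebbian_unlearning(J, n_steps=500, eps=0.01, seed=1)
```

The key point illustrated is the locality and unsupervised nature of the update: each coupling changes only as a function of the two spins it connects, and no stored pattern is consulted during the unlearning steps.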
