Paper Title


Activation Learning by Local Competitions

Paper Authors

Zhou, Hongchao

Paper Abstract


Despite its great success, backpropagation has certain limitations that necessitate the investigation of new learning methods. In this study, we present a biologically plausible local learning rule that improves upon Hebb's well-known proposal and discovers unsupervised features by local competitions among neurons. This simple learning rule enables the creation of a forward learning paradigm called activation learning, in which the output activation (the sum of the squared outputs) of the neural network estimates the likelihood of the input patterns, or "learn more, activate more" in simpler terms. For classification on a few small classical datasets, activation learning performs comparably to backpropagation using a fully connected network, and outperforms backpropagation when there are fewer training samples or unpredictable disturbances. Additionally, the same trained network can be used for a variety of tasks, including image generation and completion. Activation learning also achieves state-of-the-art performance on several real-world datasets for anomaly detection. This new learning paradigm, which has the potential to unify supervised, unsupervised, and semi-supervised learning and is reasonably more resistant to adversarial attacks, deserves in-depth investigation.
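The abstract defines the network's "output activation" as the sum of the squared outputs, which serves as a likelihood estimate for an input pattern. A minimal sketch of that quantity is given below; the single linear layer and the random weights are illustrative assumptions, not the paper's actual architecture or learning rule.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical weights for illustration: 4 inputs -> 8 neurons.
W = rng.standard_normal((8, 4))

def activation(x, W):
    """Output activation as described in the abstract: sum of squared outputs.

    Under activation learning, patterns the network has "learned more"
    should yield a larger value ("learn more, activate more").
    """
    y = W @ x               # layer outputs for input pattern x
    return float(np.sum(y ** 2))

x = rng.standard_normal(4)  # an arbitrary input pattern
score = activation(x, W)    # non-negative by construction
```

Because the score is a sum of squares, it is always non-negative, which is what lets it be read as an (unnormalized) likelihood score, e.g. for ranking inputs in anomaly detection.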
