Paper Title


End-to-End Memristive HTM System for Pattern Recognition and Sequence Prediction

Authors

Zyarah, Abdullah M., Gomez, Kevin, Kudithipudi, Dhireesha

Abstract


Neuromorphic systems that learn and predict from streaming inputs hold significant promise in pervasive edge computing and its applications. In this paper, a neuromorphic system that processes spatio-temporal information on the edge is proposed. Algorithmically, the system is based on hierarchical temporal memory that inherently offers online learning, resiliency, and fault tolerance. Architecturally, it is a full custom mixed-signal design with an underlying digital communication scheme and analog computational modules. Therefore, the proposed system features reconfigurability, real-time processing, low power consumption, and low-latency processing. The proposed architecture is benchmarked to predict on real-world streaming data. The network's mean absolute percentage error on the mixed-signal system is 1.129X lower compared to its baseline algorithm model. This reduction can be attributed to device non-idealities and probabilistic formation of synaptic connections. We demonstrate that the combined effect of Hebbian learning and network sparsity also plays a major role in extending the overall network lifespan. We also illustrate that the system offers 3.46X reduction in latency and 77.02X reduction in power consumption when compared to a custom CMOS digital design implemented at the same technology node. By employing specific low power techniques, such as clock gating, we observe 161.37X reduction in power consumption.
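The abstract credits the combined effect of Hebbian learning, network sparsity, and probabilistic synapse formation for the system's resiliency and lifespan. To make those HTM concepts concrete, below is a minimal sketch of a spatial-pooling step in the style of hierarchical temporal memory: columns form random potential synapses, a small fixed fraction of columns win (sparsity), and winners reinforce synapses that matched the input while weakening the rest (Hebbian update). All names and parameter values (`N_IN`, `P_THRESH`, etc.) are illustrative assumptions, not taken from the paper's hardware design.

```python
import random

random.seed(0)

N_IN, N_COLS, ACTIVE_K = 64, 32, 4   # input bits, columns, winners (sparsity)
P_CONNECT, P_THRESH = 0.5, 0.2       # potential-pool density, connection threshold
P_INC, P_DEC = 0.05, 0.02            # Hebbian permanence increment / decrement

# Each column draws a random potential pool of input bits with random initial
# permanences -- a toy stand-in for the "probabilistic formation of synaptic
# connections" mentioned in the abstract.
pools = [[i for i in range(N_IN) if random.random() < P_CONNECT]
         for _ in range(N_COLS)]
perms = [{i: random.random() for i in pool} for pool in pools]

def step(input_bits, learn=True):
    """One spatial-pooling step: overlap, k-winner selection, Hebbian update."""
    # Overlap = number of connected synapses (permanence above threshold)
    # that coincide with active input bits.
    overlaps = [sum(1 for i, p in perms[c].items()
                    if p >= P_THRESH and input_bits[i])
                for c in range(N_COLS)]
    # Enforce sparsity: only the top-k columns become active.
    winners = sorted(range(N_COLS), key=lambda c: overlaps[c],
                     reverse=True)[:ACTIVE_K]
    if learn:
        for c in winners:  # reinforce matching synapses, weaken the rest
            for i in perms[c]:
                delta = P_INC if input_bits[i] else -P_DEC
                perms[c][i] = min(1.0, max(0.0, perms[c][i] + delta))
    return winners

# A fixed sparse input pattern; repeated presentation lets winning columns
# strengthen their matching synapses (online learning).
x = [1 if i < 16 else 0 for i in range(N_IN)]
for _ in range(10):
    active = step(x)
print(sorted(active))
```

Because only `ACTIVE_K` of `N_COLS` columns update per step, most synapses are untouched on any given input; in a memristive realization this is one intuition behind the lifespan argument, since sparse activity spreads device wear across the array.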
