Paper Title
Event-Driven Visual-Tactile Sensing and Learning for Robots
Paper Authors
Paper Abstract
This work contributes an event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using the NeuTouch and Prophesee event camera) on two robot tasks: container classification and rotational slip detection. On both tasks, we observe good accuracies relative to standard deep learning methods. We have made our visual-tactile datasets freely available to encourage research on multi-modal event-driven robot perception, which we believe is a promising approach towards intelligent, power-efficient robot systems.
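The abstract describes VT-SNN as a multi-modal spiking network that fuses the spike streams produced by NeuTouch and the Prophesee event camera before classification. The sketch below illustrates that fusion idea in plain PyTorch with a simple discrete-time leaky integrate-and-fire (LIF) neuron: two modality-specific spiking encoders whose output spike trains are concatenated and fed to a shared spiking head. The LIF model, layer sizes, input dimensions, and time constants are all illustrative assumptions, not the authors' implementation, and training such a network would additionally require surrogate gradients for the non-differentiable threshold.

```python
# Hypothetical sketch of multi-modal spike fusion (not the paper's code).
import torch
import torch.nn as nn


class LIFLayer(nn.Module):
    """Linear projection followed by a discrete-time LIF neuron."""

    def __init__(self, in_features: int, out_features: int,
                 decay: float = 0.9, threshold: float = 1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.decay = decay          # membrane-potential leak per time step
        self.threshold = threshold  # firing threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, in_features) binary spike tensor
        batch, steps, _ = x.shape
        v = torch.zeros(batch, self.fc.out_features, device=x.device)
        spikes = []
        for t in range(steps):
            v = self.decay * v + self.fc(x[:, t])  # leaky integration
            s = (v >= self.threshold).float()      # emit a spike
            v = v - s * self.threshold             # soft reset after firing
            spikes.append(s)
        return torch.stack(spikes, dim=1)          # (batch, time, out)


class VTSNNSketch(nn.Module):
    """Two modality-specific spiking encoders with a fused spiking head.

    The 156/1024 input sizes and 50/10 encoder widths are placeholders.
    """

    def __init__(self, tactile_dim=156, visual_dim=1024, n_classes=20):
        super().__init__()
        self.tactile_enc = LIFLayer(tactile_dim, 50)
        self.visual_enc = LIFLayer(visual_dim, 10)
        self.head = LIFLayer(60, n_classes)

    def forward(self, tactile, visual):
        # Concatenate the two spike trains along the feature axis.
        fused = torch.cat([self.tactile_enc(tactile),
                           self.visual_enc(visual)], dim=-1)
        out = self.head(fused)  # (batch, time, n_classes)
        return out.sum(dim=1)   # spike count per class as the score


if __name__ == "__main__":
    model = VTSNNSketch()
    tactile = torch.randint(0, 2, (4, 100, 156)).float()  # fake spikes
    visual = torch.randint(0, 2, (4, 100, 1024)).float()
    print(model(tactile, visual).shape)  # torch.Size([4, 20])
```

Reading out the class score as a running spike count is what makes this style of model attractive for fast perception: a prediction can be taken from a partial window of events before the full sequence has arrived, rather than waiting for a complete frame-based forward pass.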