Paper Title
Event-based Stereo Visual Odometry
Paper Authors
Abstract
Event-based cameras are bio-inspired vision sensors whose pixels work independently from each other and respond asynchronously to brightness changes, with microsecond resolution. Their advantages make it possible to tackle challenging scenarios in robotics, such as high-speed and high dynamic range scenes. We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig. Our system follows a parallel tracking-and-mapping approach, where novel solutions to each subproblem (3D reconstruction and camera pose estimation) are developed with two objectives in mind: being principled and efficient, for real-time operation with commodity hardware. To this end, we seek to maximize the spatio-temporal consistency of stereo event-based data while using a simple and efficient representation. Specifically, the mapping module builds a semi-dense 3D map of the scene by fusing depth estimates from multiple local viewpoints (obtained by spatio-temporal consistency) in a probabilistic fashion. The tracking module recovers the pose of the stereo rig by solving a registration problem that naturally arises due to the chosen map and event data representation. Experiments on publicly available datasets and on our own recordings demonstrate the versatility of the proposed method in natural scenes with general 6-DoF motion. The system successfully leverages the advantages of event-based cameras to perform visual odometry in challenging illumination conditions, such as low-light and high dynamic range, while running in real-time on a standard CPU. We release the software and dataset under an open source licence to foster research in the emerging topic of event-based SLAM.
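The abstract states that the mapping module fuses depth estimates from multiple local viewpoints "in a probabilistic fashion." A common way to realize such fusion is to model each estimate as an independent Gaussian over inverse depth and combine them by a product of Gaussians. The sketch below illustrates this idea only; the function name and interface are hypothetical and not taken from the paper's implementation.

```python
# Hypothetical sketch of probabilistic depth fusion: each estimate is an
# independent Gaussian (mean, variance) over inverse depth, and the fused
# result is their product (precisions add, means are precision-weighted).
# This illustrates the general technique, not the paper's actual code.

def fuse_inverse_depth(estimates):
    """Fuse a list of (mean, variance) Gaussian inverse-depth estimates.

    Returns the (mean, variance) of the product-of-Gaussians posterior:
    the fused precision is the sum of individual precisions, and the
    fused mean is the precision-weighted average of the means.
    """
    precision = sum(1.0 / var for _, var in estimates)
    mean = sum(mu / var for mu, var in estimates) / precision
    return mean, 1.0 / precision

# Example: two noisy inverse-depth observations of the same scene point;
# the more confident (lower-variance) one dominates the fused estimate.
fused_mu, fused_var = fuse_inverse_depth([(0.5, 0.04), (0.52, 0.01)])
```

Fusing in inverse-depth space is a standard choice in semi-dense mapping because inverse-depth uncertainty is closer to Gaussian for distant points than depth uncertainty is.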