Paper Title

Category-Agnostic 6D Pose Estimation with Conditional Neural Processes

Paper Authors

Yumeng Li, Ning Gao, Hanna Ziesche, Gerhard Neumann

Abstract


We present a novel meta-learning approach for 6D pose estimation on unknown objects. In contrast to ``instance-level'' and ``category-level'' pose estimation methods, our algorithm learns object representation in a category-agnostic way, which endows it with strong generalization capabilities across object categories. Specifically, we employ a neural process-based meta-learning approach to train an encoder to capture texture and geometry of an object in a latent representation, based on very few RGB-D images and ground-truth keypoints. The latent representation is then used by a simultaneously meta-trained decoder to predict the 6D pose of the object in new images. Furthermore, we propose a novel geometry-aware decoder for the keypoint prediction using a Graph Neural Network (GNN), which explicitly takes geometric constraints specific to each object into consideration. To evaluate our algorithm, extensive experiments are conducted on the \linemod dataset, and on our new fully-annotated synthetic datasets generated from Multiple Categories in Multiple Scenes (MCMS). Experimental results demonstrate that our model performs well on unseen objects with very different shapes and appearances. Remarkably, our model also shows robust performance on occluded scenes although trained fully on data without occlusion. To our knowledge, this is the first work exploring \textbf{cross-category level} 6D pose estimation.
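The abstract describes a conditional-neural-process-style pipeline: an encoder aggregates a few context observations (image features plus ground-truth keypoints) into a single latent representation, and a decoder predicts keypoints for new inputs conditioned on that latent. The sketch below is a minimal, untrained NumPy illustration of this encode-aggregate-decode pattern only; it is not the authors' implementation, and all function names, dimensions, and the random projections are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(context_x, context_y, latent_dim=8):
    """Embed each (observation, keypoint) context pair, then mean-pool
    into one order-invariant latent vector (the CNP aggregation step).
    The random projection W stands in for a learned encoder network."""
    pairs = np.concatenate([context_x, context_y], axis=-1)   # (N, dx+dy)
    W = rng.standard_normal((pairs.shape[-1], latent_dim)) * 0.1
    return np.tanh(pairs @ W).mean(axis=0)                    # (latent_dim,)

def decode(latent, target_x, out_dim=2):
    """Predict keypoint coordinates for each target input, conditioned
    on the shared latent (concatenated to every target feature)."""
    inp = np.concatenate(
        [target_x, np.tile(latent, (target_x.shape[0], 1))], axis=-1)
    W = rng.standard_normal((inp.shape[-1], out_dim)) * 0.1
    return inp @ W                                            # (M, out_dim)

# Toy context set: 3 observations with 4-d features and 2-d keypoints.
ctx_x = rng.standard_normal((3, 4))
ctx_y = rng.standard_normal((3, 2))
z = encode(ctx_x, ctx_y)
pred = decode(z, rng.standard_normal((5, 4)))  # 5 target inputs
print(pred.shape)
```

Mean pooling over context pairs is what makes the latent invariant to the order and number of the few RGB-D context images; in the paper's setting the decoder is additionally a geometry-aware GNN rather than a plain linear map.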
