Paper Title
PCT: Point Cloud Transformer
Paper Authors
Paper Abstract
The irregular domain and lack of ordering make it challenging to design deep neural networks for point cloud processing. This paper presents a novel framework named Point Cloud Transformer (PCT) for point cloud learning. PCT is based on the Transformer architecture, which has achieved great success in natural language processing and shows great potential in image processing. It is inherently permutation-invariant when processing a sequence of points, making it well suited to point cloud learning. To better capture local context within the point cloud, we enhance the input embedding with the support of farthest point sampling and nearest neighbor search. Extensive experiments demonstrate that PCT achieves state-of-the-art performance on shape classification, part segmentation, and normal estimation tasks.
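The abstract mentions that the input embedding is strengthened with farthest point sampling (FPS) and nearest neighbor search to gather local context. The following is a minimal NumPy sketch of those two operations only, on a toy point cloud; the function names, the choice of 256 samples and 32 neighbors, and the brute-force distance computation are illustrative assumptions, not the paper's actual implementation.

```python
# A minimal sketch (not the paper's implementation) of the two sampling steps
# named in the abstract: farthest point sampling to pick representative points,
# and k-nearest-neighbor search to gather a local neighborhood around each one.
import numpy as np

def farthest_point_sampling(points: np.ndarray, n_samples: int) -> np.ndarray:
    """Greedily pick n_samples indices so the chosen points are mutually far apart."""
    n = points.shape[0]
    selected = np.zeros(n_samples, dtype=np.int64)
    dist = np.full(n, np.inf)      # distance to the nearest already-selected point
    selected[0] = 0                # start from an arbitrary point
    for i in range(1, n_samples):
        d = np.linalg.norm(points - points[selected[i - 1]], axis=1)
        dist = np.minimum(dist, d)
        selected[i] = int(np.argmax(dist))
    return selected

def knn_group(points: np.ndarray, centers: np.ndarray, k: int) -> np.ndarray:
    """For each center, return the indices of its k nearest neighbors in `points`."""
    # (n_centers, n_points) pairwise distances; brute force is fine for small clouds
    d = np.linalg.norm(centers[:, None, :] - points[None, :, :], axis=-1)
    return np.argsort(d, axis=1)[:, :k]

if __name__ == "__main__":
    cloud = np.random.rand(1024, 3).astype(np.float32)  # toy point cloud
    idx = farthest_point_sampling(cloud, 256)            # sample 256 center points
    neighbors = knn_group(cloud, cloud[idx], k=32)       # 32 neighbors per center
    print(neighbors.shape)                               # (256, 32)
```

The neighborhood indices gathered this way would then feed a local feature aggregation step before the Transformer layers; that aggregation is not shown here.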