Paper Title

AANet: Adaptive Aggregation Network for Efficient Stereo Matching

Paper Authors

Haofei Xu, Juyong Zhang

Abstract

Despite the remarkable progress made by learning-based stereo matching algorithms, one key challenge remains unsolved: current state-of-the-art stereo models are mostly built on costly 3D convolutions, whose cubic computational complexity and high memory consumption make them expensive to deploy in real-world applications. In this paper, we aim to completely replace the commonly used 3D convolutions to achieve fast inference speed while maintaining comparable accuracy. To this end, we first propose a sparse-points-based intra-scale cost aggregation method to alleviate the well-known edge-fattening issue at disparity discontinuities. Further, we approximate the traditional cross-scale cost aggregation algorithm with neural network layers to handle large textureless regions. Both modules are simple, lightweight, and complementary, leading to an effective and efficient architecture for cost aggregation. With these two modules, we not only significantly speed up existing top-performing models (e.g., $41\times$ faster than GC-Net, $4\times$ faster than PSMNet, and $38\times$ faster than GA-Net), but also improve the performance of fast stereo models (e.g., StereoNet). We also achieve competitive results on the Scene Flow and KITTI datasets while running at 62 ms, demonstrating the versatility and high efficiency of the proposed method. Our full framework is available at https://github.com/haofeixu/aanet .
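
To make the cross-scale idea concrete, below is a minimal PyTorch sketch of aggregating cost volumes across scales with standard layers: strided 3x3 convolutions to move fine costs to coarser scales, and bilinear upsampling plus a 1x1 convolution to move coarse costs to finer scales. The class name, channel sizes, and exact layer choices here are illustrative assumptions, not the authors' released code; the real implementation is in the linked repository.

```python
import torch
import torch.nn as nn


class CrossScaleAggregation(nn.Module):
    """Sketch of cross-scale cost aggregation: each output scale is an
    adaptive sum over cost volumes from all scales (scale 0 = finest)."""

    def __init__(self, channels=(32, 64, 96)):
        super().__init__()
        self.num_scales = len(channels)
        # transforms[i][j] maps the cost volume at scale j to scale i
        self.transforms = nn.ModuleList()
        for i, c_i in enumerate(channels):
            row = nn.ModuleList()
            for j, c_j in enumerate(channels):
                if i == j:
                    row.append(nn.Identity())
                elif j < i:
                    # finer -> coarser: one stride-2 3x3 conv per octave
                    layers, c = [], c_j
                    for _ in range(i - j):
                        layers += [
                            nn.Conv2d(c, c_i, 3, stride=2, padding=1),
                            nn.BatchNorm2d(c_i),
                            nn.ReLU(inplace=True),
                        ]
                        c = c_i
                    row.append(nn.Sequential(*layers))
                else:
                    # coarser -> finer: bilinear upsample, then 1x1 conv
                    row.append(nn.Sequential(
                        nn.Upsample(scale_factor=2 ** (j - i),
                                    mode='bilinear', align_corners=False),
                        nn.Conv2d(c_j, c_i, kernel_size=1),
                    ))
            self.transforms.append(row)

    def forward(self, costs):
        # costs: list of per-scale cost volumes, finest first;
        # costs[s] has shape (N, channels[s], H / 2**s, W / 2**s)
        return [
            sum(self.transforms[i][j](costs[j]) for j in range(self.num_scales))
            for i in range(self.num_scales)
        ]


if __name__ == "__main__":
    agg = CrossScaleAggregation()
    costs = [torch.randn(1, 32, 64, 128),
             torch.randn(1, 64, 32, 64),
             torch.randn(1, 96, 16, 32)]
    print([c.shape for c in agg(costs)])  # same shapes as the inputs
```

The identity / downsample / upsample branching mirrors HRNet-style multi-scale fusion: every output scale adaptively sums evidence from every input scale, which is how large textureless regions at fine resolution can borrow matching evidence from coarser resolutions.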
