Paper Title

CenterNet++ for Object Detection

Authors

Kaiwen Duan, Song Bai, Lingxi Xie, Honggang Qi, Qingming Huang, Qi Tian

Abstract

There are two mainstream approaches to object detection: top-down and bottom-up. The state-of-the-art approaches mostly belong to the first category. In this paper, we demonstrate that the bottom-up approaches are as competitive as the top-down ones and enjoy higher recall. Our approach, named CenterNet, detects each object as a triplet of keypoints (the top-left and bottom-right corners and the center keypoint). We first group the corners by some designed cues and further confirm the objects by the center keypoints. The corner keypoints equip the approach with the ability to detect objects of various scales and shapes, and the center keypoint avoids the confusion brought by a large number of false-positive proposals. Our approach is a kind of anchor-free detector because it does not need to define explicit anchor boxes. We adapt our approach to backbones with different structures, i.e., 'hourglass'-like networks and 'pyramid'-like networks, which detect objects on a single-resolution feature map and multi-resolution feature maps, respectively. On the MS-COCO dataset, CenterNet with Res2Net-101 and Swin-Transformer achieves APs of 53.7% and 57.1%, respectively, outperforming all existing bottom-up detectors and achieving state-of-the-art performance. We also design a real-time CenterNet, which achieves a good trade-off between accuracy and speed with an AP of 43.6% at 30.5 FPS. https://github.com/Duankaiwen/PyCenterNet.
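
The center-keypoint confirmation step described in the abstract can be sketched in a few lines: a box proposed by a grouped (top-left, bottom-right) corner pair is kept only if a detected center keypoint of the same class falls inside the central region of that box. The snippet below is a minimal illustrative sketch, not the authors' implementation; the function names, tuple layouts, and the fixed shrink factor are assumptions for clarity.

```python
# Hypothetical sketch of CenterNet's center-keypoint verification.
# Assumes: boxes as (x1, y1, x2, y2), center keypoints as (x, y, class_id),
# and a fixed shrink factor for the central region (the paper adapts it to box size).

from typing import List, Tuple

Box = Tuple[float, float, float, float]   # (x1, y1, x2, y2)
Keypoint = Tuple[float, float, int]       # (x, y, class_id)


def central_region(box: Box, scale: float = 3.0) -> Box:
    """Return the central region of a box, shrunk by `scale` along each side."""
    x1, y1, x2, y2 = box
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    w, h = (x2 - x1) / scale, (y2 - y1) / scale
    return (cx - w / 2.0, cy - h / 2.0, cx + w / 2.0, cy + h / 2.0)


def confirmed_by_center(box: Box, class_id: int, centers: List[Keypoint]) -> bool:
    """True if a center keypoint of the same class lies inside the central region."""
    rx1, ry1, rx2, ry2 = central_region(box)
    return any(c == class_id and rx1 <= x <= rx2 and ry1 <= y <= ry2
               for x, y, c in centers)


# Usage: filter corner-grouped proposals with detected center keypoints.
proposals = [((10.0, 10.0, 110.0, 90.0), 0), ((5.0, 5.0, 20.0, 18.0), 1)]
centers = [(60.0, 50.0, 0)]               # one center keypoint of class 0
kept = [(b, c) for b, c in proposals if confirmed_by_center(b, c, centers)]
print(kept)                               # only the first proposal survives
```

This mirrors the idea that corner pairs provide recall across scales and shapes, while the center check suppresses false-positive pairings that do not correspond to an actual object.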
