Paper Title

Rethinking Transformer-based Set Prediction for Object Detection

Authors

Zhiqing Sun, Shengcao Cao, Yiming Yang, Kris Kitani

Abstract

DETR is a recently proposed Transformer-based method which views object detection as a set prediction problem and achieves state-of-the-art performance, but demands extra-long training time to converge. In this paper, we investigate the causes of the optimization difficulty in the training of DETR. Our examination reveals several factors contributing to the slow convergence of DETR, primarily the issues with the Hungarian loss and the Transformer cross-attention mechanism. To overcome these issues, we propose two solutions, namely, TSP-FCOS (Transformer-based Set Prediction with FCOS) and TSP-RCNN (Transformer-based Set Prediction with RCNN). Experimental results show that the proposed methods not only converge much faster than the original DETR, but also significantly outperform DETR and other baselines in terms of detection accuracy.
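The Hungarian loss named in the abstract rests on a one-to-one matching between predicted and ground-truth objects, computed before any loss is summed. The following is a minimal pure-Python sketch of that idea, not the authors' code: it uses brute-force enumeration instead of the actual Hungarian algorithm, and a simplified L1 box cost (real implementations such as DETR's also include classification and IoU terms).

```python
# Sketch of set-prediction matching: find the one-to-one assignment of
# predictions to ground truths that minimizes the total matching cost.
from itertools import permutations

def l1_cost(pred, gt):
    # Simplified matching cost: L1 distance between two (cx, cy, w, h) boxes.
    return sum(abs(p - g) for p, g in zip(pred, gt))

def min_cost_match(preds, gts):
    """Brute-force minimum-cost one-to-one matching (fine only for tiny N;
    the Hungarian algorithm does this in polynomial time).

    Returns (best_total_cost, assignment), where assignment[i] is the index
    of the prediction matched to ground-truth object i.
    """
    best_cost, best_perm = float("inf"), None
    for perm in permutations(range(len(preds)), len(gts)):
        cost = sum(l1_cost(preds[p], gts[i]) for i, p in enumerate(perm))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_cost, best_perm

# Hypothetical example: two predicted boxes and two ground-truth boxes.
preds = [(0.5, 0.5, 0.2, 0.2), (0.1, 0.1, 0.3, 0.3)]
gts   = [(0.1, 0.1, 0.3, 0.3), (0.5, 0.5, 0.2, 0.2)]
cost, assign = min_cost_match(preds, gts)
print(cost, assign)  # the optimal matching swaps the pairs: 0.0 (1, 0)
```

Because the matching is recomputed every iteration from the current (initially noisy) predictions, the supervision signal assigned to each prediction can flip between targets early in training, which is one source of the slow convergence the paper analyzes.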
