Paper Title
SCNet: Training Inference Sample Consistency for Instance Segmentation
Paper Authors
Paper Abstract
Cascaded architectures have brought significant performance improvements in object detection and instance segmentation. However, there is a lingering disparity in the Intersection-over-Union (IoU) distribution of the samples between training and inference, which can degrade detection accuracy. This paper proposes an architecture referred to as the Sample Consistency Network (SCNet) to ensure that the IoU distribution of the samples at training time is close to that at inference time. Furthermore, SCNet incorporates feature relay and utilizes global contextual information to further reinforce the reciprocal relationships among the classification, detection, and segmentation sub-tasks. Extensive experiments on the standard COCO dataset demonstrate the effectiveness of the proposed method across multiple evaluation metrics, including box AP, mask AP, and inference speed. In particular, while running 38\% faster, the proposed SCNet improves box and mask AP by 1.3 and 2.3 points, respectively, compared to the strong Cascade Mask R-CNN baseline. Code is available at \url{https://github.com/thangvubk/SCNet}.
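To make the sample-consistency idea in the abstract concrete, the following is a minimal, hypothetical PyTorch-style sketch, not the authors' implementation (the official code builds on mmdetection and is available at the repository above). The module names (DummyBoxStage, DummyMaskHead), the roi_align_stub helper, and all shapes are illustrative placeholders; the only point shown is that the mask head is fed the boxes refined by the final cascade stage in both training and inference, so its input sample distribution stays consistent between the two.

# Conceptual sketch only; all names below are placeholders, not the official SCNet API.
import torch
import torch.nn as nn


class DummyBoxStage(nn.Module):
    """Stand-in for one cascade box-refinement stage."""

    def __init__(self, dim=256):
        super().__init__()
        self.fc = nn.Linear(dim, 4)

    def forward(self, roi_feats, boxes):
        # Predict per-RoI box deltas and "refine" the boxes (placeholder math).
        deltas = self.fc(roi_feats.mean(dim=(2, 3)))
        return boxes + deltas


class DummyMaskHead(nn.Module):
    """Stand-in for the single mask head attached after the last box stage."""

    def __init__(self, dim=256):
        super().__init__()
        self.conv = nn.Conv2d(dim, 1, kernel_size=1)

    def forward(self, roi_feats):
        return self.conv(roi_feats)


def roi_align_stub(features, boxes, out_size=14):
    # Placeholder for RoIAlign: returns fixed-size per-RoI features.
    n = boxes.shape[0]
    return features.new_zeros(n, features.shape[1], out_size, out_size) + features.mean()


def forward_cascade(features, proposals, stages, mask_head, training=True):
    """Run the box cascade, then feed the final-stage boxes to the mask head.

    `training` is deliberately unused: the same path is taken in both modes,
    which is the consistency property the abstract describes (the mask head
    sees the same kind of samples at training time as at inference time).
    """
    boxes = proposals
    for stage in stages:
        roi_feats = roi_align_stub(features, boxes)
        boxes = stage(roi_feats, boxes)

    # In training and inference alike, masks are computed on the final boxes.
    mask_feats = roi_align_stub(features, boxes)
    masks = mask_head(mask_feats)
    return boxes, masks


if __name__ == "__main__":
    feats = torch.randn(1, 256, 64, 64)   # toy backbone/FPN feature map
    props = torch.randn(100, 4)           # toy region proposals
    stages = nn.ModuleList(DummyBoxStage() for _ in range(3))
    mask_head = DummyMaskHead()
    boxes, masks = forward_cascade(feats, props, stages, mask_head)
    print(boxes.shape, masks.shape)       # (100, 4) and (100, 1, 14, 14)

The feature-relay and global-context components mentioned in the abstract are omitted here; they would add connections from the box branch and an image-level context branch into the mask head, as described in the paper and the official repository.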