Paper Title

Efficient Scale-Permuted Backbone with Learned Resource Distribution

Paper Authors

Xianzhi Du, Tsung-Yi Lin, Pengchong Jin, Yin Cui, Mingxing Tan, Quoc Le, Xiaodan Song

Paper Abstract

Recently, SpineNet has demonstrated promising results on object detection and image classification over ResNet models. However, it is unclear whether the improvements add up when combining a scale-permuted backbone with advanced efficient operations and compound scaling. Furthermore, SpineNet is built with a uniform resource distribution over operations. While this strategy seems prevalent for scale-decreased models, it may not be the optimal design for scale-permuted models. In this work, we propose a simple technique to combine efficient operations and compound scaling with a previously learned scale-permuted architecture. We demonstrate that the efficiency of scale-permuted models can be further improved by learning a resource distribution over the entire network. The resulting efficient scale-permuted models outperform state-of-the-art EfficientNet-based models on object detection and achieve competitive performance on image classification and semantic segmentation. Code and models will be open-sourced soon.
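For readers unfamiliar with the compound scaling mentioned in the abstract, here is a minimal sketch of the EfficientNet-style rule that the paper combines with the scale-permuted backbone. The coefficients alpha, beta, and gamma are the base values reported for EfficientNet (chosen so that alpha * beta^2 * gamma^2 is roughly 2, i.e. each compound step about doubles FLOPs); the function itself is an illustrative assumption, not code from this paper.

```python
def compound_scale(phi: int, alpha: float = 1.2, beta: float = 1.1, gamma: float = 1.15):
    """Return (depth, width, resolution) multipliers for compound coefficient phi.

    A single coefficient phi scales network depth, channel width, and input
    resolution together, instead of tuning each dimension independently.
    """
    depth_mult = alpha ** phi       # number of layers grows as alpha^phi
    width_mult = beta ** phi        # channels per layer grow as beta^phi
    resolution_mult = gamma ** phi  # input image size grows as gamma^phi
    return depth_mult, width_mult, resolution_mult

# Scaling up by two compound steps grows all three dimensions jointly.
d, w, r = compound_scale(phi=2)
```

The key design point is that one knob (phi) trades off accuracy against compute, while the ratios between depth, width, and resolution stay fixed; this paper's contribution is to additionally learn how that compute budget is distributed across blocks of a scale-permuted network.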
