Paper Title

Improve Ranking Correlation of Super-net through Training Scheme from One-shot NAS to Few-shot NAS

Authors

Jiawei Liu, Kaiyu Zhang, Weitai Hu, Qing Yang

Abstract

One-shot neural architecture search (NAS) algorithms have been widely used to reduce computational cost. However, because of interference among subnets that share weights, the subnets inherited from a super-net trained by these algorithms show poor consistency in accuracy ranking. To address this problem, we propose a step-by-step super-net training scheme that moves from one-shot NAS to few-shot NAS. In this scheme, we first train the super-net in a one-shot way, and then disentangle its weights by splitting it into multiple sub-super-nets and training them gradually. Our method ranked 4th in Track 1 of the CVPR 2022 3rd Lightweight NAS Challenge. Our code is available at https://github.com/liujiawei2333/CVPR2022-NAS-competition-Track-1-4th-solution.
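
For illustration, below is a minimal, self-contained sketch of the two-phase scheme described in the abstract (one-shot single-path training, followed by a few-shot split of the super-net into sub-super-nets), written in PyTorch. The toy SuperNet, the choice of split layer, the random data, and all hyper-parameters are assumptions for demonstration only, not the authors' implementation; see the linked repository for the actual code.

```python
# Sketch of one-shot -> few-shot super-net training (illustrative, not the authors' code).
import copy
import random
import torch
import torch.nn as nn

class ChoiceBlock(nn.Module):
    """One searchable layer: several candidate ops with the same input/output shape."""
    def __init__(self, channels, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.ops = nn.ModuleList(
            [nn.Conv2d(channels, channels, k, padding=k // 2) for k in kernel_sizes]
        )
        self.allowed = list(range(len(kernel_sizes)))  # choices this copy may sample

    def forward(self, x, choice):
        return self.ops[choice](x)

class SuperNet(nn.Module):
    """Toy weight-sharing super-net: a stem, a few choice blocks, and a classifier head."""
    def __init__(self, channels=16, num_layers=3, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList([ChoiceBlock(channels) for _ in range(num_layers)])
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x, arch):
        x = self.stem(x)
        for block, choice in zip(self.blocks, arch):
            x = torch.relu(block(x, choice))
        return self.head(x.mean(dim=(2, 3)))

    def sample_arch(self):
        # Sample one candidate per layer, restricted to each block's allowed set.
        return [random.choice(b.allowed) for b in self.blocks]

def train_one_shot(net, steps=100):
    """Phase 1: single-path one-shot training with a randomly sampled subnet per step."""
    opt = torch.optim.SGD(net.parameters(), lr=0.01)
    criterion = nn.CrossEntropyLoss()
    for _ in range(steps):
        x = torch.randn(8, 3, 32, 32)          # stand-in for a real training batch
        y = torch.randint(0, 10, (8,))
        loss = criterion(net(x, net.sample_arch()), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

def split_to_few_shot(net, layer_idx=0):
    """Phase 2: disentangle the shared weights by copying the one-shot super-net and
    restricting the split layer of each copy to a single candidate op."""
    subs = []
    for choice in net.blocks[layer_idx].allowed:
        sub = copy.deepcopy(net)                 # each copy inherits the one-shot weights
        sub.blocks[layer_idx].allowed = [choice]  # this copy only samples `choice` here
        subs.append(sub)
    return subs

if __name__ == "__main__":
    supernet = SuperNet()
    train_one_shot(supernet)                     # one-shot phase
    for sub in split_to_few_shot(supernet):      # few-shot phase: train each sub-super-net
        train_one_shot(sub, steps=50)
```

In few-shot NAS the split is typically applied to one layer (or a small set of layers) at a time so the number of sub-super-nets stays manageable; the sketch above splits only the first choice block.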
