Paper Title

Discovering Robust Convolutional Architecture at Targeted Capacity: A Multi-Shot Approach

Authors

Xuefei Ning, Junbo Zhao, Wenshuo Li, Tianchen Zhao, Yin Zheng, Huazhong Yang, Yu Wang

Abstract

Convolutional neural networks (CNNs) are vulnerable to adversarial examples, and studies show that increasing the model capacity of an architecture topology (e.g., width expansion) can bring consistent robustness improvements. This reveals a clear robustness-efficiency trade-off that should be considered in architecture design. In this paper, considering scenarios with a capacity budget, we aim to discover adversarially robust architectures at targeted capacities. Recent studies employed one-shot neural architecture search (NAS) to discover robust architectures. However, since the capacities of different topologies cannot be aligned in the search process, one-shot NAS methods favor topologies with larger capacities in the supernet, and the discovered topology might be suboptimal when augmented to the targeted capacity. We propose a novel multi-shot NAS method to address this issue and explicitly search for robust architectures at targeted capacities. At the targeted FLOPs of 2000M, the discovered MSRobNet-2000 outperforms the recent NAS-discovered architecture RobNet-large under various criteria by a large margin of 4%-7%. And at the targeted FLOPs of 1560M, MSRobNet-1560 surpasses another NAS-discovered architecture RobNet-free by 2.3% and 1.3% in the clean and PGD-7 accuracies, respectively. All codes are available at https://github.com/walkerning/aw_nas.
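The abstract's core idea is to compare topologies at an aligned FLOPs budget by evaluating each one in several supernets of different capacities, rather than in a single supernet where larger-capacity topologies are inherently favored. The sketch below illustrates one plausible way to realize this "multi-shot" scoring in Python; it is only an assumption-laden illustration, not the aw_nas implementation, and the helper names (`estimate_flops`, the per-supernet `evaluate` callables, `multi_shot_score`) are hypothetical placeholders.

```python
# Hypothetical sketch of multi-shot scoring at a targeted FLOPs budget.
# Assumption: each supernet (trained at a different width multiplier)
# provides a robust-accuracy evaluator for a candidate topology, e.g.,
# PGD-7 accuracy of the topology instantiated in that supernet.
from typing import Callable, Sequence, Tuple

import numpy as np


def multi_shot_score(
    topology,
    supernets: Sequence[Tuple[float, Callable]],  # (width multiplier, evaluator)
    estimate_flops: Callable[[object, float], float],  # hypothetical FLOPs estimator
    target_flops: float,
) -> float:
    """Estimate a topology's robust accuracy at `target_flops`.

    Evaluate the topology under several capacity settings, then interpolate
    its accuracy-vs-FLOPs curve to the targeted capacity so that different
    topologies are ranked at an aligned FLOPs budget.
    """
    flops, accs = [], []
    for width, evaluate in supernets:
        flops.append(estimate_flops(topology, width))
        accs.append(evaluate(topology))
    order = np.argsort(flops)  # np.interp requires increasing x-coordinates
    return float(np.interp(target_flops,
                           np.asarray(flops)[order],
                           np.asarray(accs)[order]))


# Usage sketch: rank candidate topologies at a 2000M-FLOPs target and
# augment the best one to that capacity for final adversarial training.
# best = max(candidates, key=lambda t: multi_shot_score(
#     t, supernets, estimate_flops, target_flops=2000e6))
```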
