Paper Title

Two-stage architectural fine-tuning with neural architecture search using early-stopping in image classification

Authors

Youngkee Kim, Won Joon Yun, Youn Kyu Lee, Soyi Jung, Joongheon Kim

Abstract

In many deep neural network (DNN) applications, the difficulty of gathering high-quality data in industrial fields hinders the practical use of DNNs. Thus, the concept of transfer learning has emerged, which leverages the pretrained knowledge of DNNs trained on large-scale datasets. Therefore, this paper proposes two-stage architectural fine-tuning inspired by neural architecture search (NAS). One of the main ideas is mutation, which reduces the search cost by exploiting given architectural information. Moreover, early stopping is considered, which cuts NAS costs by terminating the search process in advance. Experimental results verify that our proposed method reduces computational costs by 32.4% and search costs by 22.3%.
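The abstract's two core ideas, mutation-based search seeded from a given architecture and early stopping of the search loop, can be sketched in a toy form. This is a minimal illustration, not the paper's actual method: the architecture encoding (`depth`, `width`), the `evaluate` proxy score, and all parameter values here are hypothetical stand-ins for a real NAS pipeline that would train and validate candidate networks.

```python
import random


def evaluate(arch):
    # Hypothetical proxy score standing in for validation accuracy.
    # A real NAS loop would train/evaluate a candidate network here.
    depth, width = arch
    return -abs(depth - 6) - abs(width - 64) / 16


def mutate(arch):
    # Mutation: perturb one dimension of the given architecture,
    # reusing its information instead of sampling from scratch.
    depth, width = arch
    if random.random() < 0.5:
        depth = max(1, depth + random.choice([-1, 1]))
    else:
        width = max(8, width + random.choice([-8, 8]))
    return (depth, width)


def search(base_arch, budget=200, patience=10):
    """Mutation-based search with early stopping: halt once `patience`
    consecutive candidates fail to improve on the best score."""
    best_arch, best_score = base_arch, evaluate(base_arch)
    stale, steps = 0, 0
    for steps in range(1, budget + 1):
        cand = mutate(best_arch)
        score = evaluate(cand)
        if score > best_score:
            best_arch, best_score, stale = cand, score, 0
        else:
            stale += 1
            if stale >= patience:
                break  # early stopping: terminate the search in advance
    return best_arch, steps


random.seed(0)
arch, used = search((3, 32))
print(arch, used)  # early stopping usually ends the loop well under budget
```

The early-stopping check is what saves search cost: once no mutation improves the best architecture for `patience` consecutive trials, the loop exits instead of exhausting the full `budget`.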
