Paper Title

Unidirectional Thin Adapter for Efficient Adaptation of Deep Neural Networks

Paper Authors

Han Gyel Sun, Hyunjae Ahn, HyunGyu Lee, Injung Kim

Paper Abstract

In this paper, we propose a new adapter network for adapting a pre-trained deep neural network to a target domain with minimal computation. The proposed model, the unidirectional thin adapter (UDTA), helps the classifier adapt to new data by providing auxiliary features that complement the backbone network. UDTA takes the outputs of multiple backbone layers as input but does not transmit any features back to the backbone. As a result, UDTA can be trained without computing gradients for the backbone, which significantly reduces the computation required for training. In addition, since UDTA learns the target task without modifying the backbone, a single backbone can be adapted to multiple tasks simply by training a separate UDTA for each. In experiments on five fine-grained classification datasets consisting of a small number of samples, UDTA significantly reduced the computation and training time required for backpropagation while showing comparable or even improved accuracy compared with conventional adapter models.
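
The mechanism described in the abstract can be illustrated with a minimal PyTorch sketch: a frozen backbone whose intermediate features are tapped and detached (so backpropagation never enters the backbone), plus a thin, trainable side branch and classifier head. This is only an illustration of the idea; the module names (ThinAdapterSketch, UDTAClassifierSketch), the choice of a torchvision ResNet-18 backbone, the tap points, and the adapter width are assumptions for this sketch, not the authors' actual architecture.

```python
# Minimal sketch of a unidirectional thin adapter on a frozen backbone.
# Features flow backbone -> adapter only; gradients never reach the backbone.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class ThinAdapterSketch(nn.Module):
    """Thin branch that consumes detached backbone features (hypothetical)."""
    def __init__(self, in_channels, width=64):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Conv2d(in_channels, width, kernel_size=1),
            nn.BatchNorm2d(width),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, x):
        return self.proj(x).flatten(1)  # (N, width)


class UDTAClassifierSketch(nn.Module):
    def __init__(self, num_classes, width=64):
        super().__init__()
        self.backbone = resnet18(weights="IMAGENET1K_V1")
        for p in self.backbone.parameters():  # backbone stays frozen
            p.requires_grad_(False)
        self.backbone.fc = nn.Identity()  # expose pooled 512-d features

        # Tap several backbone stages; outputs are detached inside the hook,
        # so no backbone gradients are ever computed.
        self._taps = {}
        for name in ("layer2", "layer3", "layer4"):
            getattr(self.backbone, name).register_forward_hook(
                lambda m, inp, out, key=name: self._taps.__setitem__(key, out.detach())
            )
        self.adapters = nn.ModuleDict({
            "layer2": ThinAdapterSketch(128, width),
            "layer3": ThinAdapterSketch(256, width),
            "layer4": ThinAdapterSketch(512, width),
        })
        self.head = nn.Linear(512 + 3 * width, num_classes)

    def forward(self, x):
        with torch.no_grad():            # backbone runs without gradient tracking
            feats = self.backbone(x)     # (N, 512); hooks fill self._taps
        aux = [self.adapters[k](self._taps[k]) for k in ("layer2", "layer3", "layer4")]
        return self.head(torch.cat([feats] + aux, dim=1))


# Only the adapters and the head are trained; adapting the same backbone to
# another task would only require a new adapter/head pair.
model = UDTAClassifierSketch(num_classes=200)
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-3)
```

In this sketch, the optimizer only ever sees adapter and head parameters, which mirrors the abstract's claim that training requires no backbone gradients and that multiple tasks can share one frozen backbone.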
