Paper Title

Improving the affordability of robustness training for DNNs

Paper Authors

Sidharth Gupta, Parijat Dube, Ashish Verma

Paper Abstract

Projected Gradient Descent (PGD) based adversarial training has become one of the most prominent methods for building robust deep neural network models. However, the computational complexity associated with this approach, due to the maximization of the loss function when finding adversaries, is a longstanding problem and may be prohibitive when using larger and more complex models. In this paper we show that the initial phase of adversarial training is redundant and can be replaced with natural training which significantly improves the computational efficiency. We demonstrate that this efficiency gain can be achieved without any loss in accuracy on natural and adversarial test samples. We support our argument with insights on the nature of the adversaries and their relative strength during the training process. We show that our proposed method can reduce the training time by a factor of up to 2.5 with comparable or better model test accuracy and generalization on various strengths of adversarial attacks.
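
The abstract describes replacing the initial epochs of PGD adversarial training with ordinary natural training. Below is a minimal illustrative sketch of that schedule in PyTorch; it is not the authors' implementation, and the switch point (`switch_epoch`), the PGD hyperparameters, and the toy model and data are assumptions made only to keep the example self-contained and runnable.

```python
# Sketch: natural training for the first epochs, then L-infinity PGD adversarial
# training. Illustrative only; switch_epoch and all hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Standard L-infinity PGD: ascend the loss, then project back into the eps-ball."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1).detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()       # gradient ascent step
        x_adv = x + (x_adv - x).clamp(-eps, eps)            # project onto eps-ball
        x_adv = x_adv.clamp(0, 1)                           # keep valid pixel range
    return x_adv.detach()

def train(model, loader, epochs=30, switch_epoch=10, device="cpu"):
    """Natural training for the first `switch_epoch` epochs, PGD training afterwards."""
    opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    for epoch in range(epochs):
        adversarial_phase = epoch >= switch_epoch
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            if adversarial_phase:
                model.eval()
                x = pgd_attack(model, x, y)   # train on adversarial examples
                model.train()
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()

if __name__ == "__main__":
    # Toy data and model so the sketch runs end to end.
    xs = torch.rand(256, 1, 8, 8)
    ys = torch.randint(0, 10, (256,))
    loader = torch.utils.data.DataLoader(
        torch.utils.data.TensorDataset(xs, ys), batch_size=64)
    model = nn.Sequential(nn.Flatten(), nn.Linear(64, 10))
    train(model, loader, epochs=4, switch_epoch=2)
```

Since the PGD inner loop multiplies the cost of each batch by roughly the number of attack steps, skipping it during the early epochs is the source of the up-to-2.5x reduction in training time reported in the abstract.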
