Paper Title
Dynamic Test-Time Augmentation via Differentiable Functions
Paper Authors
Paper Abstract
Distribution shifts, which often occur in the real world, degrade the accuracy of deep learning systems, and thus improving robustness to distribution shifts is essential for practical applications. To improve robustness, we study an image enhancement method that generates recognition-friendly images without retraining the recognition model. We propose a novel image enhancement method, DynTTA, which is based on differentiable data augmentation techniques and generates a blended image from many augmented images to improve the recognition accuracy under distribution shifts. In addition to standard data augmentations, DynTTA also incorporates deep neural network-based image transformation, further improving the robustness. Because DynTTA is composed of differentiable functions, it can be directly trained with the classification loss of the recognition model. In experiments with widely used image recognition datasets using various classification models, DynTTA improves the robustness with almost no reduction in classification accuracy for clean images, thus outperforming the existing methods. Furthermore, the results show that robustness is significantly improved by estimating the training-time augmentations for distribution-shifted datasets using DynTTA and retraining the recognition model with the estimated augmentations. DynTTA is a promising approach for applications that require both clean accuracy and robustness. Our code is available at \url{https://github.com/s-enmt/DynTTA}.
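The core mechanism described in the abstract, blending several differentiably augmented versions of an image and training the blend against the classification loss of a frozen recognition model, can be sketched as follows. This is a minimal illustration, not the authors' DynTTA implementation: it assumes PyTorch and torchvision, a small hand-picked set of differentiable augmentations, a single learnable blend weight vector in place of the per-image augmentation magnitudes DynTTA predicts, and a ResNet-18 recognizer chosen purely for the example.

```python
# Minimal sketch (not the authors' code): blend differentiably augmented
# images with learnable weights and optimize them through the frozen
# recognition model's classification loss.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.transforms.functional as TF
from torchvision.models import resnet18


class BlendedAugmentation(nn.Module):
    """Applies a fixed set of differentiable augmentations and blends the results."""

    def __init__(self, num_augmentations: int = 4):
        super().__init__()
        # Learnable blending logits; DynTTA predicts such quantities per image,
        # here they are static parameters for brevity.
        self.logits = nn.Parameter(torch.zeros(num_augmentations))

    def augmentations(self, x: torch.Tensor) -> list[torch.Tensor]:
        # Each operation is differentiable with respect to the pixel values.
        return [
            x,                                   # identity
            TF.adjust_brightness(x, 1.3),        # brightness
            TF.adjust_contrast(x, 1.3),          # contrast
            TF.gaussian_blur(x, kernel_size=3),  # blur
        ]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.logits, dim=0)
        stacked = torch.stack(self.augmentations(x), dim=0)  # (A, B, C, H, W)
        # Weighted sum over the augmentation axis yields the blended image.
        return (weights.view(-1, 1, 1, 1, 1) * stacked).sum(dim=0)


if __name__ == "__main__":
    # The recognition model stays frozen; only the blending module is trained.
    recognizer = resnet18(num_classes=10).eval()
    for p in recognizer.parameters():
        p.requires_grad_(False)

    blender = BlendedAugmentation()
    optimizer = torch.optim.Adam(blender.parameters(), lr=1e-2)

    images = torch.rand(8, 3, 224, 224)          # stand-in for a distribution-shifted batch
    labels = torch.randint(0, 10, (8,))

    logits = recognizer(blender(images))
    loss = F.cross_entropy(logits, labels)       # classification loss of the frozen recognizer
    loss.backward()                              # gradients flow through the augmentations
    optimizer.step()
```

Because every augmentation in the blend is a differentiable function of the input, the classification loss can be backpropagated end to end into the blending parameters, which is the property the abstract highlights as enabling direct training without retraining the recognition model.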