Paper Title
Sharp Statistical Guarantees for Adversarially Robust Gaussian Classification
Paper Authors
Paper Abstract
Adversarial robustness has become a fundamental requirement in modern machine learning applications. Yet, there has been surprisingly little statistical understanding of it so far. In this paper, we provide the first optimal minimax guarantees on the excess risk of adversarially robust classification, under the Gaussian mixture model proposed by \cite{schmidt2018adversarially}. The results are stated in terms of the Adversarial Signal-to-Noise Ratio (AdvSNR), which generalizes a similar notion for standard linear classification to the adversarial setting. For Gaussian mixtures with AdvSNR value $r$, we establish an excess risk lower bound of order $\Theta(e^{-(\frac{1}{8}+o(1)) r^2} \frac{d}{n})$ and design a computationally efficient estimator that achieves this optimal rate. Our results build on a minimal set of assumptions while covering a wide spectrum of adversarial perturbations, including $\ell_p$ balls for any $p \ge 1$.
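As a rough numerical illustration of the rate in the abstract, the sketch below evaluates the leading-order term $e^{-r^2/8}\,\frac{d}{n}$, dropping the $o(1)$ correction in the exponent and all absolute constants; the function name is hypothetical and not from the paper:

```python
import math

def minimax_excess_risk_rate(r, d, n):
    """Leading-order minimax excess risk rate e^{-r^2/8} * d/n
    from the abstract (the o(1) term in the exponent and absolute
    constants are dropped)."""
    return math.exp(-r ** 2 / 8.0) * d / n

# The rate decays exponentially in the AdvSNR r and scales
# linearly in the dimension-to-sample-size ratio d/n.
print(minimax_excess_risk_rate(0.0, 10, 100))  # 0.1
print(minimax_excess_risk_rate(2.0, 10, 100))  # smaller: exp(-1/2) * 0.1
```

This makes the trade-off in the bound concrete: a larger adversarial signal-to-noise ratio $r$ shrinks the excess risk exponentially, while more dimensions $d$ (or fewer samples $n$) inflate it linearly.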