Paper Title

Analytical bounds on the local Lipschitz constants of affine-ReLU functions

Paper Authors

Trevor Avant, Kristi A. Morgansen

Paper Abstract

In this paper, we determine analytical bounds on the local Lipschitz constants of affine functions composed with rectified linear units (ReLUs). Affine-ReLU functions represent a widely used layer in deep neural networks, due to the fact that convolution, fully-connected, and normalization functions are all affine and are often followed by a ReLU activation function. Using an analytical approach, we mathematically determine upper bounds on the local Lipschitz constant of an affine-ReLU function, show how these bounds can be combined to determine a bound on an entire network, and discuss how the bounds can be efficiently computed, even for larger layers and networks. We show several examples by applying our results to AlexNet, as well as several smaller networks based on the MNIST and CIFAR-10 datasets. The results show that our method produces tighter bounds than the standard conservative bound (i.e., the product of the spectral norms of the layers' linear matrices), especially for small perturbations.
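
For context, the quantity being bounded can be written down directly. Below is a minimal LaTeX sketch of the standard definition of the local Lipschitz constant of an affine-ReLU function f(x) = ReLU(Ax + b) over a ball of radius ε around a nominal input x₀; the notation is standard but is our choice, not necessarily the paper's:

```latex
% Local Lipschitz constant of f on a ball of radius \epsilon around x_0:
L(f, x_0, \epsilon) = \sup_{0 < \|x - x_0\| \le \epsilon}
    \frac{\|f(x) - f(x_0)\|}{\|x - x_0\|},
\qquad f(x) = \mathrm{ReLU}(Ax + b).
```

The sketch below illustrates the baseline the abstract compares against: the product of the spectral norms of the layers' linear matrices, which upper-bounds the Lipschitz constant of the whole network because ReLU is 1-Lipschitz. It also contrasts that bound with a sampled lower estimate of the local constant. This is an illustration, not the paper's method; the layer sizes, the radius `eps`, and the random weights are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two illustrative affine-ReLU layers: f(x) = relu(A2 @ relu(A1 @ x + b1) + b2)
A1, b1 = rng.standard_normal((64, 32)), rng.standard_normal(64)
A2, b2 = rng.standard_normal((16, 64)), rng.standard_normal(16)

def relu(z):
    return np.maximum(z, 0.0)

def f(x):
    return relu(A2 @ relu(A1 @ x + b1) + b2)

# Standard conservative bound: product of the layers' spectral norms
# (largest singular values); valid globally since ReLU is 1-Lipschitz.
conservative = np.linalg.norm(A1, 2) * np.linalg.norm(A2, 2)

# Empirical *lower* estimate of the local Lipschitz constant around x0:
# max over sampled perturbations d of ||f(x0 + d) - f(x0)|| / ||d||.
x0, eps = rng.standard_normal(32), 0.1
ratios = []
for _ in range(1000):
    d = rng.standard_normal(32)
    d *= eps / np.linalg.norm(d)  # place d on the sphere of radius eps
    ratios.append(np.linalg.norm(f(x0 + d) - f(x0)) / eps)

print(f"conservative upper bound: {conservative:.3f}")
print(f"sampled local lower estimate (eps={eps}): {max(ratios):.3f}")
```

Any valid analytical bound must lie at or above the sampled lower estimate; the gap between that estimate and the spectral-norm product gives a rough sense of how much room a tighter local bound has, particularly for small ε.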
