Paper Title
Do ReLU Networks Have An Edge When Approximating Compactly-Supported Functions?
Paper Authors

Paper Abstract
We study the problem of approximating compactly-supported integrable functions while implementing their support set using feedforward neural networks. Our first main result transcribes this "structured" approximation problem into a universality problem. We do this by constructing a refinement of the usual topology on the space $L^1_{\operatorname{loc}}(\mathbb{R}^d,\mathbb{R}^D)$ of locally-integrable functions, in which compactly-supported functions can only be approximated in $L^1$-norm by functions with matching discretized support. We establish the universality of ReLU feedforward networks with bilinear pooling layers in this refined topology. Consequently, we find that ReLU feedforward networks with bilinear pooling can approximate compactly-supported functions while implementing their discretized support. We derive a quantitative uniform version of our universal approximation theorem on the dense subclass of compactly-supported Lipschitz functions. This quantitative result expresses the depth, width, and number of bilinear pooling layers required to construct this ReLU network via the target function's regularity, the metric capacity and diameter of its essential support, and the dimensions of the input and output spaces. Conversely, we show that polynomial regressors and analytic feedforward networks are not universal in this space.
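The architecture the abstract refers to, ReLU feedforward layers combined with a bilinear pooling layer, can be sketched minimally as follows. This is an illustrative NumPy sketch only: the branch structure, layer widths, and the flattened-outer-product form of bilinear pooling are common conventions, not details taken from the paper.

```python
import numpy as np

def relu(x):
    # Componentwise ReLU activation.
    return np.maximum(x, 0.0)

def feedforward(x, weights, biases):
    # Plain ReLU feedforward pass through a list of affine layers.
    for W, b in zip(weights, biases):
        x = relu(W @ x + b)
    return x

def bilinear_pool(u, v):
    # Bilinear pooling: flattened outer product of two feature vectors.
    return np.outer(u, v).ravel()

rng = np.random.default_rng(0)
d, h = 3, 4  # input dimension and hidden width (illustrative choices)

# Two small ReLU branches whose outputs are merged by bilinear pooling.
W1, b1 = rng.standard_normal((h, d)), rng.standard_normal(h)
W2, b2 = rng.standard_normal((h, d)), rng.standard_normal(h)

x = rng.standard_normal(d)
u = feedforward(x, [W1], [b1])
v = feedforward(x, [W2], [b2])
z = bilinear_pool(u, v)  # pooled feature vector of dimension h * h
```

Note that because both branch outputs are ReLU activations (hence nonnegative), every entry of the pooled vector is a product of nonnegative terms; in particular the pooling can output exactly zero on regions where either branch vanishes, which is the mechanism that lets such networks represent a discretized support set.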