Paper title
A deep network construction that adapts to intrinsic dimensionality beyond the domain
Paper authors
Paper abstract
We study the approximation of two-layer compositions $f(x) = g(ϕ(x))$ via deep networks with ReLU activation, where $ϕ$ is a geometrically intuitive, dimensionality-reducing feature map. We focus on two intuitive and practically relevant choices for $ϕ$: the projection onto a low-dimensional embedded submanifold, and the distance to a collection of low-dimensional sets. We achieve near-optimal approximation rates that depend only on the complexity of the dimensionality-reducing map $ϕ$, rather than on the ambient dimension. Since $ϕ$ encapsulates all nonlinear features that are material to the function $f$, this suggests that deep nets are faithful to an intrinsic dimension governed by $f$ rather than by the complexity of the domain of $f$. In particular, the prevalent assumption of approximating functions on low-dimensional manifolds can be significantly relaxed by using functions of the type $f(x) = g(ϕ(x))$, with $ϕ$ representing an orthogonal projection onto the same manifold.