Paper Title

Subspace Embeddings Under Nonlinear Transformations

Paper Authors

Aarshvi Gajjar, Cameron Musco

Paper Abstract

We consider low-distortion embeddings for subspaces under \emph{entrywise nonlinear transformations}. In particular, we seek embeddings that preserve the norm of all vectors in a space $S = \{y: y = f(x)\text{ for }x \in Z\}$, where $Z$ is a $k$-dimensional subspace of $\mathbb{R}^n$ and $f(x)$ is a nonlinear activation function applied entrywise to $x$. When $f$ is the identity, and so $S$ is just a $k$-dimensional subspace, it is known that, with high probability, a random embedding into $O(k/ε^2)$ dimensions preserves the norm of all $y \in S$ up to $(1\pm ε)$ relative error. Such embeddings are known as \emph{subspace embeddings}, and have found widespread use in compressed sensing and approximation algorithms. We give the first low-distortion embeddings for a wide class of nonlinear functions $f$. In particular, we give additive $ε$ error embeddings into $O(\frac{k\log (n/ε)}{ε^2})$ dimensions for a class of nonlinearities that includes the popular Sigmoid, SoftPlus, and Gaussian functions. We strengthen this result to give relative error embeddings under some further restrictions, which are satisfied, e.g., by the Tanh, SoftSign, Exponential Linear Unit, and many other `soft' step functions and rectifying units. Understanding embeddings for subspaces under nonlinear transformations is a key step towards extending random sketching and compressed sensing techniques for linear problems to nonlinear ones. We discuss example applications of our results to improved bounds for compressed sensing via generative neural networks.
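
To make the norm-preservation guarantee concrete, below is a minimal NumPy sketch (not the paper's construction; the dimensions $n$, $k$, $m$ and the choice of $f = \tanh$ are assumptions picked for illustration). It draws a scaled Gaussian embedding $\Pi$ and compares $\|\Pi y\|$ to $\|y\|$ for sampled $y = f(x)$ with $x$ in a random $k$-dimensional subspace. Note that this only checks sampled points, whereas the paper's guarantee holds uniformly over all of $S$.

```python
# Illustrative sketch only: empirically checking norm preservation under a
# random Gaussian embedding for y = f(x), with x in a k-dimensional subspace Z.
# All dimensions and the choice f = tanh are assumptions for this demo,
# not parameters taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

n, k, m = 1000, 5, 200   # ambient dim, subspace dim, embedding dim (assumed)

# Orthonormal basis B for a random k-dimensional subspace Z of R^n.
B, _ = np.linalg.qr(rng.standard_normal((n, k)))

# Random Gaussian embedding Pi : R^n -> R^m, scaled so E[||Pi y||^2] = ||y||^2.
Pi = rng.standard_normal((m, n)) / np.sqrt(m)

f = np.tanh  # an entrywise nonlinearity covered by the relative-error result

# Sample y = f(x) for x in Z and record the distortion ||Pi y|| / ||y||.
ratios = []
for _ in range(1000):
    x = B @ rng.standard_normal(k)  # a random vector in the subspace Z
    y = f(x)                        # entrywise nonlinear transformation
    ratios.append(np.linalg.norm(Pi @ y) / np.linalg.norm(y))

print(f"distortion over samples: min {min(ratios):.3f}, max {max(ratios):.3f}")
```

With these (assumed) settings the sampled ratios concentrate near 1, matching the $(1 \pm ε)$ behavior one would expect once the embedding dimension scales like $k\log(n/ε)/ε^2$.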
