Title


Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions -- Part I

Authors

Mazen Ali, Anthony Nouy

Abstract


We study the approximation of functions by tensor networks (TNs). We show that Lebesgue $L^p$-spaces in one dimension can be identified with tensor product spaces of arbitrary order through tensorization. We use this tensor product structure to define subsets of $L^p$ of rank-structured functions of finite representation complexity. These subsets are then used to define different approximation classes of tensor networks, associated with different measures of complexity. These approximation classes are shown to be quasi-normed linear spaces. We study some elementary properties and relationships of said spaces. In part II of this work, we will show that classical smoothness (Besov) spaces are continuously embedded into these approximation classes. We will also show that functions in these approximation classes do not possess any Besov smoothness, unless one restricts the depth of the tensor networks. The results of this work are both an analysis of the approximation spaces of TNs and a study of the expressivity of a particular type of neural networks (NN) -- namely feed-forward sum-product networks with sparse architecture. The input variables of this network result from the tensorization step, interpreted as a particular featuring step which can also be implemented with a neural network with a specific architecture. We point out interesting parallels to recent results on the expressivity of rectified linear unit (ReLU) networks -- currently one of the most popular type of NNs.
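The tensorization step described above can be made concrete with a small numerical sketch. The code below is an illustration, not the paper's exact construction: it samples a univariate function on a dyadic grid of $2^d$ points, reshapes the samples into an order-$d$ tensor indexed by the binary digits of the grid point, and measures representation complexity via the sequential SVD ranks of the unfoldings (the tensor-train ranks). The function names `tensorize` and `tt_ranks` are illustrative choices, not identifiers from the paper.

```python
import numpy as np

def tensorize(f, d):
    """Sample f on the dyadic grid {k / 2**d : k = 0, ..., 2**d - 1}
    and reshape the samples into an order-d tensor of mode size 2,
    indexed by the binary digits (i_1, ..., i_d) of the grid point."""
    x = np.arange(2**d) / 2**d
    return f(x).reshape((2,) * d)

def tt_ranks(tensor, tol=1e-10):
    """Ranks of the sequential unfoldings (tensor-train ranks), computed
    by repeated SVD with a relative truncation tolerance. These ranks are
    one natural measure of representation complexity for tensor networks."""
    d = tensor.ndim
    ranks = []
    mat = tensor.reshape(2, -1)
    r = 1
    for _ in range(d - 1):
        mat = mat.reshape(r * 2, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = int(np.sum(s > tol * s[0]))
        ranks.append(r)
        mat = np.diag(s[:r]) @ vt[:r]
    return ranks

# exp is multiplicatively separable across binary digits,
# exp(sum_k i_k 2**-k) = prod_k exp(i_k 2**-k), so every rank is 1.
T = tensorize(np.exp, 8)
print(tt_ranks(T))  # -> [1, 1, 1, 1, 1, 1, 1]
```

Functions like the exponential are exactly rank-structured under this identification, while generic functions are only approximately low-rank; the approximation classes studied in the paper quantify how fast such ranks can grow as the target accuracy decreases.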
