Paper Title

Interaction Decompositions for Tensor Network Regression

Paper Authors

Ian Convy, K. Birgitta Whaley

Abstract

It is well known that tensor network regression models operate on an exponentially large feature space, but questions remain as to how effectively they are able to utilize this space. Using a polynomial featurization, we propose the interaction decomposition as a tool that can assess the relative importance of different regressors as a function of their polynomial degree. We apply this decomposition to tensor ring and tree tensor network models trained on the MNIST and Fashion MNIST datasets, and find that up to 75% of interaction degrees are contributing meaningfully to these models. We also introduce a new type of tensor network model that is explicitly trained on only a small subset of interaction degrees, and find that these models are able to match or even outperform the full models using only a fraction of the exponential feature space. This suggests that standard tensor network models utilize their polynomial regressors in an inefficient manner, with the lower degree terms being vastly under-utilized.
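To make the abstract's setup concrete, the sketch below illustrates how a polynomial featurization spans an exponentially large space and how a model's output can be split by interaction degree. This is a minimal illustration, not the paper's actual models: the per-input map (1, x_i), the 3-input example, and the weight vector `w` are all assumptions chosen for clarity, and a full tensor network would represent `w` in factored form rather than as a dense vector.

```python
import numpy as np

def tensor_features(x):
    """Full feature vector: the Kronecker product of per-input maps (1, x_i).
    For N inputs this spans all 2**N monomials x_{i1} * ... * x_{ik}."""
    phi = np.array([1.0])
    for xi in x:
        phi = np.kron(phi, np.array([1.0, xi]))
    return phi

def interaction_degrees(n):
    """Degree of each monomial = number of x_i factors it contains,
    i.e. the popcount of the monomial's binary index."""
    return np.array([bin(k).count("1") for k in range(2 ** n)])

x = np.array([0.5, -1.0, 2.0])        # 3 inputs -> 2**3 = 8 monomials
phi = tensor_features(x)              # [1, x3, x2, x2*x3, x1, x1*x3, ...]
d = interaction_degrees(3)            # [0, 1, 1, 2, 1, 2, 2, 3]

# Interaction decomposition of a (hypothetical) linear model on phi:
# the output is the sum of the contributions from each degree.
rng = np.random.default_rng(0)
w = rng.normal(size=8)                # stand-in for a flattened weight tensor
total = w @ phi
by_degree = {k: w[d == k] @ phi[d == k] for k in range(4)}
assert np.isclose(total, sum(by_degree.values()))
```

The decomposition in the paper asks how much each `by_degree[k]`-style term matters to the trained model; the restricted models it introduces are trained with only a subset of these degree blocks active.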
