Paper Title


T-Basis: a Compact Representation for Neural Networks

Authors

Obukhov, Anton, Rakhuba, Maxim, Georgoulis, Stamatios, Kanakis, Menelaos, Dai, Dengxin, Van Gool, Luc

Abstract

We introduce T-Basis, a novel concept for a compact representation of a set of tensors, each of an arbitrary shape, which is often seen in Neural Networks. Each of the tensors in the set is modeled using Tensor Rings, though the concept applies to other Tensor Networks. Owing its name to the T-shape of nodes in diagram notation of Tensor Rings, T-Basis is simply a list of equally shaped three-dimensional tensors, used to represent Tensor Ring nodes. Such representation allows us to parameterize the tensor set with a small number of parameters (coefficients of the T-Basis tensors), scaling logarithmically with each tensor's size in the set and linearly with the dimensionality of T-Basis. We evaluate the proposed approach on the task of neural network compression and demonstrate that it reaches high compression rates at acceptable performance drops. Finally, we analyze memory and operation requirements of the compressed networks and conclude that T-Basis networks are equally well suited for training and inference in resource-constrained environments and usage on the edge devices.
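The parameterization described in the abstract can be illustrated with a small sketch: a shared list of equally shaped three-dimensional basis tensors, per-tensor coefficient vectors that mix them into Tensor Ring cores, and a ring contraction that recovers the full tensor. This is a minimal toy illustration, not the paper's implementation; the sizes `B`, `r`, `m`, `d` and all variable names are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): B basis tensors,
# Tensor Ring rank r, mode size m, d ring nodes.
B, r, m, d = 8, 3, 4, 3

# T-Basis: a list of B equally shaped three-dimensional tensors (r, m, r).
basis = rng.standard_normal((B, r, m, r))

# A tensor in the set is parameterized only by d coefficient vectors,
# one per Tensor Ring node: d * B numbers instead of m ** d dense entries.
coeffs = rng.standard_normal((d, B))

# Each ring core is a linear combination of the shared basis tensors.
cores = np.einsum('kb,bimj->kimj', coeffs, basis)  # shape (d, r, m, r)

def ring_to_tensor(cores):
    # Contract the chain of cores, then close the ring with a trace
    # over the two remaining rank indices.
    t = cores[0]                                      # (r, m, r)
    for c in cores[1:]:
        t = np.tensordot(t, c, axes=([-1], [0]))      # (r, m, ..., m, r)
    return np.trace(t, axis1=0, axis2=-1)             # (m, ..., m)

full = ring_to_tensor(cores)
print(full.shape)                                     # (4, 4, 4)
print(d * B, 'parameters per tensor vs', m ** d, 'dense entries')
```

This makes the abstract's scaling claim concrete: the per-tensor cost (`d * B` coefficients) grows logarithmically with the full tensor's size `m ** d` (since `d` is the number of modes) and linearly with the T-Basis dimensionality `B`.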
