Paper Title

What if Neural Networks had SVDs?

Authors

Alexander Mathiasen, Frederik Hvilshøj, Jakob Rødsgaard Jørgensen, Anshul Nasery, Davide Mottin

Abstract

Various Neural Networks employ time-consuming matrix operations like matrix inversion. Many such matrix operations are faster to compute given the Singular Value Decomposition (SVD). Previous work allows using the SVD in Neural Networks without computing it. In theory, these techniques can speed up matrix operations; in practice, however, they are not fast enough. We present an algorithm that is fast enough to speed up several matrix operations. The algorithm increases the degree of parallelism of an underlying matrix multiplication $H \cdot X$, where $H$ is an orthogonal matrix represented by a product of Householder matrices. Code is available at www.github.com/AlexanderMath/fasth.
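The computation the abstract refers to is the multiplication $H \cdot X$ with $H$ given implicitly as a product of $d$ Householder matrices $H_i = I - 2 v_i v_i^T / (v_i^T v_i)$. For intuition, here is a minimal NumPy sketch of the *sequential* baseline, which applies the reflections one after another; this dependent chain of rank-1 updates is exactly the low-parallelism pattern the paper's algorithm is designed to speed up. The function name and test values are illustrative, not taken from the paper's repository.

```python
import numpy as np

def householder_product_matmul(V, X):
    """Compute H @ X where H = H_1 H_2 ... H_d and each
    H_i = I - 2 v_i v_i^T / (v_i^T v_i) is a Householder matrix.

    V : (d, n) array whose rows are the Householder vectors v_i.
    X : (n, m) input matrix.
    """
    Y = X.copy()
    # Apply H_d first so that Y = H_1 (H_2 (... (H_d X))).
    for v in V[::-1]:
        v = v[:, None]  # column vector, shape (n, 1)
        Y = Y - 2.0 * v @ (v.T @ Y) / (v.T @ v)
    return Y

# Usage: H built from d random Householder vectors is orthogonal,
# so H @ X preserves the column norms of X.
rng = np.random.default_rng(0)
n, d, m = 8, 8, 4
V = rng.standard_normal((d, n))
X = rng.standard_normal((n, m))
Y = householder_product_matmul(V, X)
assert np.allclose(np.linalg.norm(Y, axis=0), np.linalg.norm(X, axis=0))
```

Each loop iteration costs $O(nm)$, but the $d$ iterations cannot be parallelized against each other, which is why this decomposition is slow on GPUs despite its modest operation count.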
