Paper Title

Deep Polynomial Neural Networks

Authors

Grigorios Chrysos, Stylianos Moschoglou, Giorgos Bouritsas, Jiankang Deng, Yannis Panagakis, Stefanos Zafeiriou

Abstract

Deep Convolutional Neural Networks (DCNNs) are currently the method of choice for both generative and discriminative learning in computer vision and machine learning. The success of DCNNs can be attributed to the careful selection of their building blocks (e.g., residual blocks, rectifiers, sophisticated normalization schemes, to mention but a few). In this paper, we propose $Π$-Nets, a new class of function approximators based on polynomial expansions. $Π$-Nets are polynomial neural networks, i.e., the output is a high-order polynomial of the input. The unknown parameters, which are naturally represented by high-order tensors, are estimated through a collective tensor factorization with factor sharing. We introduce three tensor decompositions that significantly reduce the number of parameters and show how they can be efficiently implemented by hierarchical neural networks. We empirically demonstrate that $Π$-Nets are very expressive and produce good results even without non-linear activation functions across a large battery of tasks and signals, i.e., images, graphs, and audio. When used in conjunction with activation functions, $Π$-Nets produce state-of-the-art results in three challenging tasks, i.e., image generation, face verification, and 3D mesh representation learning. The source code is available at \url{https://github.com/grigorisg9gr/polynomial_nets}.
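
To make the core idea concrete, below is a minimal PyTorch sketch of a recursive polynomial block in the spirit of the abstract: each step multiplies a linear transform of the input element-wise (a Hadamard product) with the running representation and adds a skip connection, so the output is a high-order polynomial of the input with no activation functions. The class name, layer layout, and dimensions are illustrative assumptions for exposition, not the authors' released implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn


class PolynomialBlock(nn.Module):
    """Sketch of a degree-N polynomial expansion of the input.

    One linear factor per degree (a factor-sharing-style parameterization);
    the Hadamard product raises the polynomial degree by one at each step,
    and the additive skip connection retains all lower-order terms.
    Hypothetical example, not the paper's exact decomposition.
    """

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int, degree: int = 3):
        super().__init__()
        self.factors = nn.ModuleList(
            nn.Linear(in_dim, hidden_dim, bias=False) for _ in range(degree)
        )
        self.head = nn.Linear(hidden_dim, out_dim)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # First-order (linear) term in the input z.
        x = self.factors[0](z)
        for factor in self.factors[1:]:
            # Element-wise product with a fresh linear map of z
            # increases the degree; "+ x" keeps lower-order terms.
            x = factor(z) * x + x
        return self.head(x)


# Usage: a degree-3 polynomial of a 64-dim input, with no nonlinearities.
model = PolynomialBlock(in_dim=64, hidden_dim=128, out_dim=10, degree=3)
y = model(torch.randn(8, 64))
```

Note that the only sources of nonlinearity here are the Hadamard products themselves, which is what allows such networks to produce good results even without activation functions, as the abstract claims.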
