Paper title
Rational neural networks
Paper authors
Paper abstract
We consider neural networks with rational activation functions. The choice of the nonlinear activation function in deep learning architectures is crucial and heavily impacts the performance of a neural network. We establish optimal bounds in terms of network complexity and prove that rational neural networks approximate smooth functions more efficiently than ReLU networks with exponentially smaller depth. The flexibility and smoothness of rational activation functions make them an attractive alternative to ReLU, as we demonstrate with numerical experiments.
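To make the idea concrete, here is a minimal PyTorch sketch of a trainable rational activation of type (3, 2), i.e. a degree-3 numerator over a degree-2 denominator, dropped into a small network in place of ReLU. The class name `RationalActivation`, the identity-style initialization, and the pole-avoiding denominator form 1 + |b₁x + b₂x²| are illustrative assumptions rather than the paper's exact construction; a common choice in the literature is to initialize the coefficients so the rational function approximates ReLU.

```python
import torch
import torch.nn as nn

class RationalActivation(nn.Module):
    """Trainable rational activation R(x) = P(x) / Q(x) with
    deg P = 3 and deg Q = 2 (a type (3, 2) rational function).
    Initialization below is a placeholder (R(x) = x at init),
    not the paper's ReLU-approximating initialization."""

    def __init__(self):
        super().__init__()
        # Numerator coefficients a_0..a_3 and denominator b_1, b_2,
        # learned jointly with the network weights.
        self.a = nn.Parameter(torch.tensor([0.0, 1.0, 0.0, 0.0]))
        self.b = nn.Parameter(torch.tensor([0.0, 0.0]))

    def forward(self, x):
        p = self.a[0] + self.a[1] * x + self.a[2] * x**2 + self.a[3] * x**3
        # Q(x) = 1 + |b_1 x + b_2 x^2| stays >= 1, so R has no poles;
        # this "safe" denominator is one common convention (assumed here).
        q = 1.0 + torch.abs(self.b[0] * x + self.b[1] * x**2)
        return p / q

# A small fully connected network using the rational activation in
# place of ReLU; the layer sizes are arbitrary, for illustration only.
model = nn.Sequential(
    nn.Linear(2, 32),
    RationalActivation(),
    nn.Linear(32, 32),
    RationalActivation(),
    nn.Linear(32, 1),
)
```

Because the activation's coefficients are ordinary `nn.Parameter`s, any standard optimizer trains them together with the linear-layer weights, which is what gives the activation the flexibility the abstract refers to.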