Paper Title

Neural Generalized Ordinary Differential Equations with Layer-varying Parameters

Paper Authors

Duo Yu, Hongyu Miao, Hulin Wu

Paper Abstract

Deep residual networks (ResNets) have shown state-of-the-art performance in various real-world applications. Recently, ResNets were reparameterized and interpreted as solutions of a continuous ordinary differential equation, i.e., the Neural-ODE model. In this study, we propose a neural generalized ordinary differential equation (Neural-GODE) model with layer-varying parameters to further extend the Neural-ODE to approximate discrete ResNets. Specifically, we use nonparametric B-spline functions to parameterize the Neural-GODE so that the trade-off between model complexity and computational efficiency can be easily balanced. It is demonstrated that ResNets and Neural-ODE models are special cases of the proposed Neural-GODE model. Based on two benchmark datasets, MNIST and CIFAR-10, we show that the layer-varying Neural-GODE is more flexible and general than the standard Neural-ODE. Furthermore, the Neural-GODE enjoys computational and memory benefits while performing comparably to ResNets in prediction accuracy.
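
The core technique in the abstract, making the ODE's parameters functions of the depth variable t through B-spline basis functions, can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the paper's implementation: the names (`bspline_basis`, `SplineODEBlock`, `n_basis`) are hypothetical, a fully connected layer stands in for the convolutional blocks used on MNIST/CIFAR-10, and a fixed-step RK4 solver stands in for whatever solver the authors use.

```python
# Sketch: a Neural-GODE-style block where the weights W(t) are a linear
# combination of B-spline basis functions of depth t, and only the spline
# coefficients are trainable. Hypothetical names; illustration only.
import torch
import torch.nn as nn


def bspline_basis(t, knots, degree):
    """Evaluate all B-spline basis functions at scalar t (Cox-de Boor)."""
    t = min(max(t, knots[0]), knots[-1] - 1e-9)  # stay inside the half-open spans
    # Degree-0 bases are indicators of the knot spans.
    basis = [1.0 if knots[i] <= t < knots[i + 1] else 0.0
             for i in range(len(knots) - 1)]
    for d in range(1, degree + 1):
        basis = [
            ((t - knots[i]) / (knots[i + d] - knots[i]) * basis[i]
             if knots[i + d] > knots[i] else 0.0)
            + ((knots[i + d + 1] - t) / (knots[i + d + 1] - knots[i + 1]) * basis[i + 1]
               if knots[i + d + 1] > knots[i + 1] else 0.0)
            for i in range(len(basis) - 1)
        ]
    return torch.tensor(basis)  # length = len(knots) - degree - 1


class SplineODEBlock(nn.Module):
    """ODE block dh/dt = tanh(W(t) h + b(t)) with B-spline layer-varying weights."""

    def __init__(self, dim, n_basis=5, degree=2, n_steps=10):
        super().__init__()
        self.degree, self.n_steps = degree, n_steps
        # Clamped knot vector on [0, 1]; requires n_basis >= degree + 1.
        interior = torch.linspace(0, 1, n_basis - degree + 1)[1:-1].tolist()
        self.knots = [0.0] * (degree + 1) + interior + [1.0] * (degree + 1)
        # Trainable spline coefficients: one weight matrix and bias per basis.
        self.coef_W = nn.Parameter(0.1 * torch.randn(n_basis, dim, dim))
        self.coef_b = nn.Parameter(torch.zeros(n_basis, dim))

    def rhs(self, h, t):
        # W(t) = sum_k B_k(t) C_k: the layer-varying parameterization.
        B = bspline_basis(t, self.knots, self.degree).to(h)   # (n_basis,)
        W = torch.einsum('k,kij->ij', B, self.coef_W)
        b = torch.einsum('k,ki->i', B, self.coef_b)
        return torch.tanh(h @ W.T + b)

    def forward(self, h):
        # Fixed-step RK4 integration of dh/dt = rhs(h, t) over t in [0, 1].
        dt, t = 1.0 / self.n_steps, 0.0
        for _ in range(self.n_steps):
            k1 = self.rhs(h, t)
            k2 = self.rhs(h + 0.5 * dt * k1, t + 0.5 * dt)
            k3 = self.rhs(h + 0.5 * dt * k2, t + 0.5 * dt)
            k4 = self.rhs(h + dt * k3, t + dt)
            h = h + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
            t += dt
        return h


# Usage: h_out = SplineODEBlock(dim=64)(torch.randn(32, 64))
```

In this sketch, `n_basis=1` with `degree=0` makes W(t) constant in t, which reduces the block to a standard Neural-ODE with time-invariant weights; conversely, a degree-0 spline with one basis per solver step gives piecewise-constant weights that mimic distinct ResNet layers. That is the sense in which the abstract describes both models as special cases, and in which the number of basis functions controls the trade-off between model complexity and computation.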
