Paper Title


On Numerical Integration in Neural Ordinary Differential Equations

Authors

Zhu, Aiqing, Jin, Pengzhan, Zhu, Beibei, Tang, Yifa

Abstract


The combination of ordinary differential equations and neural networks, i.e., neural ordinary differential equations (Neural ODE), has been widely studied from various angles. However, deciphering the role of numerical integration in Neural ODE remains an open challenge, as many studies have demonstrated that numerical integration significantly affects model performance. In this paper, we propose the inverse modified differential equations (IMDE) to clarify the influence of numerical integration on the training of Neural ODE models. The IMDE is determined by the learning task and the employed ODE solver. We show that training a Neural ODE model actually returns a close approximation of the IMDE, rather than the true ODE. With the help of the IMDE, we deduce that (i) the discrepancy between the learned model and the true ODE is bounded by the sum of the discretization error and the learning loss; and (ii) Neural ODEs using non-symplectic numerical integration theoretically fail to learn conservation laws. Several experiments are performed to numerically verify our theoretical analysis.
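The central claim, that training through a numerical integrator recovers the IMDE rather than the true vector field, can be illustrated on a toy problem. The sketch below (not from the paper; the scalar ODE and step size are illustrative assumptions) fits a one-parameter field f(x) = theta*x through a single forward-Euler step to the exact flow of dx/dt = a*x. The one-step loss is minimized when 1 + theta*h = exp(a*h), so the learned theta equals a + a^2*h/2 + O(h^2), matching the first-order IMDE of forward Euler instead of the true coefficient a.

```python
import math

# Illustrative sketch: for dx/dt = a*x, the exact one-step flow is
# x -> x * exp(a*h). Forward Euler with a learned field theta*x maps
# x -> x * (1 + theta*h). Matching the two gives the exact minimizer:
#   theta = (exp(a*h) - 1) / h = a + a**2 * h / 2 + O(h**2),
# i.e. the learned field approximates the IMDE of Euler, not the true field.

a, h = 1.0, 0.1
theta = (math.exp(a * h) - 1.0) / h           # exact minimizer of the one-step loss
imde_first_order = a + a**2 * h / 2.0         # first-order IMDE prediction

print(theta)                # ≈ 1.0517, a clear O(h) offset from the true a = 1.0
print(imde_first_order)     # ≈ 1.05, within O(h**2) of the learned theta
```

Shrinking h drives theta back toward a, which is consistent with the paper's bound: the gap between the learned model and the true ODE is controlled by the discretization error of the solver.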
