Paper Title

Conservative SPDEs as fluctuating mean field limits of stochastic gradient descent

Paper Authors

Benjamin Gess, Rishabh S. Gvalani, Vitalii Konarovskyi

Paper Abstract

The convergence of stochastic interacting particle systems in the mean-field limit to solutions of conservative stochastic partial differential equations is established, with optimal rate of convergence. As a second main result, a quantitative central limit theorem for such SPDEs is derived, again, with optimal rate of convergence. The results apply, in particular, to the convergence in the mean-field scaling of stochastic gradient descent dynamics in overparametrized, shallow neural networks to solutions of SPDEs. It is shown that the inclusion of fluctuations in the limiting SPDE improves the rate of convergence, and retains information about the fluctuations of stochastic gradient descent in the continuum limit.
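
For orientation, a schematic sketch of the setting; the abstract does not state the equations, so the drift functional $E$, noise coefficient $\sigma$, and noise $W$ below are illustrative assumptions rather than the paper's exact objects. SGD on a shallow network with $N$ neurons evolves parameters $\theta^1_t,\dots,\theta^N_t$, and the object of study is their empirical measure, which in the fluctuating mean-field limit solves a conservative SPDE of Dean–Kawasaki type, schematically:

\[
\nu^N_t \;=\; \frac{1}{N}\sum_{i=1}^N \delta_{\theta^i_t},
\qquad
\mathrm{d}\nu_t \;=\; \nabla\cdot\Big(\nu_t\,\nabla\frac{\delta E}{\delta\nu}(\nu_t)\Big)\,\mathrm{d}t
\;+\; \frac{1}{\sqrt{N}}\,\nabla\cdot\big(\sqrt{\nu_t}\,\sigma(\nu_t)\,\mathrm{d}W_t\big).
\]

The $N^{-1/2}$ conservative noise term is what distinguishes this limit from the deterministic mean-field PDE: it is the part that retains the fluctuations of SGD in the continuum limit and, per the abstract, improves the rate of convergence.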
