Paper title
Translating Diffusion, Wavelets, and Regularisation into Residual Networks
Paper authors
Paper abstract
Convolutional neural networks (CNNs) often perform well, but their stability is poorly understood. To address this problem, we consider the simple prototypical problem of signal denoising, where classical approaches such as nonlinear diffusion, wavelet-based methods, and regularisation offer provable stability guarantees. To transfer such guarantees to CNNs, we interpret numerical approximations of these classical methods as a specific residual network (ResNet) architecture. This leads to a dictionary that allows one to translate diffusivities, shrinkage functions, and regularisers into activation functions, and enables direct communication between the four research communities. On the CNN side, it not only inspires new families of nonmonotone activation functions, but also introduces intrinsically stable architectures for an arbitrary number of layers.
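To make the diffusion-to-ResNet correspondence described in the abstract concrete, the following sketch implements one explicit step of 1-D nonlinear diffusion, u_{k+1} = u_k − τ Dᵀ Φ(D u_k), as a residual block. Here D is a forward-difference "convolution", and the flux function Φ(s) = g(s)·s plays the role of the activation. The Perona–Malik diffusivity g(s) = 1/(1 + (s/λ)²) and the specific parameter values are illustrative choices, not taken from the paper itself:

```python
import numpy as np

def flux(s, lam=1.0):
    """Perona-Malik flux Phi(s) = g(s)*s with g(s) = 1/(1+(s/lam)^2).

    This nonmonotone function acts as the activation of the residual block.
    """
    return s / (1.0 + (s / lam) ** 2)

def diffusion_block(u, tau=0.2, lam=1.0):
    """One explicit diffusion step, read as a residual block:

        u_{k+1} = u_k - tau * D^T Phi(D u_k)

    where D is the forward-difference operator (a fixed 1-D convolution)
    and D^T its transpose. tau must be small enough for stability.
    """
    Du = np.diff(u)                  # forward differences, length n-1
    phi = flux(Du, lam)              # "activation" applied channel-wise
    # D^T phi: negative divergence with the matching boundary conditions
    DtPhi = np.concatenate(([-phi[0]], phi[:-1] - phi[1:], [phi[-1]]))
    return u - tau * DtPhi           # residual (skip) connection

# Usage: one denoising step on a small oscillatory signal.
u = np.array([0.0, 1.0, 0.0, 1.0, 0.0])
v = diffusion_block(u)
```

Stacking this block K times corresponds to K explicit diffusion steps; because each step conserves the mean and satisfies a maximum–minimum principle for small τ, the resulting "network" is stable for an arbitrary number of layers, which is the structural point the abstract makes.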