Paper Title

A Convex Parameterization of Robust Recurrent Neural Networks

Authors

Max Revay, Ruigang Wang, Ian R. Manchester

Abstract


Recurrent neural networks (RNNs) are a class of nonlinear dynamical systems often used to model sequence-to-sequence maps. RNNs have excellent expressive power but lack the stability or robustness guarantees that are necessary for many applications. In this paper, we formulate convex sets of RNNs with stability and robustness guarantees. The guarantees are derived using incremental quadratic constraints and can ensure global exponential stability of all solutions, and bounds on incremental $\ell_2$ gain (the Lipschitz constant of the learned sequence-to-sequence mapping). Using an implicit model structure, we construct a parametrization of RNNs that is jointly convex in the model parameters and stability certificate. We prove that this model structure includes all previously-proposed convex sets of stable RNNs as special cases, and also includes all stable linear dynamical systems. We illustrate the utility of the proposed model class in the context of non-linear system identification.
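The incremental $\ell_2$ gain the abstract refers to is the Lipschitz constant of the sequence-to-sequence map: the supremum over input pairs of the ratio $\|y - y'\| / \|u - u'\|$ of output distance to input distance. The sketch below (not the paper's method; a generic illustration with an arbitrary contractive `tanh` RNN and randomly sampled input pairs) empirically lower-bounds this gain by evaluating that ratio on perturbed input sequences:

```python
import numpy as np

rng = np.random.default_rng(0)

# A small RNN x_{t+1} = tanh(A x_t + B u_t), y_t = C x_t.
# A is rescaled to spectral norm 0.5, so the state map is contractive
# (tanh is 1-Lipschitz) and all solutions converge toward each other.
n, m, T = 4, 2, 50
A = rng.standard_normal((n, n))
A = 0.5 * A / np.linalg.norm(A, 2)
B = rng.standard_normal((n, m))
C = rng.standard_normal((1, n))

def rollout(U):
    """Run the RNN over an input sequence U of shape (T, m)."""
    x = np.zeros(n)
    Y = []
    for u in U:
        x = np.tanh(A @ x + B @ u)
        Y.append(C @ x)
    return np.array(Y)

# Empirical lower bound on the incremental l2 gain:
# max over sampled input pairs of ||y - y'|| / ||u - u'||.
best = 0.0
for _ in range(200):
    U1 = rng.standard_normal((T, m))
    U2 = U1 + 0.01 * rng.standard_normal((T, m))
    ratio = np.linalg.norm(rollout(U1) - rollout(U2)) / np.linalg.norm(U1 - U2)
    best = max(best, ratio)

print(f"empirical incremental l2 gain >= {best:.3f}")
```

Sampling only ever gives a lower bound; the paper's contribution is the converse direction, a convex parameterization in which a prescribed upper bound on this gain is guaranteed by construction for every model in the set.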
