Contracting Implicit Recurrent Neural Networks: Stable Models with Improved Trainability
Revay, Max; Manchester, Ian R. (Australia)

Abstract: Stability of recurrent models is closely linked with trainability, generalizability and, in some applications, safety. Methods that train stable recurrent neural networks, however, do so at a significant cost to expressibility. We propose an implicit model structure that allows for a convex parametrization of stable models using contraction analysis of nonlinear systems. Using these stability conditions, we propose a new approach to model initialization, and then provide a number of empirical results comparing the performance of our proposed model set to previous stable RNNs and vanilla RNNs. By carefully controlling stability in the model, we observe a significant increase in the speed of training and in model performance.

Keywords: System Identification, Contraction, Stability, Recurrent Neural Network, Vanishing Gradient, Exploding Gradient, Nonlinear Systems, Echo State Network
Dec-22-2019
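To illustrate the kind of stability condition the abstract refers to, here is a minimal sketch (not the paper's implicit parametrization) of contraction for a vanilla RNN: since `tanh` is 1-Lipschitz, the update `x_{t+1} = tanh(W x_t + U u_t + b)` contracts in the Euclidean norm whenever the spectral norm of `W` is below 1, so two trajectories driven by the same input converge. All matrix sizes and the 0.9 norm bound are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: a vanilla RNN x_{t+1} = tanh(W x_t + U u_t + b)
# contracts in the Euclidean norm when ||W||_2 < 1, because tanh is
# 1-Lipschitz. Two trajectories under the same input then converge.

rng = np.random.default_rng(0)
n, m = 8, 3  # state and input dimensions (arbitrary choices)

W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)   # rescale so the spectral norm is 0.9 < 1
U = rng.standard_normal((n, m))
b = rng.standard_normal(n)

def step(x, u):
    return np.tanh(W @ x + U @ u + b)

# Same input sequence, different initial states.
x, y = rng.standard_normal(n), rng.standard_normal(n)
d0 = np.linalg.norm(x - y)
for _ in range(50):
    u = rng.standard_normal(m)
    x, y = step(x, u), step(y, u)
d50 = np.linalg.norm(x - y)

# Each step shrinks the distance by at least the factor ||W||_2 = 0.9.
assert d50 <= 0.9**50 * d0
```

The paper's contribution goes beyond this simple spectral-norm condition, which is known to restrict expressibility; the sketch only shows the basic contraction argument that the convex parametrization generalizes.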