ReservoirComputing.jl: An Efficient and Modular Library for Reservoir Computing Models
Francesco Martinuzzi, Chris Rackauckas, Anas Abdelrehim, Miguel D. Mahecha, Karin Mora
arXiv.org Artificial Intelligence
Time series modeling is a common task throughout many areas of machine learning. However, many standard recurrent models are known to be susceptible to problems such as the vanishing gradient [Pascanu et al., 2013] or the extreme sensitivity of chaotic systems to their parameterization [Wiggins et al., 2003]. To counter these issues, reservoir computing (RC) techniques were introduced as recurrent models that can be trained without gradient-based approaches [Lukoševičius and Jaeger, 2009]. Independently proposed as echo state networks (ESNs) [Jaeger, 2001] and liquid state machines (LSMs) [Maass et al., 2002], these architectures expand the input data through a fixed random internal layer, known as the reservoir, and then map the reservoir states to a desired output.
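To make the ESN idea concrete, the following is a minimal NumPy sketch: a fixed random reservoir expands the input, and only a linear readout is fitted, here by ridge regression. All function names and parameters are illustrative assumptions for this sketch; they are not the ReservoirComputing.jl API (the library is written in Julia).

```python
import numpy as np

def train_esn(u, y, n_res=100, rho=0.9, ridge=1e-8, washout=50, seed=0):
    """Minimal echo state network sketch (illustrative, not the library API):
    fixed random reservoir + ridge-regression linear readout."""
    rng = np.random.default_rng(seed)
    n_in = u.shape[1]
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # fixed random input weights
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # fixed random reservoir
    W *= rho / max(abs(np.linalg.eigvals(W)))      # rescale spectral radius
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W @ x + W_in @ u[t])           # reservoir state update
        states.append(x.copy())
    X = np.array(states)[washout:]                 # discard initial transient
    Y = y[washout:]
    # only the readout is trained: W_out = Y^T X (X^T X + lambda I)^{-1}
    W_out = Y.T @ X @ np.linalg.inv(X.T @ X + ridge * np.eye(n_res))
    return W_in, W, W_out

# toy task: one-step-ahead prediction of a sine wave
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t).reshape(-1, 1)
u, y = series[:-1], series[1:]
W_in, W, W_out = train_esn(u, y)

# sanity check: rerun the reservoir and measure readout error after washout
x = np.zeros(W.shape[0])
preds = []
for step in range(len(u)):
    x = np.tanh(W @ x + W_in @ u[step])
    preds.append(W_out @ x)
mse = np.mean((np.array(preds)[50:] - y[50:]) ** 2)
```

Because the reservoir weights stay fixed, training reduces to a single linear solve, which is what lets RC models avoid gradient-based optimization entirely.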
Apr-8-2022