Infinite-dimensional reservoir computing

Gonon, Lukas, Grigoryeva, Lyudmila, Ortega, Juan-Pablo

arXiv.org Artificial Intelligence 

Reservoir computing (RC) [Jaeg 10, Maas 02, Jaeg 04, Maas 11] and in particular echo state networks (ESNs) [Matt 92, Matt 93, Jaeg 04] have gained much popularity in recent years due to their excellent performance in the forecasting of dynamical systems [Grig 14, Jaeg 04, Path 17, Path 18, Lu 18, Wikn 21, Arco 22] and due to the ease of their implementation. RC aims at approximating nonlinear input/output systems using randomly generated state-space systems (called reservoirs) in which only a linear readout is estimated. It has been theoretically established that this is indeed possible in a variety of deterministic and stochastic contexts [Grig 18b, Grig 18a, Gono 20c, Gono 21b, Gono 23] in which RC systems have been shown to have universal approximation properties. In this paper, we derive error bounds for a variant of these architectures, taking as approximants randomly generated linear state-space systems whose readouts are randomly generated neural networks in which only the output layer is trained. Thus, from a learning perspective, we combine linear echo state networks with what is referred to in the literature as random features [Rahi 07]/extreme learning machines (ELMs) [Huan 06]. We develop explicit and readily computable approximation and estimation bounds for a newly introduced concept class whose elements we refer to as recurrent (generalized) Barron functionals, since they can be viewed as a dynamical analog of the (generalized) Barron functions introduced in [Barr 92, Barr 93] and extended later in [E 20b, E 20a, E 19].
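The architecture described above can be illustrated with a minimal NumPy sketch: a randomly generated linear reservoir driven by the input, followed by a frozen random-feature (ELM-style) hidden layer, with only the final linear output weights trained by ridge regression. All dimensions, the spectral-radius rescaling, the toy target system, and the regularization constant below are illustrative assumptions, not choices made in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: input, reservoir state, random features, sample size
d_in, d_state, d_feat, T = 1, 100, 200, 1000

# Toy target: a simple nonlinear moving-average input/output system to learn
z = rng.uniform(-1.0, 1.0, size=(T, d_in))
y = np.zeros(T)
y[1:] = np.tanh(z[1:, 0] + 0.5 * z[:-1, 0])

# 1) Randomly generated *linear* reservoir: x_t = A x_{t-1} + C z_t.
#    Rescaling A to spectral radius < 1 makes the map contractive,
#    a standard sufficient condition for the echo state property.
A = rng.normal(size=(d_state, d_state))
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))
C = rng.normal(size=(d_state, d_in))

X = np.zeros((T, d_state))
for t in range(1, T):
    X[t] = A @ X[t - 1] + C @ z[t]

# 2) Randomly generated neural-network readout (random features / ELM):
#    hidden weights B, b are drawn once and kept frozen.
B = rng.normal(size=(d_feat, d_state)) / np.sqrt(d_state)
b = rng.normal(size=d_feat)
Phi = np.tanh(X @ B.T + b)

# 3) Only the output layer w is trained, here by ridge regression.
lam = 1e-6
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(d_feat), Phi.T @ y)

y_hat = Phi @ w
mse = np.mean((y[T // 2:] - y_hat[T // 2:]) ** 2)  # skip the initial transient
print(f"train MSE on second half: {mse:.2e}")
```

Note the division of labor: the only optimization performed is the linear solve in step 3; steps 1 and 2 involve no training at all, which is what makes the combined system cheap to fit while remaining expressive.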
