Echo state networks are universal

Lyudmila Grigoryeva and Juan-Pablo Ortega

arXiv.org Artificial Intelligence 

Many recently introduced machine learning techniques for dynamical problems have much in common with the system identification procedures developed over recent decades for applications in signal processing, circuit theory and, more generally, systems theory. In these problems, knowledge of the system is available only in the form of input-output observations, and the task consists in finding or learning a model that approximates the system, mainly for forecasting or classification purposes. An important goal in that context is to find a family of transformations that is both computationally feasible and versatile enough to reproduce a rich variety of patterns just by modifying a limited number of procedural parameters. This feature is usually referred to as universality. A first solution to this problem was pioneered in the works of Fréchet [Frec 10] and Volterra [Volt 30] a century ago, when they proved that finite Volterra series can be used to uniformly approximate continuous functionals defined on compact sets of continuous functions. These results were further extended in the 1950s by the MIT school led by N. Wiener [Wien 58, Bril 58, Geor 59], but always under compactness assumptions on the input space and on the time interval on which inputs are defined. A major breakthrough was the generalization to infinite time intervals carried out by Boyd and Chua in [Boyd 85] using the so-called fading memory property. In this paper we address that problem for transformations, or filters, of discrete-time signals of infinite length that have the fading memory property. The approximating set that we use is generated by nonlinear state-space transformations, referred to as reservoir computers (RC) [Jaeg 10, Jaeg 04, Maas 02, Maas 11, Croo 07, Vers 07, Luko 09] or reservoir systems.
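The nonlinear state-space transformations mentioned in the abstract can be illustrated with a minimal echo state network: a fixed random recurrent state equation together with a trained linear readout. This is only an illustrative sketch, not the paper's construction; the function name `reservoir_states`, the spectral-radius scaling of 0.9, and the toy fading-memory target `y_t = z_{t-2} z_{t-1}` are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions: a 100-unit reservoir driven by a scalar input signal.
n_res, n_in = 100, 1

# Fixed random reservoir matrix A, rescaled so its spectral radius is 0.9,
# a standard sufficient condition for the echo state (fading memory) property.
A = rng.standard_normal((n_res, n_res))
A *= 0.9 / max(abs(np.linalg.eigvals(A)))
C = rng.standard_normal((n_res, n_in))  # input mask

def reservoir_states(inputs):
    """Iterate the state equation x_t = tanh(A x_{t-1} + C z_t)."""
    x = np.zeros(n_res)
    states = []
    for z in inputs:
        x = np.tanh(A @ x + C @ np.atleast_1d(z))
        states.append(x.copy())
    return np.array(states)

# Toy fading-memory filter to learn: y_t = z_{t-2} * z_{t-1}.
T = 2000
z = rng.uniform(-1.0, 1.0, T)
y = np.concatenate([[0.0, 0.0], z[:-2] * z[1:-1]])

X = reservoir_states(z)

# Only the linear readout W is trained (ridge regression); A and C stay fixed.
reg = 1e-6
W = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)

mse = np.mean((y - X @ W) ** 2)
```

The design point this sketch makes is the one that drives reservoir computing: the recurrent "reservoir" is generated once at random and never trained, so learning reduces to a linear least-squares problem for the readout.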
