Asymptotic evaluation of the information processing capacity in reservoir computing

Saito, Yohei

arXiv.org Artificial Intelligence 

Recurrent neural networks (RNNs) can store past input by recursively connecting hidden nodes [1] and can approximate the relationship between input and output time series with arbitrary accuracy [2]. Backpropagation through time (BPTT) is the main method used to train RNNs, but it is difficult to optimize network parameters due to vanishing or exploding gradients [3]. Many variants of RNNs, such as LSTM [4] and GRU [5], have been proposed to alleviate these training difficulties and have been very successful. However, BPTT becomes slower as the training data grow longer. An echo state network (ESN) [6] is a kind of RNN that can be trained quickly by fixing the recurrent connections at their initial values and optimizing only the linear transformation of the readout layer. Not limited to neural networks, a linear combination of the states of a nonlinear dynamical system can be used to approximate the relationship between input and output time series; such a system is called a reservoir computing (RC) system [7].
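The ESN training scheme described above can be sketched in a few lines of NumPy: the input and recurrent weights are drawn once at random and never updated, and only the linear readout is fitted, here by ridge regression. The reservoir size, spectral radius, regularization strength, and the toy delay-one task are all illustrative assumptions, not choices taken from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Fixed random input and recurrent weights (never trained).
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
# Rescale so the spectral radius is below 1 (a common sufficient
# condition used in practice for the echo state property).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with inputs u of shape (T, n_in); return states (T, n_res)."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W_in @ u_t + W @ x)
        states[t] = x
    return states

# Toy task (an assumption for illustration): reproduce the input
# delayed by one step, a simple short-term-memory task.
T, washout = 2000, 100
u = rng.uniform(-1, 1, size=(T, n_in))
y_target = np.roll(u[:, 0], 1)

# Discard an initial washout so states no longer depend on x(0).
X = run_reservoir(u)[washout:]
y = y_target[washout:]

# Train only the readout, via ridge regression:
# W_out = (X^T X + beta I)^{-1} X^T y
beta = 1e-6
W_out = np.linalg.solve(X.T @ X + beta * np.eye(n_res), X.T @ y)
mse = np.mean((y - X @ W_out) ** 2)
```

Because only `W_out` is fitted, training reduces to a single linear solve, which is why ESNs avoid the slow, gradient-based BPTT optimization discussed above.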