Long-term prediction of chaotic systems with recurrent neural networks

Fan, Huawei, Jiang, Junjie, Zhang, Chun, Wang, Xingang, Lai, Ying-Cheng

arXiv.org Machine Learning 

Starting from the same initial condition as the target chaotic system, a well-trained reservoir computing system, a class of recurrent neural networks, can generate a trajectory that stays close to that of the target system for a finite amount of time, realizing short-term prediction. The prediction horizon demonstrated so far has been about half a dozen Lyapunov times. Is it possible to extend the prediction time significantly beyond what has been achieved? We articulate a scheme that incorporates time-dependent but sparse data inputs into reservoir computing and demonstrate that such rare "updates" of the actual state enable a practically arbitrarily long prediction horizon for a variety of chaotic systems. A physical understanding based on the theory of temporal synchronization is developed.
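The scheme described in the abstract can be illustrated with a minimal sketch: an echo state network (a standard reservoir computer) is trained on a chaotic Lorenz-63 trajectory and then run in closed loop, except that every K steps its input is reset to the true system state. All parameter values here (reservoir size, spectral radius, update interval K, regularization) are illustrative assumptions, not the authors' actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def lorenz_series(n, dt=0.01):
    # Lorenz-63 trajectory via simple Euler integration (illustrative only).
    x = np.array([1.0, 1.0, 1.0])
    out = np.empty((n, 3))
    for i in range(n):
        dx = np.array([10.0 * (x[1] - x[0]),
                       x[0] * (28.0 - x[2]) - x[1],
                       x[0] * x[1] - 8.0 / 3.0 * x[2]])
        x = x + dt * dx
        out[i] = x
    return out

# Normalize the data so the tanh reservoir is not saturated.
data = lorenz_series(6000)
data = (data - data.mean(axis=0)) / data.std(axis=0)
train, test = data[:4000], data[4000:]

N = 300                                      # reservoir size (assumed)
W_in = rng.uniform(-0.1, 0.1, (N, 3))        # input weights
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9

def step(r, u):
    # One reservoir update driven by input u.
    return np.tanh(W @ r + W_in @ u)

# Drive the reservoir with training data and collect its states.
r = np.zeros(N)
states = np.empty((len(train) - 1, N))
for i in range(len(train) - 1):
    r = step(r, train[i])
    states[i] = r
targets = train[1:]

# Linear readout trained by ridge regression.
beta = 1e-6
W_out = np.linalg.solve(states.T @ states + beta * np.eye(N),
                        states.T @ targets).T

# Closed-loop prediction with sparse "updates": every K steps the
# predicted input is replaced by the actual state of the target system.
K = 50                                       # update interval (assumed)
u = train[-1]
preds = np.empty_like(test)
for i in range(len(test)):
    if i % K == 0 and i > 0:
        u = test[i - 1]                      # rare update with the true state
    r = step(r, u)
    u = W_out @ r                            # feed prediction back as input
    preds[i] = u
```

The sparse updates act like a weak coupling between the reservoir and the target system; by the synchronization argument in the abstract, they repeatedly pull the predicted trajectory back toward the true one before the exponential divergence of nearby chaotic trajectories accumulates.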
