Memory Capacity of Nonlinear Recurrent Networks: Is it Informative?

Ballarin, Giovanni, Grigoryeva, Lyudmila, Ortega, Juan-Pablo

arXiv.org Machine Learning 

Abstract: The total memory capacity (MC) of linear recurrent neural networks (RNNs) has been proven to be equal to the rank of the corresponding Kalman controllability matrix, and it is almost surely maximal for connectivity and input weight matrices drawn from regular distributions. This fact calls into question the usefulness of this metric for distinguishing the performance of linear RNNs in the processing of stochastic signals. This note shows that the MC of random nonlinear RNNs attains arbitrary values within established upper and lower bounds, depending only on the scale of the input process. This confirms that the existing definition of MC, in both the linear and nonlinear cases, has no practical value.
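The rank characterization mentioned in the abstract can be illustrated numerically. The sketch below (an illustration, not the authors' code) considers a linear RNN with state update x_t = A x_{t-1} + C z_t, builds the Kalman controllability matrix [C, AC, ..., A^{N-1}C] for Gaussian weights, and checks its rank; for generic A and C the rank, and hence the total MC, is almost surely maximal.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10  # state (reservoir) dimension

# Connectivity matrix A and input weight vector C drawn from a
# regular (here: Gaussian) distribution; A is scaled by 1/sqrt(N)
# to keep its spectral radius of order one.
A = rng.standard_normal((N, N)) / np.sqrt(N)
C = rng.standard_normal((N, 1))

# Kalman controllability matrix [C, AC, A^2 C, ..., A^{N-1} C]
K = np.hstack([np.linalg.matrix_power(A, k) @ C for k in range(N)])

rank = np.linalg.matrix_rank(K)
print(rank)  # almost surely equals N for generic A, C
```

Because maximal rank holds almost surely for any such random draw, the rank (and thus the linear MC) carries essentially no information about the particular network, which is the point the abstract makes.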
