Recurrent Neural Networks Can Learn to Implement Symbol-Sensitive Counting

Paul Rodriguez, Janet Wiles

Neural Information Processing Systems 

Recently, researchers have derived formal complexity analyses of analog computation in the setting of discrete-time dynamical systems. As an empirical contrast, training recurrent neural networks (RNNs) produces self-organized systems that are realizations of analog mechanisms. Previous work showed that an RNN can learn to process a simple context-free language (CFL) by counting. Herein, we extend that work to show that an RNN can learn a harder CFL, a simple palindrome, by organizing its resources into a symbol-sensitive counting solution, and we provide a dynamical systems analysis which demonstrates how the network can not only count, but also copy and store counting information.

1 INTRODUCTION

Several researchers have recently derived results in analog computation theory in the setting of discrete-time dynamical systems (Siegelmann, 1994; Maass & Orponen, 1997; Moore, 1996; Casey, 1996). For example, a dynamical recognizer (DR) is a discrete-time continuous dynamical system with a given initial starting point and a finite set of Boolean output decision functions (Pollack, 1991).
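To make the DR definition and the counting mechanism concrete, here is a minimal hand-built sketch (not the trained network analyzed in the paper) of a one-dimensional dynamical recognizer for the counting language a^n b^n, the CFL from the prior work the abstract cites. The function name recognize_anbn, the initial state x0, and the tolerance eps are illustrative choices, not from the source: each 'a' contracts the state by 1/2, each 'b' expands it by 2, and the Boolean output decision checks whether the trajectory returns to its starting point.

```python
def recognize_anbn(s, x0=1.0, eps=1e-9):
    """Toy dynamical recognizer for {a^n b^n : n >= 1}.

    The analog state x contracts by 1/2 on each 'a' and expands
    by 2 on each 'b'; the string is accepted iff the state returns
    to x0 and the symbols arrive in a-block-then-b-block order.
    """
    x = x0
    seen_b = False
    for c in s:
        if c == 'a':
            if seen_b:          # an 'a' after any 'b' violates a^n b^n
                return False
            x *= 0.5            # count up: contract the state
        elif c == 'b':
            seen_b = True
            x *= 2.0            # count down: expand the state
            if x > x0 + eps:    # more b's than a's seen so far
                return False
        else:
            return False        # symbol outside the alphabet
    # Boolean output decision: did the trajectory return to x0?
    return seen_b and abs(x - x0) < eps
```

For example, recognize_anbn("aaabbb") returns True, while recognize_anbn("aabbb") and recognize_anbn("abab") return False. The single state variable plays the role of the analog counter that, in the paper's setting, the trained network discovers through self-organization rather than by hand design.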
