Recurrent Neural Networks Can Learn to Implement Symbol-Sensitive Counting
Neural Information Processing Systems
Recently researchers have derived formal complexity analyses of analog computation in the setting of discrete-time dynamical systems. As an empirical contrast, training recurrent neural networks (RNNs) produces self-organized systems that are realizations of analog mechanisms. Previous work showed that an RNN can learn to process a simple context-free language (CFL) by counting. Herein, we extend that work to show that an RNN can learn a harder CFL, a simple palindrome, by organizing its resources into a symbol-sensitive counting solution, and we provide a dynamical systems analysis which demonstrates how the network can not only count, but also copy and store counting information.

1 INTRODUCTION

Several researchers have recently derived results in analog computation theory in the setting of discrete-time dynamical systems (Siegelmann, 1994; Maass & Orponen, 1997; Moore, 1996; Casey, 1996). For example, a dynamical recognizer (DR) is a discrete-time continuous dynamical system with a given initial starting point and a finite set of Boolean output decision functions (Pollack.
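To make the counting mechanisms concrete, here is a minimal hand-coded sketch of two dynamical recognizers in Python: one that recognizes a^n b^n with a single analog counter (the kind of counting solution the prior work found), and one that does symbol-sensitive counting for a simple palindrome, taken here to be a^n b^m B^m A^n. Both the choice of language and the update rules (halving the state to count a symbol up, doubling to count it back down) are illustrative assumptions, not the trained network's learned dynamics.

```python
def accepts_anbn(s, eps=1e-9):
    """Recognize a^n b^n with one analog counter z in (0, 1]."""
    z = 1.0
    seen_b = False
    for ch in s:
        if ch == 'a':
            if seen_b:          # an 'a' after a 'b' breaks the pattern
                return False
            z *= 0.5            # count an 'a' down
        elif ch == 'b':
            seen_b = True
            z *= 2.0            # count a 'b' back up
            if z > 1.0 + eps:   # more b's than a's so far
                return False
        else:
            return False
    return abs(z - 1.0) < eps   # balanced iff z returns to the start


def accepts_palindrome(s, eps=1e-9):
    """Recognize a^n b^m B^m A^n with one analog counter per symbol pair.

    Symbol-sensitive counting: which counter is updated depends on the
    input symbol, and the inner b/B count must be emptied before the
    outer a/A count -- a hand-built analogue of the copy-and-store
    behavior described in the abstract.
    """
    za, zb = 1.0, 1.0
    phase, order = 0, {'a': 0, 'b': 1, 'B': 2, 'A': 3}
    for ch in s:
        p = order.get(ch)
        if p is None or p < phase:  # symbols must appear in a..b..B..A order
            return False
        phase = p
        if ch == 'a':
            za *= 0.5
        elif ch == 'b':
            zb *= 0.5
        elif ch == 'B':
            zb *= 2.0
        else:  # 'A'
            za *= 2.0
        if za > 1.0 + eps or zb > 1.0 + eps:
            return False            # a closing symbol outnumbered its opener
    return abs(za - 1.0) < eps and abs(zb - 1.0) < eps
```

Each counter lives on a fractal set of values 2^{-k}, so a bounded analog state can hold an unbounded count; the palindrome recognizer simply needs two such counters and symbol-dependent routing between them.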
Dec-31-1998