Learning Sequential Structure in Simple Recurrent Networks

Servan-Schreiber, David; Cleeremans, Axel; McClelland, James L.

Neural Information Processing Systems 

This tendency to preserve information about the path is not a characteristic of traditional finite-state automata.

ENCODING PATH INFORMATION

In a different set of experiments, we asked whether the SRN could learn to use the information about the path that is encoded in the hidden units' patterns of activation. In one of these experiments, we tested whether the network could master length constraints. When strings generated from the small finite-state grammar may be at most 8 letters long, the prediction following the presentation of the same letter differs depending on whether it occurs in position six or seven. For example, following the sequence 'TSSSXXV', 'V' is the seventh letter and only another 'V' would be a legal successor.
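To make the data flow concrete, the following is a minimal sketch of an Elman-style simple recurrent network forward pass in Python. The letter set, one-hot coding, layer sizes, and example string are assumptions for illustration and are not taken from the excerpt; the paper trains such a network with backpropagation on a next-letter prediction task, with the context units carrying a compressed record of the path seen so far.

```python
# Minimal sketch of an SRN (Elman network) forward pass, not the paper's code.
# Assumed details: one-hot letter coding, 15 hidden units, untrained weights.
import numpy as np

LETTERS = ['B', 'T', 'S', 'X', 'V', 'P', 'E']   # begin/end markers plus grammar letters
IDX = {c: i for i, c in enumerate(LETTERS)}

def one_hot(letter):
    v = np.zeros(len(LETTERS))
    v[IDX[letter]] = 1.0
    return v

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = len(LETTERS), 15, len(LETTERS)
W_ih = rng.normal(scale=0.5, size=(n_hidden, n_in))       # input  -> hidden
W_ch = rng.normal(scale=0.5, size=(n_hidden, n_hidden))   # context -> hidden
W_ho = rng.normal(scale=0.5, size=(n_out, n_hidden))      # hidden -> output

def predict_sequence(string):
    """Feed a string one letter at a time. The context units (a copy of the
    previous hidden state) encode the path so far, which is what allows the
    prediction after the same letter to differ at position six vs. seven."""
    context = np.zeros(n_hidden)                 # context units start at zero
    outputs = []
    for letter in string:
        hidden = sigmoid(W_ih @ one_hot(letter) + W_ch @ context)
        outputs.append(sigmoid(W_ho @ hidden))   # activations over possible next letters
        context = hidden.copy()                  # context <- copy of hidden state
    return outputs

# Weights are untrained here, so the numbers are arbitrary; the point is the
# flow of path information through the context units.
preds = predict_sequence('BTSSSXXV')
print(np.round(preds[-1], 2))   # predicted successors of the final 'V'
```

After training on strings from the grammar, the output after the final 'V' in a length-limited string would concentrate on the letters that remain legal given the length constraint, which is the behavior probed in this experiment.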
