Using Simple Recurrent Networks to Learn Fixed-Length Representations of Variable-Length Strings


Linguistic structures vary in length. Four connectionist models are reported that learn static, fixed-length representations of variable-length strings using a novel autosequencer architecture. These representations were learned as plans from which a simple recurrent network regenerates a given input sequence. Results showed that the autosequencer addresses the dispersion problem: over the course of learning, the positions and identities of the letters in a string become integrated into its plan representation. Results also revealed a moderate degree of componentiality in the plan representations.
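
As a rough illustration of the architecture described above, the sketch below is not the authors' code: the module names, dimensions, optimizer, and training loop are all assumptions. It captures the core idea, though: each string is assigned a trainable fixed-length plan vector, and a simple recurrent network, conditioned on that plan at every step, is trained jointly with the plans to regenerate the string letter by letter.

    # Minimal sketch of an autosequencer-style setup (illustrative, not the
    # paper's implementation). Each string gets a learnable fixed-length plan;
    # a simple recurrent network must regenerate the string from that plan.
    import torch
    import torch.nn as nn

    strings = ["cat", "cart", "track"]            # toy variable-length strings
    alphabet = sorted(set("".join(strings)))
    char_to_ix = {c: i for i, c in enumerate(alphabet)}

    PLAN_DIM, HIDDEN_DIM = 16, 32                 # assumed sizes

    class Autosequencer(nn.Module):
        def __init__(self, n_strings, n_chars):
            super().__init__()
            # One learnable fixed-length plan vector per training string.
            self.plans = nn.Embedding(n_strings, PLAN_DIM)
            # Elman-style simple recurrent network, fed the plan each step.
            self.rnn = nn.RNN(PLAN_DIM, HIDDEN_DIM, batch_first=True)
            self.out = nn.Linear(HIDDEN_DIM, n_chars)

        def forward(self, string_ix, length):
            plan = self.plans(string_ix)                      # (1, PLAN_DIM)
            steps = plan.unsqueeze(1).expand(1, length, -1)   # plan repeated per step
            h, _ = self.rnn(steps)                            # (1, length, HIDDEN_DIM)
            return self.out(h)                                # letter logits per step

    model = Autosequencer(len(strings), len(alphabet))
    opt = torch.optim.Adam(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # Train plans and decoder jointly to reproduce each string.
    for epoch in range(500):
        for ix, s in enumerate(strings):
            target = torch.tensor([char_to_ix[c] for c in s])
            logits = model(torch.tensor([ix]), len(s)).squeeze(0)
            loss = loss_fn(logits, target)
            opt.zero_grad()
            loss.backward()
            opt.step()

    # Check that each plan regenerates its string.
    with torch.no_grad():
        for ix, s in enumerate(strings):
            logits = model(torch.tensor([ix]), len(s)).squeeze(0)
            decoded = "".join(alphabet[i] for i in logits.argmax(dim=1).tolist())
            print(s, "->", decoded)

After training, the embedding row for each string is its fixed-length representation; it is these learned plan vectors that one would then analyze, as the abstract describes, for integration of letter position and identity and for componential structure.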