Using Prior Knowledge in a NNPDA to Learn Context-Free Languages
Das, Sreerupa; Giles, C. Lee; Sun, Guo-Zheng
Neural Information Processing Systems, Dec-31-1993
Language inference and automata induction using recurrent neural networks have gained considerable interest in recent years. Nevertheless, the success of these models has been mostly limited to regular languages. Additional information in the form of a priori knowledge has proved important, and at times necessary, for learning complex languages (Abu-Mostafa, 1990; Al-Mashouq and Reed, 1991; Omlin and Giles, 1992; Towell, 1990). These studies have demonstrated that partial information incorporated in a connectionist model guides the learning process through constraints, yielding more efficient learning and better generalization. We have previously shown that the NNPDA model can learn Deterministic Context-Free Languages.
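The NNPDA referred to above couples a recurrent state controller with an external continuous stack whose push and pop amounts are produced by the network itself. The sketch below illustrates that coupling only in broad strokes: the scalar stack symbols, the single action output, the sigmoid/tanh update rules, and all names (ContinuousStack, NNPDASketch, W_state, W_action) are illustrative assumptions, not the formulation used in the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ContinuousStack:
    """Analog stack: a list of [symbol, thickness] entries, top at the end."""
    def __init__(self):
        self.items = []

    def push(self, symbol, amount):
        if amount > 0.0:
            self.items.append([symbol, amount])

    def pop(self, amount):
        # Remove 'amount' of total thickness from the top of the stack.
        while amount > 0.0 and self.items:
            sym, t = self.items[-1]
            if t <= amount:
                amount -= t
                self.items.pop()
            else:
                self.items[-1][1] = t - amount
                amount = 0.0

    def read(self, depth=1.0):
        # Thickness-weighted average of symbols within 'depth' of the top.
        total, remaining = 0.0, depth
        for sym, t in reversed(self.items):
            w = min(t, remaining)
            total += w * sym
            remaining -= w
            if remaining <= 0.0:
                break
        return total

class NNPDASketch:
    """Recurrent controller whose output drives the continuous stack."""
    def __init__(self, n_inputs, n_state, seed=0):
        rng = np.random.default_rng(seed)
        n_in = n_inputs + n_state + 1                    # input + state + stack reading
        self.W_state = rng.normal(0.0, 0.1, (n_state, n_in))
        self.W_action = rng.normal(0.0, 0.1, (1, n_in))  # >0 means push, <0 means pop

    def step(self, x, state, stack):
        z = np.concatenate([x, state, [stack.read()]])
        new_state = sigmoid(self.W_state @ z)            # next internal state
        action = np.tanh(self.W_action @ z)[0]           # continuous action in (-1, 1)
        if action >= 0.0:
            stack.push(symbol=1.0, amount=action)        # push a unit marker
        else:
            stack.pop(amount=-action)
        return new_state

# Usage: run an (untrained) controller over a string from {a, b}.
model = NNPDASketch(n_inputs=2, n_state=4)
state, stack = np.zeros(4), ContinuousStack()
for ch in "aabb":
    x = np.array([1.0, 0.0]) if ch == "a" else np.array([0.0, 1.0])
    state = model.step(x, state, stack)
```

In the paper, weights such as W_state and W_action would be trained, with some of them preset from a priori knowledge; here they are random, so the example only demonstrates the data flow between the controller and the stack.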