Using Prior Knowledge in a NNPDA to Learn Context-Free Languages

Das, Sreerupa, Giles, C. Lee, Sun, Guo-Zheng

Neural Information Processing Systems 

Language inference and automata induction using recurrent neural networks has gained considerable interest in recent years. Nevertheless, success of these models has been mostly limited to regular languages. Additional information in the form of a priori knowledge has proved important and at times necessary for learning complex languages (Abu-Mostafa 1990; Al-Mashouq and Reed, 1991; Omlin and Giles, 1992; Towell, 1990). These studies have demonstrated that partial information incorporated in a connectionist model guides the learning process through constraints for efficient learning and better generalization. We have previously shown that the NNPDA model can learn Deterministic Context-Free Languages.
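The NNPDA couples a recurrent network controller to an external stack with continuous-valued push and pop actions, so that stack operations remain differentiable during training. The sketch below is a minimal, illustrative implementation of such a continuous stack only (not the authors' full model); the class and method names are assumptions introduced here for illustration.

```python
import numpy as np

class ContinuousStack:
    """Illustrative continuous stack, in the spirit of the NNPDA's
    external stack: entries are (vector, strength) pairs, and push/pop
    act with real-valued strengths rather than discrete moves.
    Names and interface are hypothetical, not from the paper."""

    def __init__(self, dim):
        self.dim = dim
        self.items = []  # list of [symbol_vector, strength]

    def push(self, vec, strength):
        # Push a symbol vector with a given (possibly fractional) strength.
        if strength > 0:
            self.items.append([np.asarray(vec, dtype=float), float(strength)])

    def pop(self, strength):
        # Remove `strength` total mass from the top of the stack,
        # consuming whole entries and splitting the last one if needed.
        remaining = float(strength)
        while remaining > 0 and self.items:
            vec, s = self.items[-1]
            if s <= remaining:
                remaining -= s
                self.items.pop()
            else:
                self.items[-1][1] = s - remaining
                remaining = 0.0

    def read(self, depth=1.0):
        # Return the strength-weighted average of the top `depth` mass,
        # i.e. a soft view of the stack top fed back to the controller.
        acc = np.zeros(self.dim)
        total, remaining = 0.0, float(depth)
        for vec, s in reversed(self.items):
            take = min(s, remaining)
            acc += vec * take
            total += take
            remaining -= take
            if remaining <= 0:
                break
        return acc / total if total > 0 else acc
```

For example, pushing two unit symbols, popping half the top one, and reading one unit of depth blends the remaining half of the top symbol with half of the one beneath it, which is exactly the soft stack-top signal a recurrent controller would condition on.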
