Natural Language



Can logic programming execute as fast as imperative programming?

Classics

The output is assembly code for the Berkeley Abstract Machine (BAM). Directives take effect starting from the next predicate in the input. Clauses do not have to be contiguous in the input stream; however, the whole stream is read before compilation starts. This manual is organized into ten sections.
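
The compilation model sketched above (the whole stream is read first, clauses of a predicate need not be contiguous, and a directive applies from the next predicate onward) can be illustrated with a small toy. The Python below is a hypothetical sketch with naive clause parsing, not the actual compiler front end; the function name and the example clauses are invented for the illustration.

```python
from collections import OrderedDict

def group_clauses(stream_lines):
    """Collect directives and clauses per predicate from a whole clause stream."""
    pending_directives = []      # directives seen so far; they hold from the next predicate on
    predicates = OrderedDict()   # (name, arity) -> {"directives": [...], "clauses": [...]}
    for line in stream_lines:
        line = line.strip()
        if not line:
            continue
        if line.startswith(":-"):               # a compiler directive
            pending_directives.append(line)
            continue
        head = line.split(":-")[0].strip()      # naive split: clause head before the neck
        name = head.split("(")[0]
        arity = head.count(",") + 1 if "(" in head else 0   # toy arity count (no nested terms)
        entry = predicates.setdefault((name, arity), {"directives": [], "clauses": []})
        if not entry["clauses"]:
            entry["directives"] = list(pending_directives)
        entry["clauses"].append(line)
    return predicates

# append/3 and rev/2 clauses are interleaved; grouping still succeeds because
# the whole stream is read before anything is compiled.
stream = [
    "append([], L, L).",
    "rev([], []).",
    "append([H|T], L, [H|R]) :- append(T, L, R).",
    "rev([H|T], R) :- rev(T, RT), append(RT, [H], R).",
]
print(list(group_clauses(stream).keys()))   # [('append', 3), ('rev', 2)]
```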


A Massively Parallel Self-Tuning Context-Free Parser

Neural Information Processing Systems

ABSTRACT The Parsing and Learning System (PALS) is a massively parallel self-tuning context-free parser. It is capable of parsing sentences of unbounded length mainly due to its parse-tree representation scheme. The system is capable of improving its parsing performance through the presentation of training examples. INTRODUCTION Recent PDP research [Rumelhart et al., 1986; Feldman and Ballard, 1982; Lippmann, 1987] involving natural language processing [Fanty, 1988; Selman, 1985; Waltz and Pollack, 1985] has unrealistically restricted sentences to a fixed length. A solution to this problem was presented in the system CONPARSE [Charniak and Santos].
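
For readers unfamiliar with context-free parsing, the sketch below is a conventional serial CKY recognizer over a tiny invented grammar and lexicon; it shows what it means to accept sentences of a context-free language, but it is not PALS's massively parallel, self-tuning architecture or its parse-tree representation scheme.

```python
from itertools import product

GRAMMAR = {                      # A -> B C  (binary rules, Chomsky normal form)
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
LEXICON = {                      # A -> terminal
    "the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "saw": {"V"},
}

def cky_recognize(words, start="S"):
    n = len(words)
    # chart[i][j] holds the nonterminals spanning words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEXICON.get(w, set()))
    for span in range(2, n + 1):
        for i in range(0, n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for b, c in product(chart[i][k], chart[k][j]):
                    chart[i][j] |= GRAMMAR.get((b, c), set())
    return start in chart[0][n]

print(cky_recognize("the dog saw the cat".split()))   # True
```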


Dynamic, Non-Local Role Bindings and Inferencing in a Localist Network for Natural Language Understanding

Neural Information Processing Systems

This paper introduces a means to handle the critical problem of non-local role-bindings in localist spreading-activation networks. Every conceptual node in the network broadcasts a stable, uniquely-identifying activation pattern, called its signature. A dynamic role-binding is created when a role's binding node has an activation that matches the bound concept's signature. Most importantly, signatures are propagated across long paths of nodes to handle the non-local role-bindings necessary for inferencing. Our localist network model, ROBIN (ROle Binding and Inferencing Network), uses signature activations to robustly represent schemata role-bindings and thus perform the inferencing, plan/goal analysis, schema instantiation, word-sense disambiguation, and dynamic reinterpretation portions of the natural language understanding process.
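
As a rough illustration of the signature idea (a toy sketch, not the published ROBIN model): each concept carries a unique signature value, binding a role means placing that value on the role's binding node, and the value can be relayed across intermediate nodes so a distant node can still identify which concept is bound. All node names and numbers below are invented for the example.

```python
SIGNATURES = {"John": 3.1, "ball": 6.8, "Mary": 9.2}   # unique activation per concept

def bind(role_bindings, role, concept):
    """Bind a role by placing the concept's signature on its binding node."""
    role_bindings[role] = SIGNATURES[concept]

def propagate(signature, path_length):
    """Relay a signature unchanged across a path of intermediate nodes."""
    value = signature
    for _ in range(path_length):
        value = value * 1.0          # each hop just passes the activation along
    return value

def lookup(activation, tolerance=1e-6):
    """Recover the bound concept by matching the activation against the signatures."""
    for concept, sig in SIGNATURES.items():
        if abs(activation - sig) < tolerance:
            return concept
    return None

bindings = {}
bind(bindings, "agent-of-throw", "John")           # local binding
remote = propagate(bindings["agent-of-throw"], 5)  # non-local: five hops away
print(lookup(remote))                              # -> John
```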



Implications of Recursive Distributed Representations

Neural Information Processing Systems

I will describe my recent results on the automatic development of fixed-width recursive distributed representations of variable-sized hierarchical data structures. One implication of this work is that certain types of AI-style data structures can now be represented in fixed-width analog vectors. Simple inferences can be performed using the type of pattern associations that neural networks excel at. Another implication arises from noting that these representations become self-similar in the limit. Once this door to chaos is opened.
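
A minimal sketch of the fixed-width idea, assuming nothing beyond NumPy: variable-sized binary trees are compressed bottom-up into vectors of one fixed width. Here the compressor is an untrained random map, so it only illustrates the shape of the representation; the work described above develops the representations automatically, whereas this sketch skips any learning.

```python
import numpy as np

DIM = 8
rng = np.random.default_rng(0)
W = rng.standard_normal((DIM, 2 * DIM)) / np.sqrt(2 * DIM)   # fixed (untrained) compressor

def leaf(symbol):
    """Deterministic fixed-width vector for a terminal symbol."""
    seed = sum(ord(c) for c in symbol)
    return np.random.default_rng(seed).standard_normal(DIM)

def encode(tree):
    """Recursively compress (left, right) pairs back down to DIM dimensions."""
    if isinstance(tree, str):
        return leaf(tree)
    left, right = tree
    pair = np.concatenate([encode(left), encode(right)])   # 2*DIM inputs
    return np.tanh(W @ pair)                               # DIM outputs

# Trees of different sizes map to vectors of the same fixed width.
small = encode(("a", "b"))
large = encode((("a", "b"), ("c", ("d", "e"))))
print(small.shape, large.shape)    # (8,) (8,)
```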


Spreading Activation over Distributed Microfeatures

Neural Information Processing Systems

One attempt at explaining human inferencing is that of spreading activation, particularly in the structured connectionist paradigm. This has resulted in the building of systems with semantically nameable nodes which perform inferencing by examining the patterns of activation spread.
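
A minimal spreading-activation sketch over a small hand-built semantic network (hypothetical nodes and link weights, not the system described above): activation injected at a source node decays as it spreads to neighbours, and the most active nodes afterwards suggest the inferred concepts.

```python
GRAPH = {                           # node -> [(neighbour, link weight)]
    "doctor": [("hospital", 0.8), ("nurse", 0.7)],
    "hospital": [("patient", 0.6)],
    "nurse": [("patient", 0.5)],
    "patient": [],
}

def spread(source, steps=2, decay=0.5):
    """Spread decayed activation outward from the source for a few steps."""
    activation = {node: 0.0 for node in GRAPH}
    activation[source] = 1.0
    for _ in range(steps):
        incoming = {node: 0.0 for node in GRAPH}
        for node, act in activation.items():
            for neighbour, weight in GRAPH[node]:
                incoming[neighbour] += act * weight * decay
        for node in GRAPH:
            activation[node] += incoming[node]
    return activation

# Nodes ranked by final activation after spreading from "doctor".
print(sorted(spread("doctor").items(), key=lambda kv: -kv[1]))
```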


Cognitive Models of Speech Processing: Psycholinguistic and Computational Perspectives

AI Magazine

The 1988 Workshop on Cognitive Models of Speech Processing was held at Park Hotel Fiorelle, Sperlonga, Italy, on 16-20 May 1988. Twenty-five participants gathered in this small coastal village, where the Emperor Tiberius once kept a summer house, to discuss psycholinguistic and computational issues in speech and natural language processing.