Score: A Rule Engine for the Scone Knowledge Base System
Chen, Jeffrey, Fahlman, Scott E.
We present Score, a rule engine designed and implemented for the Scone knowledge base system. Scone is a knowledge base system designed for storing and manipulating rich representations of general knowledge in symbolic form. It represents knowledge in the form of nodes and links in a network structure, and it can perform basic inference about the relationships between different elements efficiently. On its own, Scone acts as a sort of "smart memory" that can interface with other software systems. One area of improvement for Scone is how useful it can be in supplying knowledge to an intelligent agent that can use the knowledge to perform actions and update the knowledge base with its observations. We augment the Scone system with a production rule engine that automatically performs simple inference based on existing and newly-added structures in Scone's knowledge base, potentially improving the capabilities of any planning systems built on top of Scone. Production rule systems consist of "if-then" production rules that try to match their predicates to existing knowledge and fire their actions when their predicates are satisfied. We propose two kinds of production rules, if-added and if-needed rules, that differ in how they are checked and fired to cover multiple use cases. We then implement methods to efficiently check and fire these rules in a large knowledge base. The new rule engine is not meant to be a complex stand-alone planner, so we discuss how it fits into the context of Scone and future work on planning systems.
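The if-added/if-needed distinction above can be illustrated with a minimal sketch. This is a toy production rule engine over a triple store; the class, rule forms, and example facts here are illustrative assumptions, not Scone's or Score's actual interface.

```python
# Toy sketch of a production rule engine with if-added and if-needed rules.
# All names and the triple-store representation are illustrative assumptions.

class RuleEngine:
    def __init__(self):
        self.facts = set()
        self.if_added = []    # (predicate, action): fire when a fact arrives
        self.if_needed = []   # (goal_test, derive): fire when a query asks

    def add_fact(self, fact):
        if fact in self.facts:
            return
        self.facts.add(fact)
        # If-added rules are checked eagerly against each newly asserted fact.
        for predicate, action in self.if_added:
            if predicate(fact):
                action(fact, self)

    def holds(self, goal):
        if goal in self.facts:
            return True
        # If-needed rules run lazily, only when a query requires the goal.
        for goal_test, derive in self.if_needed:
            if goal_test(goal) and derive(goal, self):
                self.facts.add(goal)
                return True
        return False

engine = RuleEngine()
# If-added: asserting that X is an elephant also asserts that X is gray.
engine.if_added.append((
    lambda f: f[1:] == ("is-a", "elephant"),
    lambda f, e: e.add_fact((f[0], "color", "gray")),
))
# If-needed: "X can fly" is derived on demand from "X is a bird".
engine.if_needed.append((
    lambda g: g[1:] == ("can-fly", "yes"),
    lambda g, e: (g[0], "is-a", "bird") in e.facts,
))
engine.add_fact(("Clyde", "is-a", "elephant"))
engine.add_fact(("Tweety", "is-a", "bird"))
assert ("Clyde", "color", "gray") in engine.facts   # fired eagerly
assert engine.holds(("Tweety", "can-fly", "yes"))   # derived on query
```

The same split drives the efficiency concern the abstract mentions: if-added rules pay their cost once at assertion time, while if-needed rules defer all work until a query actually needs the conclusion.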
Position Paper: Knowledge-Based Mechanisms for Deception
Fahlman, Scott E. (Carnegie Mellon University)
In an earlier paper, I described in some detail how a system based on symbolic knowledge representation and reasoning could model and reason about an act of deception encountered in a children's story. This short position paper extends that earlier work, adding new analysis and discussion about the nature of deception, the desirability of building deceptive AI systems, and the computational mechanisms necessary for deceiving others and for recognizing their attempts to deceive us.
Using Scone's Multiple-Context Mechanism to Emulate Human-Like Reasoning
Fahlman, Scott E. (Carnegie Mellon University)
Scone is a knowledge-base system developed specifically to support human-like common-sense reasoning and the understanding of human language. One of the unusual features of Scone is its multiple-context system. Each context represents a distinct world-model, but a context can inherit most of the knowledge of another context, explicitly representing just the differences. We explore how this multiple-context mechanism can be used to emulate some aspects of human mental behavior that are difficult or impossible to emulate in other representational formalisms. These include reasoning about hypothetical or counter-factual situations; understanding how the world model changes over time due to specific actions or spontaneous changes; and reasoning about the knowledge and beliefs of other agents, and how their mental state may affect the actions of those agents.
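The inherit-plus-differences idea can be sketched in a few lines. This is a toy model in which each context records only its local assertions and cancellations against a parent world-model; the names and fact representation are illustrative assumptions, not Scone's implementation.

```python
# Toy sketch of multiple contexts, each storing only its differences
# from a parent world-model. Names are illustrative, not Scone's.

class Context:
    def __init__(self, parent=None):
        self.parent = parent
        self.added = set()      # facts asserted locally in this context
        self.removed = set()    # inherited facts cancelled in this context

    def assert_fact(self, fact):
        self.added.add(fact)
        self.removed.discard(fact)

    def retract_fact(self, fact):
        self.added.discard(fact)
        self.removed.add(fact)

    def holds(self, fact):
        if fact in self.added:
            return True
        if fact in self.removed:
            return False
        # Everything not overridden locally is inherited from the parent.
        return self.parent is not None and self.parent.holds(fact)

real_world = Context()
real_world.assert_fact(("door", "state", "closed"))
# A hypothetical "after opening the door" context inherits everything
# from the real world, explicitly representing only the change.
after_opening = Context(parent=real_world)
after_opening.retract_fact(("door", "state", "closed"))
after_opening.assert_fact(("door", "state", "open"))
assert real_world.holds(("door", "state", "closed"))
assert after_opening.holds(("door", "state", "open"))
```

Chaining such contexts gives the behaviors the abstract lists: a timeline of states after successive actions, a counter-factual branch, or a nested context for another agent's beliefs.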
The Recurrent Cascade-Correlation Architecture
Fahlman, Scott E.
Recurrent Cascade-Correlation (RCC) is a recurrent version of the Cascade-Correlation learning architecture of Fahlman and Lebiere [Fahlman, 1990]. RCC can learn from examples to map a sequence of inputs into a desired sequence of outputs. New hidden units with recurrent connections are added to the network as needed during training. In effect, the network builds up a finite-state machine tailored specifically for the current problem. RCC retains the advantages of Cascade-Correlation: fast learning, good generalization, automatic construction of a near-minimal multi-layered network, and incremental training. Initially the network contains only inputs, output units, and the connections between them.
The Cascade-Correlation Learning Architecture
Fahlman, Scott E., Lebiere, Christian
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
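The structural core of the cascade idea can be sketched briefly: each new hidden unit receives connections from all inputs and from every previously installed hidden unit, and its input-side weights are frozen once installed. This sketch elides the actual candidate-training step (maximizing correlation with the residual output error), standing in random frozen weights for it; the class and names are illustrative assumptions.

```python
import math
import random

# Structural sketch of a cascade network with sigmoid hidden units.
# Candidate training is elided: random weights stand in for the
# correlation-maximizing step described in the abstract.

class CascadeNet:
    def __init__(self, n_inputs):
        self.n_inputs = n_inputs
        self.hidden = []  # one frozen input-side weight vector per unit

    def add_hidden_unit(self):
        # A new unit sees all inputs plus every earlier hidden unit,
        # so each addition deepens the network by one layer.
        fan_in = self.n_inputs + len(self.hidden)
        weights = [random.uniform(-1.0, 1.0) for _ in range(fan_in)]
        self.hidden.append(weights)  # frozen after installation

    def activations(self, x):
        acts = list(x)
        for w in self.hidden:
            s = sum(wi * ai for wi, ai in zip(w, acts))
            acts.append(1.0 / (1.0 + math.exp(-s)))  # sigmoid output
        return acts  # inputs followed by each hidden unit's output

net = CascadeNet(n_inputs=2)
net.add_hidden_unit()
net.add_hidden_unit()
# The second unit's fan-in includes the first unit's output: 2 + 1 weights.
assert len(net.hidden[1]) == 3
assert len(net.activations([0.5, -0.5])) == 4
```

Because earlier units' input weights never change, only the output-side weights and the current candidate need training at any time, which is one source of the fast learning the abstract claims.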