A Flexible, Parallel Generator of Natural Language

AI Magazine

My Ph.D. thesis (Ward 1992, 1991) addressed the task of generating natural language utterances. It was motivated by two difficulties in scaling up existing generators. Current generators only accept input that is relatively poor in information, such as feature structures or lists of propositions; they are unable to deal with input rich in information, as one might expect from, for example, an expert system with a complete model of its domain or a natural language understander with good inference ability. Current generators also have a very restricted knowledge of language -- indeed, they succeed largely because they have few syntactic or lexical options available (McDonald 1987) -- and they are unable to cope with more knowledge because they deal with interactions among the various possible choices only as special cases. To address these and other issues, I built a system called FIG (flexible incremental generator). FIG is based on a single associative network that encodes lexical knowledge, syntactic knowledge, and world knowledge. Computation is done by spreading activation across the network, supplemented with a small amount of symbolic processing. Thus, FIG is a spreading activation or structured connectionist system (Feldman et al. 1988).
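
The abstract names the core mechanism, spreading activation over a single associative network, without spelling it out. The sketch below is a minimal, hypothetical illustration of that style of computation; the example nodes, link weights, and emit-the-most-active-word readout are invented for illustration and are not FIG's actual network or algorithm.

# Minimal spreading-activation sketch (hypothetical; not FIG's network or weights).
# Nodes mix world knowledge ("concept:*"), lexical items ("word:*"), and a
# syntactic cue; activation flows along weighted links on every cycle.

links = {
    "concept:dog":       {"word:dog": 0.9, "word:the": 0.3},
    "concept:chase":     {"word:chased": 0.8, "syntax:past-tense": 0.5},
    "syntax:past-tense": {"word:chased": 0.6},
}

activation = {"concept:dog": 1.0, "concept:chase": 1.0}  # input: concepts to express
DECAY = 0.8

def spread(activation, steps=3):
    for _ in range(steps):
        new = {node: act * DECAY for node, act in activation.items()}
        for source, act in activation.items():
            for target, weight in links.get(source, {}).items():
                new[target] = new.get(target, 0.0) + act * weight
        activation = new
    return activation

final = spread(activation)
words = {n: a for n, a in final.items() if n.startswith("word:")}
print(max(words, key=words.get))  # the currently most activated word node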


Applied AI News

AI Magazine

General Electric's Research and Development Center (Schenectady, NY) has developed an expert system, called Engineous, which is being used to increase the speed of design of new jet engines. Elscint (Hackensack, NJ), a manufacturer of medical imaging systems, has begun offering its customers a service option based on expert systems; the MasterMind system delivers troubleshooting on laptop or desktop computers. Johnson Controls (Milwaukee, WI) has begun deployment of a knowledge-based engineering application to increase the productivity of the engineering design function.


The Sixth Annual Knowledge-Based Software Engineering Conference

AI Magazine

The Sixth Annual Knowledge-Based Software Engineering Conference (KBSE-91) was held at the Sheraton University Inn and Conference Center in Syracuse, New York, from Sunday afternoon, 22 September, through midday Wednesday, 25 September. The KBSE field is concerned with applying knowledge-based AI techniques to the problems of creating, understanding, and maintaining very large software systems.


On Seeing Robots

Classics

It is argued that situated agents should be designed using a unitary, on-line computational model. The Constraint Net model of Zhang and Mackworth satisfies that requirement. Two systems for situated perception built in our laboratory are described to illustrate the new approach: one for visual monitoring of a robot's arm, the other for real-time visual control of multiple robots competing and cooperating in a dynamic world. First proposal for robot soccer. Proc. VI-92, 1992; later published in Computer Vision: System, Theory, and Applications, pages 1-13, World Scientific Press, Singapore, 1993.
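
The abstract mentions the Constraint Net model only in passing. As a rough intuition, a constraint net describes an agent coupled to its environment as a network of transductions over signals, updated on-line. The toy discrete-time loop below, with an invented sensor, controller, and plant, only illustrates that dataflow style; it is not the Zhang-Mackworth formalism, which is defined over general traces and transductions.

# Toy on-line sense -> act -> update loop in the spirit of a constraint net
# (hypothetical example; functions and gains are invented for illustration).

def sensor(arm_position, target):
    return target - arm_position          # transduction: perceived position error

def controller(error, gain=0.5):
    return gain * error                   # transduction: commanded velocity

def plant(arm_position, velocity, dt=0.1):
    return arm_position + velocity * dt   # transduction: arm dynamics

arm, target = 0.0, 1.0
for step in range(50):
    err = sensor(arm, target)
    cmd = controller(err)
    arm = plant(arm, cmd)
print(round(arm, 3))                      # arm converges toward the target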




Stochastic Neurodynamics

Neural Information Processing Systems

The main point of this paper is that stochastic neural networks have a mathematical structure that corresponds quite closely with that of quantum field theory. Neural network Liouvillians and Lagrangians can be derived, just as spin Hamiltonians and Lagrangians can be in QFT. It remains to show the efficacy of such a description.
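
The abstract does not spell out what a neural-network Liouvillian is. The equations below sketch the standard construction (a master equation rewritten as operator evolution over a probability state vector); they are a generic illustration of the idea, not the specific operators derived in the paper.

% Sketch: a stochastic network of binary neurons s = (s_1, ..., s_N) with
% transition rates W(s'|s) obeys a master equation for the distribution P(s,t):
\frac{\partial P(\mathbf{s},t)}{\partial t}
  = \sum_{\mathbf{s}'} \Big[ W(\mathbf{s}\mid\mathbf{s}')\,P(\mathbf{s}',t)
                            - W(\mathbf{s}'\mid\mathbf{s})\,P(\mathbf{s},t) \Big].
% Collecting P(s,t) into a state vector |P(t)> = \sum_s P(s,t) |s>, the same
% dynamics becomes linear operator ("Liouvillian") evolution,
\frac{\partial}{\partial t}\,\lvert P(t)\rangle = -\hat{L}\,\lvert P(t)\rangle,
\qquad \lvert P(t)\rangle = e^{-\hat{L}t}\,\lvert P(0)\rangle ,
% which is formally analogous to imaginary-time evolution under a spin
% Hamiltonian; a path-integral representation of e^{-L t} then yields a
% corresponding Lagrangian.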


Cholinergic Modulation May Enhance Cortical Associative Memory Function

Neural Information Processing Systems

Combining neuropharmacological experiments with computational modeling, we have shown that cholinergic modulation may enhance associative memory function in piriform (olfactory) cortex. We have shown that the acetylcholine analogue carbachol selectively suppresses synaptic transmission between cells within piriform cortex, while leaving input connections unaffected. When tested in a computational model of piriform cortex, this selective suppression, applied during learning, enhances associative memory performance.
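
The computational claim, that suppressing intrinsic (recurrent) synaptic transmission during learning improves associative recall, can be illustrated with a toy autoassociative memory. The network, Hebbian rule, and single suppression parameter below are a minimal hypothetical sketch, not the authors' piriform cortex model.

import numpy as np

# Toy illustration: Hebbian learning on activity contaminated by recurrent
# (intrinsic) transmission builds cross-pattern interference; suppressing that
# transmission during learning leaves a cleaner autoassociative weight matrix.

rng = np.random.default_rng(0)
N, P = 200, 10
patterns = (rng.random((P, N)) < 0.2).astype(float)    # sparse binary patterns

def store(suppression):
    """Hebbian storage; 'suppression' scales intrinsic input during learning."""
    W = np.zeros((N, N))
    for p in patterns:
        recurrent = W @ p                       # activity spread by intrinsic fibers
        activity = p + (1.0 - suppression) * recurrent
        W += np.outer(activity, activity)       # Hebbian update on total activity
    np.fill_diagonal(W, 0.0)
    return W

def recall_accuracy(W):
    score = 0.0
    for p in patterns:
        k = int(p.sum())
        cue = p * (rng.random(N) < 0.7)         # degraded cue: ~30% of bits deleted
        drive = W @ cue
        out = np.zeros(N)
        out[np.argsort(drive)[-k:]] = 1.0       # k-winners-take-all readout
        score += np.mean(out == p)
    return score / P

for s in (0.0, 1.0):                            # no suppression vs. full suppression
    print(f"suppression={s}: recall={recall_accuracy(store(s)):.3f}")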


A Recurrent Neural Network Model of Velocity Storage in the Vestibulo-Ocular Reflex

Neural Information Processing Systems

A three-layered neural network model was used to explore the organization of the vestibulo-ocular reflex (VOR). The dynamic model was trained using recurrent back-propagation to produce compensatory, long duration eye muscle motoneuron outputs in response to short duration vestibular afferent head velocity inputs. The network learned to produce this response prolongation, known as velocity storage, by developing complex, lateral inhibitory interactions among the interneurons. These had the low baseline, long time constant, rectified and skewed responses that are characteristic of real VOR interneurons. The model suggests that all of these features are interrelated and result from lateral inhibition.
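
As a rough illustration of velocity storage (a brief input pulse producing a prolonged output), the sketch below hand-wires two interneurons with mutual lateral inhibition, which stretches the effective time constant of their push-pull difference. The architecture, weights, and time constants are invented for illustration and are not the trained network described in the paper.

import numpy as np

# Two leaky interneurons with mutual inhibition (strength w); their difference
# drives the motoneuron output. A 200 ms head-velocity pulse yields an output
# that decays far more slowly than the membrane time constant alone would allow.

dt, tau, w = 0.01, 0.1, 0.95          # step (s), membrane time constant (s), inhibition
T = int(5.0 / dt)                     # simulate 5 seconds
u1 = u2 = 0.0
trace = []

for t in range(T):
    inp = 1.0 if t * dt < 0.2 else 0.0           # brief vestibular input pulse
    du1 = (-u1 - w * u2 + inp) / tau
    du2 = (-u2 - w * u1) / tau                   # mutual (lateral) inhibition
    u1 += dt * du1
    u2 += dt * du2
    trace.append(u1 - u2)                        # push-pull motoneuron drive

trace = np.array(trace)
peak_idx = int(trace.argmax())
below = int(np.argmax(trace[peak_idx:] < 0.37 * trace[peak_idx]))
print(f"input lasted 0.20 s; output takes ~{(peak_idx + below) * dt:.2f} s "
      f"to decay to 1/e of its peak")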