Goto


A Bibliography on Hybrid Reasoning

AI Magazine

[Excerpt garbled in extraction; the recoverable fragments mention a model of computation based on a calculus, many-sorted interpolation theorems, an investigation into inference with restricted ..., a many-sorted resolution-based ..., Höhfeld and Smolka (1988), Expert Systems III (pages 184-194), and an overview of the HORNE logic programming system.]


The First International Workshop on Human and Machine Cognition, Pensacola, Florida. Topic: The Frame Problem

AI Magazine

In 1877 the Italian astronomer Giovanni Schiaparelli announced the existence of canali on Mars: a network of straight and curved lines running across the planet. Canali, meaning ... number of inferences about what has not changed as the result of performing some action A while allowing the small number of inferences about what has changed as a result of A. ... Program co-chairpersons are Dr. Robin Cohen of the University of Waterloo, Bob Kass of the EDS Center for Machine Intelligence, and Cecile Paris of the Information Sciences Institute.


Thoughts and Afterthoughts on the 1988 Workshop on Principles of Hybrid Reasoning

AI Magazine

The 1988 Workshop on Principles of Hybrid Reasoning, a one-day AAAI-sponsored workshop, was held in St. Paul, Minnesota on August 21, 1988, in conjunction with the National Conference on Artificial Intelligence. This article reports on the workshop and presents some of our afterthoughts based upon prolonged discussion of the issues that arose during the workshop.


Knowledge Discovery in Real Databases: A Report on the IJCAI-89 Workshop

AI Magazine

The growth in the amount of available data far outstrips the growth of corresponding knowledge. This creates both a need and an opportunity for extracting knowledge from databases. Many recent results have been reported on extracting different kinds of knowledge from databases, including diagnostic rules, drug side effects, classes of stars, rules for expert systems, and rules for semantic query optimization.


Task Communication Through Natural Language and Graphics

AI Magazine

With increases in the complexity of information that must be communicated either by or to computers comes a corresponding need to find ways to communicate that information simply and effectively. It makes little sense to force the burden of communication onto a single medium, restricted to spoken or written text, gestures, diagrams, or graphical animation alone, when in many situations information is communicated effectively only through combinations of media.


Using Local Models to Control Movement

Neural Information Processing Systems

This paper explores the use of a model neural network for motor learning. Steinbuch and Taylor presented neural network designs to do nearest neighbor lookup in the early 1960s. In this paper their nearest neighbor network is augmented with a local model network, which fits a local model to a set of nearest neighbors. The network design is equivalent to local regression. This network architecture can represent smooth nonlinear functions, yet has simple training rules with a single global optimum.
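As a rough illustration of the local-model idea, the sketch below fits an unweighted linear model to the k nearest stored samples of a query and evaluates it at the query point; this is generic local (nearest-neighbor) regression, to which the abstract says the network is equivalent, not the authors' exact memory-plus-local-model network, and all names and parameters here are illustrative.

    import numpy as np

    def local_linear_predict(X, Y, query, k=10):
        """Predict the output at `query` by fitting a linear model to the
        k nearest stored (X, Y) pairs -- plain local regression."""
        # Nearest-neighbor lookup (the role played by the lookup memory).
        dists = np.linalg.norm(X - query, axis=1)
        idx = np.argsort(dists)[:k]
        Xn, Yn = X[idx], Y[idx]

        # Fit a local affine model by least squares on the neighbors.
        A = np.hstack([Xn, np.ones((k, 1))])
        coef, *_ = np.linalg.lstsq(A, Yn, rcond=None)

        # Evaluate the fitted local model at the query point.
        return np.append(query, 1.0) @ coef

    # Example: recover a smooth nonlinear function from noisy samples.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 1))
    Y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(500)
    print(local_linear_predict(X, Y, np.array([1.0]), k=20))  # roughly sin(1.0)

Distance-weighting the neighbors or changing k shifts the bias-variance trade-off but leaves the overall structure unchanged.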


Adjoint Operator Algorithms for Faster Learning in Dynamical Neural Networks

Neural Information Processing Systems

A methodology for faster supervised learning in dynamical nonlinear neural networks is presented. It exploits the concept of adjoint operators to enable computation of changes in the network's response due to perturbations in all system parameters, using the solution of a single set of appropriately constructed linear equations. The lower bound on speedup per learning iteration over conventional methods for calculating the neuromorphic energy gradient is O(N²), where N is the number of neurons in the network.

1 INTRODUCTION

The biggest promise of artificial neural networks as computational tools lies in the hope that they will enable fast processing and synthesis of complex information patterns. In particular, considerable efforts have recently been devoted to the formulation of efficient methodologies for learning (e.g., Rumelhart et al., 1986; Pineda, 1988; Pearlmutter, 1989; Williams and Zipser, 1989; Barhen, Gulati and Zak, 1989). The development of learning algorithms is generally based upon the minimization of a neuromorphic energy function.
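As a sketch of the adjoint idea, assuming a simple settled network x = tanh(Wx + u) and a quadratic error at the fixed point, one linear solve for the adjoint variables yields the sensitivity of the error to every weight at once; this is the generic implicit-differentiation (recurrent-backpropagation-style) calculation, not the authors' specific formulation, and all names below are illustrative.

    import numpy as np

    def fixed_point(W, u, iters=200):
        # Relax the network x = tanh(W x + u) to its steady state.
        x = np.zeros(len(u))
        for _ in range(iters):
            x = np.tanh(W @ x + u)
        return x

    def adjoint_gradient(W, u, target):
        """Gradient of E = 0.5*||x - target||^2 at the fixed point with respect
        to all weights, obtained from a single linear (adjoint) solve."""
        x = fixed_point(W, u)
        dEdx = x - target
        D = np.diag(1.0 - x**2)                              # tanh'(s) at the fixed point
        J = D @ W                                            # Jacobian of the update map
        z = np.linalg.solve((np.eye(len(u)) - J).T, dEdx)    # adjoint variables
        # dE/dW[i, j] = z[i] * tanh'(s[i]) * x[j], for every weight at once.
        return np.outer(z * (1.0 - x**2), x)

    # Tiny usage example with random weights.
    rng = np.random.default_rng(1)
    N = 5
    W = 0.1 * rng.standard_normal((N, N))
    u = rng.standard_normal(N)
    print(adjoint_gradient(W, u, target=np.zeros(N)).shape)  # (5, 5): one entry per weight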


Predicting Weather Using a Genetic Memory: A Combination of Kanerva's Sparse Distributed Memory with Holland's Genetic Algorithms

Neural Information Processing Systems

Kanerva's sparse distributed memory (SDM) is an associative-memory model based on the mathematical properties of high-dimensional binary address spaces. Holland's genetic algorithms are a search technique for high-dimensional spaces inspired by evolutionary processes of DNA. "Genetic Memory" is a hybrid of the above two systems, in which the memory uses a genetic algorithm to dynamically reconfigure its physical storage locations to reflect correlations between the stored addresses and data. For example, when presented with raw weather station data, the Genetic Memory discovers specific features in the weather data which correlate well with upcoming rain, and reconfigures the memory to utilize this information effectively. This architecture is designed to maximize the ability of the system to scale up to handle real-world problems.
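A minimal sketch of the Kanerva-style SDM component is shown below, assuming randomly placed hard locations, counter storage, and activation within a fixed Hamming radius; it omits the genetic-algorithm step that Genetic Memory uses to move those locations, and the class name and parameter values are illustrative.

    import numpy as np

    class SparseDistributedMemory:
        """Plain SDM: binary addresses, counter storage, and activation of
        every hard location within a Hamming radius of the probe address."""
        def __init__(self, n_locations=1000, dim=256, radius=111, seed=0):
            rng = np.random.default_rng(seed)
            self.addresses = rng.integers(0, 2, size=(n_locations, dim))  # hard locations
            self.counters = np.zeros((n_locations, dim), dtype=int)
            self.radius = radius

        def _active(self, address):
            # Locations whose address lies within `radius` Hamming distance.
            return np.sum(self.addresses != address, axis=1) <= self.radius

        def write(self, address, data):
            # Add +1/-1 per data bit to the counters of all active locations.
            self.counters[self._active(address)] += 2 * data - 1

        def read(self, address):
            # Sum counters over active locations and threshold at zero.
            total = self.counters[self._active(address)].sum(axis=0)
            return (total > 0).astype(int)

    # Store and recall a random pattern at its own address (autoassociative use).
    rng = np.random.default_rng(2)
    pattern = rng.integers(0, 2, size=256)
    sdm = SparseDistributedMemory()
    sdm.write(pattern, pattern)
    print(np.mean(sdm.read(pattern) == pattern))  # 1.0 when any location activates

In the hybrid described above, it is these hard-location addresses that the genetic algorithm would reconfigure to reflect correlations between stored addresses and data.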


Using a Translation-Invariant Neural Network to Diagnose Heart Arrhythmia

Neural Information Processing Systems

Distinctive electrocardiogram (ECG) patterns are created when the heart is beating normally and when a dangerous arrhythmia is present. Some devices which monitor the ECG and react to arrhythmias parameterize the ECG signal and make a diagnosis based on the parameters. The author discusses the use of a neural network to classify the ECG signals directly.
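The translation-invariance idea can be sketched with shared-weight (convolutional) feature detectors whose responses are pooled over time, so a waveform yields the same features wherever it occurs in the trace; this is a generic sketch with made-up weights, assuming max pooling and a sigmoid readout, and is not the author's trained network.

    import numpy as np

    def conv1d(signal, kernels):
        """Valid 1D correlation of one signal with a bank of shared-weight kernels."""
        k = kernels.shape[1]
        windows = np.lib.stride_tricks.sliding_window_view(signal, k)  # (T-k+1, k)
        return windows @ kernels.T                                     # (T-k+1, n_kernels)

    def score(signal, kernels, w_out, b_out):
        """Translation-invariant forward pass: local feature detectors,
        max-pooled over time, then a linear readout through a sigmoid."""
        features = np.maximum(conv1d(signal, kernels), 0.0)  # detect local waveforms
        pooled = features.max(axis=0)                        # invariant to where they occur
        return 1.0 / (1.0 + np.exp(-(pooled @ w_out + b_out)))

    # The same waveform shifted in time receives the same score.
    rng = np.random.default_rng(3)
    kernels = rng.standard_normal((8, 20))
    w_out, b_out = rng.standard_normal(8), 0.0
    beat = np.sin(np.linspace(0, np.pi, 40))
    sig_a = np.zeros(400); sig_a[50:90] = beat
    sig_b = np.zeros(400); sig_b[250:290] = beat
    print(score(sig_a, kernels, w_out, b_out), score(sig_b, kernels, w_out, b_out))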


A self-organizing multiple-view representation of 3D objects

Neural Information Processing Systems

We demonstrate the ability of a two-layer network of thresholded summation units to support representation of 3D objects in which several distinct 2D views are stored for each object. Using unsupervised Hebbian relaxation, the network learned to recognize ten objects from different viewpoints. The training process led to the emergence of compact representations of the specific input views. When tested on novel views of the same objects, the network exhibited a substantial generalization capability. In simulated psychophysical experiments, the network's behavior was qualitatively similar to that of human subjects.
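As a loose illustration of how view-specific units can arise from unsupervised Hebbian-style learning, the sketch below applies a simple competitive Hebbian rule to binary input views; it is not the paper's two-layer relaxation network, and all names, sizes, and the winner-take-all step are assumptions made for the example.

    import numpy as np

    def train_view_units(views, n_units=10, epochs=20, lr=0.1, seed=4):
        """Each input view strengthens the weights of the output unit that
        responds to it most strongly (thresholded summation plus competition),
        so units tuned to specific views emerge without supervision."""
        rng = np.random.default_rng(seed)
        W = rng.random((n_units, views.shape[1]))
        W /= np.linalg.norm(W, axis=1, keepdims=True)
        for _ in range(epochs):
            for v in views:
                winner = np.argmax(W @ v)               # most active summation unit
                W[winner] += lr * v                     # Hebbian strengthening
                W[winner] /= np.linalg.norm(W[winner])  # keep weights bounded
        return W

    # Generalization check: a corrupted version of a stored view should still
    # activate the same unit as the clean view.
    rng = np.random.default_rng(5)
    views = (rng.random((3, 50)) > 0.5).astype(float)
    W = train_view_units(views)
    noisy = views[0].copy()
    flip = rng.choice(50, size=5, replace=False)
    noisy[flip] = 1 - noisy[flip]
    print(np.argmax(W @ views[0]) == np.argmax(W @ noisy))  # typically True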