Knowledge-Based Environments for Teaching and Learning

AI Magazine

The Spring Symposium on Knowledge-based Environments for Teaching and Learning focused on the use of technology to facilitate learning, training, teaching, counseling, coaxing and coaching. Sixty participants from academia and industry assessed progress made to date and speculated on new tools for building second generation systems.


Second International Workshop on User Modeling

AI Magazine

The Second International Workshop on User Modeling was held March 30- April 1, 1990 in Honolulu, Hawaii. The general chairperson was Dr. Wolfgang Wahlster of the University of Saarbrucken; the program and local arrangements chairperson was Dr. David Chin of the University of Hawaii at Manoa. The workshop was sponsored by AAAI and the University of Hawaii, with AAAI providing eight travel stipends for students.


Knowledge Discovery in Real Databases: A Report on the IJCAI-89 Workshop

AI Magazine

The growth in the amount of available databases far outstrips the growth of corresponding knowledge. This creates both a need and an opportunity for extracting knowledge from databases. Many recent results have been reported on extracting different kinds of knowledge from databases, including diagnostic rules, drug side effects, classes of stars, rules for expert systems, and rules for semantic query optimization.


Adjoint Operator Algorithms for Faster Learning in Dynamical Neural Networks

Neural Information Processing Systems

A methodology for faster supervised learning in dynamical nonlinear neural networks is presented. It exploits the concept of adjoint operators to enable computation of changes in the network's response due to perturbations in all system parameters, using the solution of a single set of appropriately constructed linear equations. The lower bound on speedup per learning iteration over conventional methods for calculating the neuromorphic energy gradient is O(N²), where N is the number of neurons in the network. 1 INTRODUCTION The biggest promise of artificial neural networks as computational tools lies in the hope that they will enable fast processing and synthesis of complex information patterns. In particular, considerable effort has recently been devoted to the formulation of efficient methodologies for learning (e.g., Rumelhart et al., 1986; Pineda, 1988; Pearlmutter, 1989; Williams and Zipser, 1989; Barhen, Gulati and Zak, 1989). The development of learning algorithms is generally based upon the minimization of a neuromorphic energy function.


Predicting Weather Using a Genetic Memory: A Combination of Kanerva's Sparse Distributed Memory with Holland's Genetic Algorithms

Neural Information Processing Systems

Kanerva's sparse distributed memory (SDM) is an associative-memory model based on the mathematical properties of high-dimensional binary address spaces. Holland's genetic algorithms are a search technique for high-dimensional spaces inspired by evolutionary processes of DNA. "Genetic Memory" is a hybrid of the above two systems, in which the memory uses a genetic algorithm to dynamically reconfigure its physical storage locations to reflect correlations between the stored addresses and data. For example, when presented with raw weather-station data, the Genetic Memory discovers specific features in the weather data which correlate well with upcoming rain, and reconfigures the memory to utilize this information effectively. This architecture is designed to maximize the ability of the system to scale up to handle real-world problems.
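The SDM half of the hybrid is simple to sketch. In the toy version below, the sizes, names, and autoassociative usage are my assumptions; the Genetic Memory's contribution, not shown, is to let a genetic algorithm evolve the rows of `addresses` toward statistically useful regions of the input space:

```python
import numpy as np

rng = np.random.default_rng(1)
N_BITS, N_LOCS, RADIUS = 64, 500, 29   # illustrative sizes, not Kanerva's

addresses = rng.integers(0, 2, (N_LOCS, N_BITS))   # fixed "hard" locations
counters  = np.zeros((N_LOCS, N_BITS), dtype=int)  # one counter per stored bit

def activated(addr):
    # A location participates if it lies within Hamming RADIUS of the cue.
    return np.sum(addresses != addr, axis=1) <= RADIUS

def write(addr, data):
    # Distribute the pattern over all activated locations.
    counters[activated(addr)] += np.where(data == 1, 1, -1)

def read(addr):
    # Pool the activated locations' counters and take a majority vote.
    return (counters[activated(addr)].sum(axis=0) > 0).astype(int)
```

Writing a pattern at its own address and then reading from a cue with a few flipped bits recovers the stored pattern, because the activation spheres of the clean and noisy addresses overlap heavily.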


Using a Translation-Invariant Neural Network to Diagnose Heart Arrhythmia

Neural Information Processing Systems

Distinctive electrocardiogram (ECG) patterns are created when the heart is beating normally and when a dangerous arrhythmia is present. Some devices which monitor the ECG and react to arrhythmias parameterize the ECG signal and make a diagnosis based on the parameters. The author discusses the use of a neural network to classify the ECG signals directly.
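Translation invariance in such a network typically comes from weight sharing: one small kernel is applied at every temporal offset and the responses are pooled. A minimal sketch (the kernel, the max-pooling choice, and the function name are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

def translation_invariant_score(signal, kernel):
    # Slide one shared kernel along the signal (weight sharing), then
    # max-pool over time so the score ignores WHERE the pattern occurs.
    n, k = len(signal), len(kernel)
    responses = [signal[i:i + k] @ kernel for i in range(n - k + 1)]
    return max(responses)
```

Because the same weights scan every position, a waveform deflection yields the same score whether it occurs early or late in the monitoring window.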



Unsupervised Learning in Neurodynamics Using the Phase Velocity Field Approach

Neural Information Processing Systems

A new concept for unsupervised learning based upon examples introduced to the neural network is proposed. Each example is considered as an interpolation node of the velocity field in the phase space. The velocities at these nodes are selected such that all the streamlines converge to an attracting set embedded in the subspace occupied by the cluster of examples. The synaptic interconnections are found from a learning procedure that produces the selected field. The theory is illustrated by examples. This paper is devoted to the development of a new concept for unsupervised learning based upon examples introduced to an artificial neural network.


The Computation of Sound Source Elevation in the Barn Owl

Neural Information Processing Systems

The midbrain of the barn owl contains a map-like representation of sound source direction which is used to precisely orient the head toward targets of interest. Elevation is computed from the interaural difference in sound level. We present models and computer simulations of two stages of level-difference processing which qualitatively agree with known anatomy and physiology, and make several striking predictions.
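The quantity driving that computation, the interaural level difference (ILD), is easy to state concretely. A minimal sketch (the signal model and dB convention are my assumptions; the paper models the neural circuitry, not this arithmetic):

```python
import numpy as np

def interaural_level_difference(left, right):
    # ILD in decibels: positive when the left ear receives more power.
    return 10.0 * np.log10(np.mean(left ** 2) / np.mean(right ** 2))
```

In the barn owl, the vertical asymmetry of the two ear openings makes ILD vary chiefly with elevation, which is what makes a neural map from level difference to elevation possible.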


Learning in Higher-Order "Artificial Dendritic Trees"

Neural Information Processing Systems

The computational territory between the linearly summing McCulloch-Pitts neuron and the nonlinear differential equations of Hodgkin & Huxley is relatively sparsely populated. Connectionists use variants of the former, and computational neuroscientists struggle with the exploding parameter spaces provided by the latter. However, evidence from biophysical simulations suggests that the voltage transfer properties of synapses, spines, and dendritic membranes involve many detailed nonlinear interactions, not just a squashing function at the cell body. Real neurons may indeed be higher-order nets. For the computationally minded, higher-order interactions mean, first of all, quadratic terms.
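A minimal example of what a quadratic term buys: a single second-order ("sigma-pi") unit can compute XOR, which no single linear-threshold unit can. The weights and function names below are illustrative assumptions, not drawn from the paper:

```python
import numpy as np

def quadratic_unit(x, w1, w2, b):
    # Second-order unit: responds to pairwise products x_i * x_j
    # in addition to the usual weighted sum.
    pre = w1 @ x + x @ w2 @ x + b
    return 1.0 / (1.0 + np.exp(-pre))   # logistic squashing at the "soma"

# XOR with ONE unit: pre-activation is x1 + x2 - 2*x1*x2 - 0.5
w1 = np.array([1.0, 1.0])
w2 = np.array([[0.0, -2.0], [0.0, 0.0]])
b = -0.5
```

For inputs (0,0) and (1,1) the pre-activation is -0.5, so the output falls below 0.5; for (0,1) and (1,0) it is +0.5, so the output rises above 0.5.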