Contour-Map Encoding of Shape for Early Vision

Neural Information Processing Systems

Pentti Kanerva, Research Institute for Advanced Computer Science, Mail Stop 230-5, NASA Ames Research Center, Moffett Field, California 94035

Contour maps provide a general method for recognizing two-dimensional shapes. All but blank images give rise to such maps, and people are good at recognizing objects and shapes from them. The maps are encoded easily in long feature vectors that are suitable for recognition by an associative memory. These properties of contour maps suggest a role for them in early visual perception. The prevalence of direction-sensitive neurons in the visual cortex of mammals supports this view.
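The abstract does not spell out the encoding, but the idea of turning contour direction into a long feature vector can be illustrated. A minimal sketch, not Kanerva's actual scheme: local gradient directions, which lie normal to the iso-intensity contour lines, are quantized and histogrammed over small image cells, and the cell histograms are concatenated into one long vector. The cell size and bin count below are arbitrary illustrative choices.

```python
import numpy as np

def contour_direction_features(image, n_bins=8, cell=4):
    """Encode an image's contour structure as one long feature vector.

    Sketch only (not Kanerva's encoding): gradient directions, normal to
    the iso-intensity contours, are quantized into n_bins and histogrammed
    (magnitude-weighted) over cell x cell patches.
    """
    gy, gx = np.gradient(image.astype(float))
    angle = np.arctan2(gy, gx)                  # direction in [-pi, pi]
    magnitude = np.hypot(gx, gy)
    bins = ((angle + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    h, w = image.shape
    features = []
    for i in range(0, h - h % cell, cell):
        for j in range(0, w - w % cell, cell):
            b = bins[i:i + cell, j:j + cell].ravel()
            m = magnitude[i:i + cell, j:j + cell].ravel()
            features.append(np.bincount(b, weights=m, minlength=n_bins))
    return np.concatenate(features)

vec = contour_direction_features(np.outer(np.hanning(16), np.hanning(16)))
print(vec.shape)  # (128,): 16 cells x 8 direction bins
```

A 16x16 image with 4x4 cells yields 16 cells of 8 bins each, i.e. a 128-dimensional vector, long enough in realistic image sizes to serve as input to an associative memory.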


Learning in Higher-Order "Artificial Dendritic Trees"

Neural Information Processing Systems

The computational territory between the linearly summing McCulloch-Pitts neuron and the nonlinear differential equations of Hodgkin & Huxley is relatively sparsely populated. Connectionists use variants of the former, and computational neuroscientists struggle with the exploding parameter spaces provided by the latter. However, evidence from biophysical simulations suggests that the voltage transfer properties of synapses, spines and dendritic membranes involve many detailed nonlinear interactions, not just a squashing function at the cell body. Real neurons may indeed be higher-order nets. For the computationally minded, higher-order interactions mean, first of all, quadratic terms. This contribution presents a simple learning principle for a binary tree with a logistic/quadratic transfer function at each node. These functions, though highly nested, are shown to be capable of changing their shape in concert. The resulting tree structure receives inputs at its leaves and outputs, at the top, an estimate of the probability that the input pattern is a member of one of two classes.
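The nested logistic/quadratic transfer functions can be sketched directly. Assuming, as an illustration rather than the paper's exact parameterization, that each node squashes a quadratic form of its two children's activities (the cross term being the higher-order interaction):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def node(a, b, w):
    """Logistic/quadratic transfer: squash a quadratic form of the two
    child activities. w = (bias, wa, wb, wab); the cross term wab*a*b
    is the quadratic, 'higher-order' interaction."""
    bias, wa, wb, wab = w
    return sigmoid(bias + wa * a + wb * b + wab * a * b)

def tree_output(leaves, weights):
    """Evaluate a complete binary tree bottom-up; the root's output
    estimates the probability of membership in one of two classes."""
    layer = list(leaves)
    wi = iter(weights)
    while len(layer) > 1:
        layer = [node(layer[i], layer[i + 1], next(wi))
                 for i in range(0, len(layer), 2)]
    return layer[0]

# 4 leaves -> 2 internal nodes + 1 root = 3 weight tuples (arbitrary values)
w = [(0.0, 1.0, 1.0, -2.0)] * 3
p = tree_output([1.0, 0.0, 0.0, 1.0], w)
print(0.0 < p < 1.0)  # True: a valid probability estimate
```

Learning would then adjust the weight tuples; the point here is only how the nested functions compose.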


Neural Network Weight Matrix Synthesis Using Optimal Control Techniques

Neural Information Processing Systems

Given a set of input-output training samples, we describe a procedure for determining the time sequence of weights for a dynamic neural network to model an arbitrary input-output process. We formulate the input-output mapping problem as an optimal control problem, defining a performance index to be minimized as a function of time-varying weights. We solve the resulting nonlinear two-point-boundary-value problem, and this yields the training rule. For the performance index chosen, this rule turns out to be a continuous time generalization of the outer product rule earlier suggested heuristically by Hopfield for designing associative memories. Learning curves for the new technique are presented.
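For reference, Hopfield's heuristic outer-product rule, of which the derived training rule is a continuous-time generalization, can be stated in a few lines. This is the standard discrete-time form (the 1/n normalization is one common convention), not the paper's optimal-control derivation:

```python
import numpy as np

def outer_product_weights(patterns):
    """Hopfield's outer-product rule for associative memories:
    W = (1/n) * sum_p x_p x_p^T, with the diagonal zeroed."""
    n = patterns.shape[1]
    W = sum(np.outer(x, x) for x in patterns) / n
    np.fill_diagonal(W, 0.0)
    return W

pats = np.array([[1, -1, 1, -1],
                 [1, 1, -1, -1]], dtype=float)
W = outer_product_weights(pats)
# Stored patterns are fixed points of the sign dynamics:
print(np.array_equal(np.sign(W @ pats[0]), pats[0]))  # True
```

With orthogonal bipolar patterns, as here, each stored pattern is exactly recovered as a fixed point of the recall dynamics.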


A Systematic Study of the Input/Output Properties of a 2 Compartment Model Neuron With Active Membranes

Neural Information Processing Systems

The input/output properties of a 2 compartment model neuron are systematically explored. Taken from the work of MacGregor (MacGregor, 1987), the model neuron's compartments contain several active conductances, including a potassium conductance in the dendritic compartment driven by the accumulation of intradendritic calcium. Dynamics of the conductances and potentials are governed by a set of coupled first order differential equations which are integrated numerically. The model has a set of 17 internal parameters, specifying conductance rate constants, time constants, thresholds, etc. To study parameter sensitivity, a set of trials was run in which the input driving the neuron was kept fixed while each internal parameter was varied with all others left fixed. To study the input/output relation, the input to the dendrite (a square wave) was varied in frequency and magnitude while all internal parameters of the system were left fixed, and the resulting output firing rate and bursting rate were counted. The input/output relation of the model neuron studied turns out to be much more sensitive to modulation of certain dendritic potassium current parameters than to plasticity of synapse efficacy per se (the amount of current influx due to synapse activation). This in turn suggests, as has recently been observed experimentally, that the potassium current may be as important a focus of neural plasticity as synaptic efficacy, or more so.
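The simulation machinery, coupled first-order equations integrated numerically, can be illustrated with a deliberately stripped-down model. The sketch below is not MacGregor's 17-parameter model: it has only passive leak and coupling conductances plus a hard somatic spike threshold, and every parameter value is an arbitrary illustrative choice.

```python
def simulate(i_dend, dt=0.1, steps=2000,
             g_leak=0.1, g_couple=0.05, c_m=1.0, v_rest=0.0,
             v_thresh=3.0, v_reset=0.0):
    """Minimal two-compartment sketch (NOT MacGregor's model): a passive
    soma and dendrite coupled by g_couple, current injected into the
    dendrite, and a hard spike threshold at the soma. Returns the spike
    count over the simulated interval."""
    v_s = v_d = v_rest
    spikes = 0
    for _ in range(steps):
        # Forward-Euler step of the coupled first-order equations.
        dv_s = (-g_leak * (v_s - v_rest) + g_couple * (v_d - v_s)) / c_m
        dv_d = (-g_leak * (v_d - v_rest) + g_couple * (v_s - v_d) + i_dend) / c_m
        v_s += dt * dv_s
        v_d += dt * dv_d
        if v_s >= v_thresh:
            spikes += 1
            v_s = v_reset
    return spikes

print(simulate(2.0) > simulate(0.5))  # True: stronger dendritic drive, more spikes
```

Sweeping `i_dend` while holding the other arguments fixed mirrors, in miniature, the paper's input/output protocol; sweeping one conductance parameter while fixing the input mirrors its sensitivity protocol.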



Neuronal Group Selection Theory: A Grounding in Robotics

Neural Information Processing Systems

In this paper, we discuss a current attempt at applying the organizational principle Edelman calls Neuronal Group Selection to the control of a real, two-link robotic manipulator. We begin by motivating the need for an alternative to the position-control paradigm of classical robotics, and suggest that a possible avenue is to look at the primitive animal limb's 'neurologically ballistic' control mode. We have been considering a selectionist approach to coordinating a simple perception-action task. 1 MOTIVATION The majority of industrial robots in the world are mechanical manipulators - often armlike devices consisting of some number of rigid links, with actuators mounted where the links join that move adjacent links relative to each other, rotationally or translationally. At the joints there are typically also sensors measuring the relative position of adjacent links, and it is in terms of position that manipulators are generally controlled (a desired motion is specified as a desired position of the end effector, from which the necessary positions of the links comprising the manipulator can be derived). Position control dominates largely for historical reasons, rooted in bang-bang control: manipulators bumped between mechanical stops placed so as to enforce a desired trajectory for the end effector.


A Neural Network for Real-Time Signal Processing

Neural Information Processing Systems

This paper describes a neural network algorithm that (1) performs temporal pattern matching in real-time, (2) is trained online, with a single pass, (3) requires only a single template for training of each representative class, (4) is continuously adaptable to changes in background noise, (5) deals with transient signals having low signal-to-noise ratios, (6) works in the presence of non-Gaussian noise, (7) makes use of context dependencies and (8) outputs Bayesian probability estimates. The algorithm has been adapted to the problem of passive sonar signal detection and classification. It runs on a Connection Machine and correctly classifies, within 500 ms of onset, signals embedded in noise and subject to considerable uncertainty. 1 INTRODUCTION This paper describes a neural network algorithm, STOCHASM, that was developed for the purpose of real-time signal detection and classification. Of prime concern was capability for dealing with transient signals having low signal-to-noise ratios (SNR). The algorithm was first developed in 1986 for real-time fault detection and diagnosis of malfunctions in ship gas turbine propulsion systems (Malkoff, 1987). It was subsequently adapted for passive sonar signal detection and classification. Recently, versions for information fusion and radar classification have been developed.
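The combination of items (3) and (8), single-template classes with Bayesian probability outputs, can be illustrated with a toy posterior computation. This is not the STOCHASM algorithm itself; it only shows Gaussian-likelihood templates normalized into class probabilities:

```python
import math

def posterior(x, templates, sigma=1.0, prior=None):
    """Toy Bayesian classifier (illustrative, not STOCHASM): each class is
    summarized by a single template vector; likelihoods are Gaussian in
    the distance to each template, then normalized into posteriors."""
    classes = list(templates)
    if prior is None:
        prior = {c: 1.0 / len(classes) for c in classes}
    joint = {c: prior[c] * math.exp(
                 -sum((a - b) ** 2 for a, b in zip(x, templates[c]))
                 / (2 * sigma ** 2))
             for c in classes}
    z = sum(joint.values())
    return {c: joint[c] / z for c in classes}

p = posterior([1.0, 0.1], {"signal": [1.0, 0.0], "noise": [0.0, 0.0]})
print(p["signal"] > p["noise"])  # True
```

The posteriors sum to one by construction, which is what makes them usable as the probability estimates of item (8).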


The Computation of Sound Source Elevation in the Barn Owl

Neural Information Processing Systems

The midbrain of the barn owl contains a map-like representation of sound source direction which is used to precisely orient the head toward targets of interest. Elevation is computed from the interaural difference in sound level. We present models and computer simulations of two stages of level difference processing which qualitatively agree with known anatomy and physiology, and make several striking predictions. 1 INTRODUCTION
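A toy readout of such a map-like representation can be sketched. Everything below is illustrative rather than drawn from the paper: an array of units tuned to different interaural level differences (ILDs), with elevation read out from the most active unit via a hypothetical dB-to-degrees scaling.

```python
import math

def elevation_from_ild(level_left_db, level_right_db, sigma=3.0):
    """Toy map readout (illustrative only): Gaussian-tuned units prefer
    ILDs from -20 to +20 dB; the elevation estimate is the preferred ILD
    of the most active unit, times a hypothetical 2 deg/dB scaling."""
    tuned_ilds = [i - 20 for i in range(41)]          # -20..+20 dB
    ild = level_right_db - level_left_db
    responses = [math.exp(-((ild - t) ** 2) / (2 * sigma ** 2))
                 for t in tuned_ilds]
    best = tuned_ilds[responses.index(max(responses))]
    return best * 2.0                                  # hypothetical dB -> degrees

print(elevation_from_ild(60.0, 65.0))  # 10.0: right ear louder -> positive elevation
```

The real computation involves stages of level-difference processing constrained by owl anatomy; this sketch only conveys the map-plus-peak-readout idea.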


An Analog VLSI Model of Adaptation in the Vestibulo-Ocular Reflex

Neural Information Processing Systems

The vestibulo-ocular reflex (VOR) is the primary mechanism that controls the compensatory eye movements that stabilize retinal images during rapid head motion. The primary pathways of this system are feed-forward, with inputs from the semicircular canals and outputs to the oculomotor system. Since visual feedback is not used directly in the VOR computation, the system must exploit motor learning to perform correctly. Lisberger (1988) has proposed a model for adapting the VOR gain using image-slip information from the retina. We have designed and tested analog very large-scale integrated (VLSI) circuitry that implements a simplified version of Lisberger's adaptive VOR model.
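The core of slip-driven gain adaptation can be sketched in a few lines. This is a simplified discrete-time caricature, not the chip's circuitry: the eye command is -g times head velocity, residual image slip is the uncompensated motion, and g is nudged by the correlation of slip with head velocity.

```python
def adapt_vor_gain(head_velocities, g=0.5, eta=0.01):
    """Sketch of retinal-slip-driven VOR gain adaptation (illustrative):
    with no direct visual feedback in the reflex itself, the residual
    image slip serves as the error signal for motor learning."""
    for h in head_velocities:
        eye = -g * h
        slip = h + eye            # image slip = uncompensated head motion
        g += eta * slip * h       # LMS-style update from slip correlation
    return g

head = [1.0, -2.0, 1.5, -1.0] * 200   # arbitrary head-velocity sequence
g = adapt_vor_gain(head)
print(abs(g - 1.0) < 0.01)  # True: gain converges toward unity
```

Since slip equals (1 - g) times head velocity, the update drives g to 1, at which point the reflex exactly cancels head motion, the behavior the adaptive circuitry is meant to capture.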


Performance Comparisons Between Backpropagation Networks and Classification Trees on Three Real-World Applications

Neural Information Processing Systems

In this paper we compare regression and classification systems. A regression system can generate an output f for an input X, where both X and f are continuous and, perhaps, multidimensional. A classification system can generate an output class, C, for an input X, where X is continuous and multidimensional and C is a member of a finite alphabet. The statistical technique of Classification And Regression Trees (CART) was developed during the years 1973 (Meisel and Michalopoulos) through 1984 (Breiman et al.).
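The machinery shared by both kinds of tree, an exhaustive search for the impurity-minimizing split, can be sketched for the one-dimensional regression case. This is a simplified illustration of the split step only, not the full CART procedure:

```python
def _sse(v):
    """Sum of squared errors around the mean: CART's regression impurity."""
    m = sum(v) / len(v)
    return sum((y - m) ** 2 for y in v)

def best_split(xs, ys):
    """Exhaustive 1-D split search (sketch of one CART step): choose the
    threshold minimizing the summed squared error of the two halves.
    A classification tree would swap _sse for a class-impurity measure."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    xs = [xs[i] for i in order]
    ys = [ys[i] for i in order]
    best_cost, best_t = float("inf"), None
    for k in range(1, len(xs)):
        cost = _sse(ys[:k]) + _sse(ys[k:])
        if cost < best_cost:
            best_cost, best_t = cost, (xs[k - 1] + xs[k]) / 2
    return best_t

print(best_split([0.0, 1.0, 2.0, 3.0], [0.1, 0.2, 2.0, 2.2]))  # 1.5
```

Growing a full tree recursively applies this search to each resulting half, which is exactly where the regression and classification variants share their structure.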