

A Model of Distributed Sensorimotor Control in the Cockroach Escape Turn

Neural Information Processing Systems

In response to a puff of wind, the American cockroach turns away and runs. The circuit underlying the initial turn of this escape response consists of three populations of individually identifiable nerve cells and appears to employ distributed representations in its operation. We have reconstructed several neuronal and behavioral properties of this system using simplified neural network models and the backpropagation learning algorithm, constrained by known structural characteristics of the circuitry. In order to test and refine the model, we have also compared the model's responses to various lesions with the insect's responses to similar lesions.


How Receptive Field Parameters Affect Neural Learning

Neural Information Processing Systems

Omohundro, ICSI, 1947 Center St., Suite 600, Berkeley, CA 94704

We identify the three principal factors affecting the performance of learning by networks with localized units: unit noise, sample density, and the structure of the target function. We then analyze the effect of unit receptive field parameters on these factors and use this analysis to propose a new learning algorithm which dynamically alters receptive field properties during learning.
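As a rough illustration (not code from the paper), the response of a localized unit is commonly modeled as a Gaussian receptive field; its center and width are exactly the kind of receptive field parameters such an algorithm would adapt during learning:

```python
import numpy as np

def rbf_response(x, center, width):
    """Response of a localized (Gaussian) unit. `center` and `width`
    are the receptive field parameters a learner might adapt."""
    return np.exp(-np.sum((x - center) ** 2) / (2.0 * width ** 2))

x = np.array([0.5, 0.5])
print(rbf_response(x, x, 0.3))          # maximal (1.0) at the center
print(rbf_response(x, x + 1.0, 0.3))    # near zero far from the center
```

Widening `width` trades locality for coverage: a broad unit averages over more samples (less sample-density starvation, more smoothing of target structure), which is the tension the abstract's three factors describe.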


Analog Computation at a Critical Point: A Novel Function for Neuronal Oscillations?

Neural Information Processing Systems

Static correlations among spike trains obtained from simulations of large arrays of cells are in agreement with the predictions from these Hamiltonians, and dynamic correlations display



EMPATH: Face, Emotion, and Gender Recognition Using Holons

Neural Information Processing Systems

The network is trained to simply reproduce its input, and so can be seen as a nonlinear version of Kohonen's (1977) auto-associator. However, it must do this through a narrow channel of hidden units, so it must extract regularities from the input during learning. Empirical analysis of the trained network showed that the hidden units span the principal subspace of the image vectors, with some noise on the first principal component due to network nonlinearity (Cottrell & Munro, 1988).
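A minimal numerical sketch of the key claim (the linear analogue, not the paper's trained network): the optimal linear bottleneck spans the principal subspace of the inputs, so a narrow channel can exactly reproduce inputs that lie in a low-dimensional subspace. Here the "hidden units" are simply the top right singular vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
basis = rng.normal(size=(2, 8))
data = rng.normal(size=(20, 2)) @ basis   # 8-D samples lying in a 2-D subspace

# The optimal linear bottleneck is the principal subspace (PCA via SVD):
U, S, Vt = np.linalg.svd(data, full_matrices=False)
encode = Vt[:2].T      # 8 inputs -> 2 "hidden units"
decode = Vt[:2]        # 2 hidden units -> 8 outputs
recon = data @ encode @ decode
print(np.max(np.abs(recon - data)))   # ~0: a 2-unit channel suffices
```

With a nonlinear auto-associator trained by backpropagation, the abstract's point is that the hidden representation still spans approximately this subspace, up to noise introduced by the nonlinearity.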


Back Propagation Implementation on the Adaptive Solutions CNAPS Neurocomputer Chip

Neural Information Processing Systems

An 8-chip configuration can train 2.3 billion connections per second and evaluate 9.6 billion backpropagation feed-forward connections per second.


Real-time autonomous robot navigation using VLSI neural networks

Neural Information Processing Systems

There have been very few demonstrations of the application of VLSI neural networks to real world problems. Yet there are many signal processing, pattern recognition or optimization problems where a large number of competing hypotheses need to be explored in parallel, most often in real time. The massive parallelism of VLSI neural network devices, with one multiplier circuit per synapse, is ideally suited to such problems. In this paper, we present preliminary results from our design for a real time robot navigation system based on VLSI neural network modules. This is a real world problem which has not been fully solved by traditional AI methods; even when partial solutions have been proposed and implemented, these have required vast computational resources, usually remote from the robot and linked to it via an umbilical cord. (Also: RSRE, Great Malvern, Worcester, WR14 3PS.)

2 OVERVIEW

The aim of our work is to develop an autonomous vehicle capable of real-time navigation, including obstacle avoidance, in a known indoor environment.


Integrated Segmentation and Recognition of Hand-Printed Numerals

Neural Information Processing Systems

Neural network algorithms have proven useful for recognition of individual, segmented characters. However, their recognition accuracy has been limited by the accuracy of the underlying segmentation algorithm. Conventional, rule-based segmentation algorithms encounter difficulty if the characters are touching, broken, or noisy. The problem in these situations is that often one cannot properly segment a character until it is recognized, yet one cannot properly recognize a character until it is segmented. We present here a neural network algorithm that simultaneously segments and recognizes in an integrated system. This algorithm has several novel features: it uses a supervised learning algorithm (backpropagation), but is able to take position-independent information as targets and self-organize the activities of the units in a competitive fashion to infer the positional information. We demonstrate this ability with overlapping hand-printed numerals.


Convergence of a Neural Network Classifier

Neural Information Processing Systems

In this paper, we prove that the vectors in the LVQ learning algorithm converge. We do this by showing that the learning algorithm performs stochastic approximation. Convergence is then obtained by identifying the appropriate conditions on the learning rate and on the underlying statistics of the classification problem. We also present a modification to the learning algorithm which we argue results in convergence of the LVQ error to the Bayesian optimal error as the appropriate parameters become large.
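For reference, a minimal sketch of the kind of LVQ update being analyzed (a generic LVQ1 formulation, not the paper's notation): the nearest reference vector is attracted toward a sample of its own class and repelled from one of another class, with a decaying learning rate of the sort stochastic approximation arguments require:

```python
import numpy as np

def lvq1_step(codebook, labels, x, y, lr):
    """One LVQ1 update: attract the winning reference vector if its
    class matches the sample's label, otherwise repel it."""
    winner = np.argmin(np.linalg.norm(codebook - x, axis=1))
    sign = 1.0 if labels[winner] == y else -1.0
    codebook[winner] += sign * lr * (x - codebook[winner])
    return codebook

rng = np.random.default_rng(1)
codebook = np.array([[0.0, 0.0], [1.0, 1.0]])   # one reference vector per class
labels = np.array([0, 1])
# Two well-separated Gaussian classes; lr_t = 1/(1+t) satisfies the usual
# stochastic-approximation step-size conditions (sum diverges, squares converge).
for t in range(500):
    y = int(rng.integers(2))
    x = rng.normal(loc=y * 2.0, scale=0.3, size=2)
    lvq1_step(codebook, labels, x, y, lr=1.0 / (1 + t))

print(codebook)  # reference vectors settle near the class means
```

The decaying learning rate is the crux: with a constant rate the vectors keep fluctuating, whereas the conditions identified in the paper make the updates a convergent stochastic approximation.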


Statistical Mechanics of Temporal Association in Neural Networks

Neural Information Processing Systems

Basic computational functions of associative neural structures may be analytically studied within the framework of attractor neural networks where static patterns are stored as stable fixed-points for the system's dynamics. If the interactions between single neurons are instantaneous and mediated by symmetric couplings, there is a Lyapunov function for the retrieval dynamics (Hopfield 1982). The global computation corresponds in that case to a downhill motion in an energy landscape created by the stored information. Methods of equilibrium statistical mechanics may be applied and permit a quantitative analysis of the asymptotic network behavior (Amit et al. 1985, 1987). The existence of a Lyapunov function is thus of great conceptual as well as technical importance. Nevertheless, one should be aware that environmental inputs to a neural net always provide information in both space and time. It is therefore desirable to extend the original Hopfield scheme and to explore possibilities for a joint representation of static patterns and temporal associations.
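For concreteness, the Lyapunov function referred to is the standard Hopfield energy for symmetric couplings \(w_{ij} = w_{ji}\) with Hebbian storage of \(p\) patterns \(\xi^\mu \in \{-1,+1\}^N\):

```latex
E(\mathbf{s}) = -\frac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j,
\qquad
w_{ij} = \frac{1}{N}\sum_{\mu=1}^{p} \xi_i^{\mu}\, \xi_j^{\mu}.
```

Under asynchronous updates \(s_i \leftarrow \operatorname{sgn}\bigl(\sum_j w_{ij} s_j\bigr)\), each flip can only decrease (or leave unchanged) \(E\), which is the "downhill motion in an energy landscape" mentioned above; asymmetric or delayed couplings, as needed for temporal associations, break this monotonicity.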