Neural Network Recognizer for Hand-Written Zip Code Digits

Neural Information Processing Systems

This paper describes the construction of a system that recognizes hand-printed digits, using a combination of classical techniques and neural-net methods. The system has been trained and tested on real-world data, derived from zip codes seen on actual U.S. Mail. The system rejects a small percentage of the examples as unclassifiable, and achieves a very low error rate on the remaining examples. The system compares favorably with other state-of-the-art recognizers. While some of the methods are specific to this task, it is hoped that many of the techniques will be applicable to a wide range of recognition tasks.
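
The abstract does not say how rejection is decided, but a common bookkeeping for this kind of recognizer is to reject any input whose top-class confidence falls below a threshold and then report the error rate only on the accepted examples. The sketch below is a generic illustration of that trade-off, assuming a hypothetical scores array of per-class confidences; it is not the paper's classifier.

import numpy as np

def evaluate_with_rejection(scores, labels, threshold=0.9):
    """Reject low-confidence examples; report error on the rest.

    scores: (n_examples, 10) array of per-class confidences (hypothetical).
    labels: (n_examples,) array of true digit labels.
    """
    confidence = scores.max(axis=1)
    predictions = scores.argmax(axis=1)
    accepted = confidence >= threshold           # examples the system classifies
    rejection_rate = 1.0 - accepted.mean()       # fraction deemed unclassifiable
    errors = predictions[accepted] != labels[accepted]
    error_rate = errors.mean() if accepted.any() else 0.0
    return rejection_rate, error_rate

# Toy usage with random scores, standing in for a trained recognizer's outputs.
rng = np.random.default_rng(0)
scores = rng.dirichlet(np.ones(10), size=1000)
labels = rng.integers(0, 10, size=1000)
print(evaluate_with_rejection(scores, labels, threshold=0.5))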


Fixed Point Analysis for Recurrent Networks

Neural Information Processing Systems

This paper provides a systematic analysis of the recurrent backpropagation (RBP) algorithm, introducing a number of new results. The main limitation of the RBP algorithm is that it assumes the convergence of the network to a stable fixed point in order to backpropagate the error signals. We show by experiment and eigenvalue analysis that this condition can be violated and that chaotic behavior can be avoided. Next we examine the advantages of RBP over the standard backpropagation algorithm. RBP is shown to build stable fixed points corresponding to the input patterns. This makes it an appropriate tool for content addressable memories, one-to-many function learning, and inverse problems.
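
The stability condition the paper analyzes can be checked numerically: a fixed point of a discrete-time recurrent update is locally stable when every eigenvalue of the Jacobian at that point lies inside the unit circle. The sketch below is a simplified stand-in for the paper's analysis, not its algorithm: it iterates a small recurrent net x <- tanh(W x + b) to a candidate fixed point and inspects the spectral radius of the Jacobian there.

import numpy as np

def find_fixed_point(W, b, x0, steps=2000):
    """Iterate x <- tanh(W x + b); returns the final state (candidate fixed point)."""
    x = x0
    for _ in range(steps):
        x = np.tanh(W @ x + b)
    return x

def spectral_radius_at(W, b, x_star):
    """Largest |eigenvalue| of the Jacobian of the update at x_star."""
    pre = W @ x_star + b
    J = np.diag(1.0 - np.tanh(pre) ** 2) @ W   # chain rule: d tanh(Wx+b)/dx
    return np.max(np.abs(np.linalg.eigvals(J)))

rng = np.random.default_rng(1)
n = 5
W = 0.1 * rng.standard_normal((n, n))   # small weights -> contraction, stable regime
b = rng.standard_normal(n)
x_star = find_fixed_point(W, b, np.zeros(n))
rho = spectral_radius_at(W, b, x_star)
print("spectral radius:", rho, "->", "stable" if rho < 1 else "unstable")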


A Massively Parallel Self-Tuning Context-Free Parser

Neural Information Processing Systems

ABSTRACT The Parsing and Learning System (PALS) is a massively parallel self-tuning context-free parser. It is capable of parsing sentences of unbounded length, mainly due to its parse-tree representation scheme. The system improves its parsing performance through the presentation of training examples.

INTRODUCTION Recent PDP research [Rumelhart et al., 1986; Feldman and Ballard, 1982; Lippmann, 1987] involving natural language processing [Fanty, 1988; Selman, 1985; Waltz and Pollack, 1985] has unrealistically restricted sentences to a fixed length. A solution to this problem was presented in the system CONPARSE [Charniak and Santos].


Self Organizing Neural Networks for the Identification Problem

Neural Information Processing Systems

This work introduces a new method, the Self Organizing Neural Network (SONN) algorithm, and demonstrates its use in a system identification task. The algorithm constructs the network, chooses the neuron functions, and adjusts the weights. It is compared to the Back-Propagation algorithm in the identification of a chaotic time series. The results show that SONN constructs a simpler, more accurate model.
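
The abstract specifies neither the chaotic series nor the SONN construction rules, but the identification setup can be illustrated as a generic one-step-ahead regression problem. The sketch below uses the logistic map purely as a stand-in chaotic series and builds a lagged-input design matrix; the SONN and back-propagation models being compared would be fit where indicated.

import numpy as np

def logistic_map(n, r=3.9, x0=0.5):
    """Generate a chaotic series x[t+1] = r * x[t] * (1 - x[t])."""
    x = np.empty(n)
    x[0] = x0
    for t in range(n - 1):
        x[t + 1] = r * x[t] * (1.0 - x[t])
    return x

def lagged_dataset(series, lags=3):
    """Inputs are the previous `lags` values; the target is the next value."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    return X, y

series = logistic_map(1000)
X, y = lagged_dataset(series)
X_train, X_test = X[:800], X[800:]
y_train, y_test = y[:800], y[800:]
# A SONN or back-propagation model would be fit on (X_train, y_train) here
# and compared by mean-squared error over (X_test, y_test).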


An Analog VLSI Chip for Thin-Plate Surface Interpolation

Neural Information Processing Systems

Reconstructing a surface from sparse sensory data is a well-known problem in computer vision. This paper describes an experimental analog VLSI chip for smooth surface interpolation from sparse depth data. An eight-node 1D network was designed in 3 μm CMOS and successfully tested.
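
The chip minimizes a thin-plate (second-derivative smoothness) energy over an eight-node 1D grid while pulling the surface toward the sparse depth measurements. A software counterpart of that energy, shown below as a rough sketch rather than a model of the circuit, forms the discrete cost sum((second differences)^2) + lam * sum((f_i - d_i)^2 at measured nodes) and solves the resulting linear system.

import numpy as np

def thin_plate_1d(depths, mask, n_nodes=8, lam=10.0):
    """Interpolate a 1D surface from sparse depth data.

    depths: length-n_nodes array; entries where mask is False are ignored.
    mask:   boolean array marking nodes with a depth measurement.
    Minimizes squared second differences (smoothness) plus
    lam * squared error at the measured nodes (data fit).
    """
    # Second-difference operator: rows implement f[i-1] - 2 f[i] + f[i+1].
    D2 = np.zeros((n_nodes - 2, n_nodes))
    for i in range(n_nodes - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    M = np.diag(mask.astype(float))
    # Normal equations of the quadratic energy.
    A = D2.T @ D2 + lam * M
    b = lam * M @ np.where(mask, depths, 0.0)
    return np.linalg.solve(A, b)

# Example: three sparse depth samples on an eight-node line.
mask = np.array([True, False, False, True, False, False, False, True])
depths = np.array([0.0, 0, 0, 1.0, 0, 0, 0, 0.5])
print(thin_plate_1d(depths, mask))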


Programmable Analog Pulse-Firing Neural Networks

Neural Information Processing Systems

ABSTRACT We describe pulse-stream firing integrated circuits that implement asynchronous analog neural networks. Synaptic weights are stored dynamically, and weighting uses time-division of the neural pulses from a signalling neuron to a receiving neuron. MOS transistors in their "ON" state act as variable resistors to control a capacitive discharge, and time-division is thus achieved by a small synapse circuit cell. The VLSI chip set design uses a 2.5 μm CMOS process.

INTRODUCTION Neural network implementations fall into two broad classes, digital [1, 2] and analog. The strengths of a digital approach include the ability to use well-proven design techniques, high noise immunity, and the ability to implement programmable networks.
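
The weighting mechanism described here, time-division of the presynaptic pulse stream so that only a fraction proportional to the stored weight reaches the receiving neuron, can be sketched at the behavioural level. The simulation below is an illustrative abstraction rather than a circuit model: pulses arrive at a fixed rate, each pulse is gated for a duration set by the weight, and the receiving neuron integrates the gated charge with a leak. All parameter values are placeholders.

import numpy as np

def pulse_stream_synapse(pulse_times, weight, t_end, dt=1e-5,
                         pulse_width=1e-4, tau=5e-3):
    """Leaky integration of a weight-gated pulse stream.

    weight in [0, 1] sets the fraction of each pulse width passed on
    (the time-division performed by the synapse cell).
    """
    t = np.arange(0.0, t_end, dt)
    gated = np.zeros_like(t)
    for tp in pulse_times:
        # Only the first `weight * pulse_width` of each pulse is transmitted.
        on = (t >= tp) & (t < tp + weight * pulse_width)
        gated[on] = 1.0
    activity = np.zeros_like(t)
    for i in range(1, len(t)):
        # Leaky integrator: postsynaptic activity decays with time constant tau.
        activity[i] = activity[i - 1] + dt * (gated[i] - activity[i - 1] / tau)
    return t, activity

# A 1 kHz presynaptic pulse stream weighted at 0.3.
pulses = np.arange(0.0, 0.05, 1e-3)
t, act = pulse_stream_synapse(pulses, weight=0.3, t_end=0.05)
print(act[-1])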



Constraints on Adaptive Networks for Modeling Human Generalization

Neural Information Processing Systems

ABSTRACT The potential of adaptive networks to learn categorization rules and to model human performance is studied by comparing how natural and artificial systems respond to new inputs, i.e., how they generalize. Like humans, networks can learn a deterministic categorization task by a variety of alternative individual solutions. An analysis of the constraints imposed by using networks with the minimal number of hidden units shows that this "minimal configuration" constraint is not sufficient. A further analysis of human and network generalizations indicates that initial conditions may provide important constraints on generalization. A new technique, which we call "reversed learning", is described for finding appropriate initial conditions.

INTRODUCTION We are investigating the potential of adaptive networks to learn categorization tasks and to model human performance.
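
One observation implied by this abstract, that minimally configured networks trained from different initial conditions can reach different solutions and therefore generalize differently, can be sketched generically. This is not the authors' procedure and not "reversed learning"; it is a hedged illustration in which several networks with the same minimal hidden layer are trained on the same task from different random seeds and compared on novel probe inputs.

import numpy as np
from sklearn.neural_network import MLPClassifier

# A small deterministic categorization task (XOR-like) and some novel probes.
X_train = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_train = np.array([0, 1, 1, 0])
X_novel = np.array([[0.5, 0.5], [0.2, 0.9], [0.9, 0.2], [0.7, 0.7]])

# Same minimal configuration, different initial conditions (random seeds).
generalizations = []
for seed in range(10):
    net = MLPClassifier(hidden_layer_sizes=(2,), activation='tanh',
                        solver='lbfgs', max_iter=5000, random_state=seed)
    net.fit(X_train, y_train)
    generalizations.append(net.predict(X_novel))

# Networks with identical architecture can still disagree on novel inputs,
# which is why initial conditions matter as a constraint on generalization.
print(np.vstack(generalizations))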


Heterogeneous Neural Networks for Adaptive Behavior in Dynamic Environments

Neural Information Processing Systems

This heterogeneity is crucial to the flexible generation of behavior which is essential for survival in a complex, dynamic environment. It may also provide powerful insights into the design of artificial neural networks. In this paper, we describe a heterogeneous neural network for controlling the walking of a simulated insect. This controller is inspired by the neuroethological and neurobiological literature on insect locomotion. It exhibits a variety of statically stable gaits at different speeds simply by varying the tonic activity of a single cell. It can also adapt to perturbations as a natural consequence of its design.

INTRODUCTION Even very simple animals exhibit a dazzling variety of complex behaviors which they continuously adapt to the changing circumstances of their environment. Nervous systems evolved in order to generate appropriate behavior in dynamic, uncertain situations and thus ensure the survival of the organisms containing them.
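
The central idea, a continuum of statically stable gaits controlled by the tonic activity of a single command cell, can be caricatured with a kinematic sketch rather than the paper's neural circuit. In the loose illustration below, a higher command level raises stepping frequency and shortens the stance fraction of each leg's cycle, shifting the support pattern from a slow wave-like gait toward a tripod-like one. All numbers and phase offsets are assumptions for illustration only.

import numpy as np

def gait_pattern(command, n_steps=60):
    """Stance/swing pattern for six legs under a single tonic command input."""
    freq = 0.5 + 2.0 * command            # stepping frequency rises with command
    duty = 0.85 - 0.35 * command          # fraction of the cycle spent in stance
    # Fixed phase offsets: back-to-front waves on each side, sides antiphased.
    offsets = np.array([0.0, 1/3, 2/3, 0.5, 0.5 + 1/3, 0.5 + 2/3])
    t = np.linspace(0.0, 2.0, n_steps)
    phases = (freq * t[:, None] + offsets[None, :]) % 1.0
    return phases < duty                   # True = leg on the ground (stance)

for c in (0.1, 0.9):                       # low vs high tonic command
    stance = gait_pattern(c)
    print(f"command={c}: mean legs in stance = {stance.sum(axis=1).mean():.2f}")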


Computer Modeling of Associative Learning

Neural Information Processing Systems

The output of the model of the four-neuron network displays changes in the temporal variation of membrane potential similar to those observed in electrophysiological measurements.
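
The abstract refers to the temporal variation of membrane potential in a four-neuron network; a generic leaky-integrator version of such a simulation is sketched below. The connectivity, parameters, and stimulus are placeholders and not the paper's model: each neuron's potential decays toward rest and is driven by weighted, thresholded activity of the other neurons plus an external input.

import numpy as np

def simulate_membrane_potentials(W, stimulus, t_end=0.5, dt=1e-3,
                                 tau=0.02, v_rest=-65.0, threshold=-50.0):
    """Leaky-integrator dynamics for a small network (placeholder model).

    dV_i/dt = (v_rest - V_i)/tau + sum_j W[i, j] * f(V_j) + stimulus_i(t),
    with f a simple threshold nonlinearity on the presynaptic potentials.
    """
    n = W.shape[0]
    steps = int(t_end / dt)
    V = np.full((steps, n), v_rest)
    for k in range(1, steps):
        rate = np.clip(V[k - 1] - threshold, 0.0, None)   # thresholded activity
        dV = (v_rest - V[k - 1]) / tau + W @ rate + stimulus(k * dt)
        V[k] = V[k - 1] + dt * dV
    return V

# Four neurons in a chain, arbitrary weights, a brief depolarizing pulse to neuron 0.
W = np.array([[0.0, 5.0, 0.0, 0.0],
              [5.0, 0.0, 5.0, 0.0],
              [0.0, 5.0, 0.0, 5.0],
              [0.0, 0.0, 5.0, 0.0]])
stimulus = lambda t: np.array([1000.0, 0.0, 0.0, 0.0]) if 0.1 < t < 0.15 else np.zeros(4)
V = simulate_membrane_potentials(W, stimulus)
print(V[::100])  # coarse trace of the four membrane potentials over time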