Technology
A Massively Parallel Self-Tuning Context-Free Parser
ABSTRACT The Parsing and Learning System (PALS) is a massively parallel, self-tuning context-free parser. It is capable of parsing sentences of unbounded length, mainly due to its parse-tree representation scheme. The system is capable of improving its parsing performance through the presentation of training examples. INTRODUCTION Recent PDP research [Rumelhart et al., 1986; Feldman and Ballard, 1982; Lippmann, 1987] involving natural language processing [Fanty, 1988; Selman, 1985; Waltz and Pollack, 1985] has unrealistically restricted sentences to a fixed length. A solution to this problem was presented in the system CONPARSE [Charniak and Santos].
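The abstract does not spell out the PALS parsing mechanism itself. As a concrete point of reference, the sketch below is a minimal serial CKY recognizer for a context-free grammar in Chomsky normal form; the toy grammar and sentence are invented for illustration, and this is not the PALS algorithm or its parallel parse-tree representation.

```python
# Minimal CKY recognizer for a context-free grammar in Chomsky normal form.
# Illustrative only: the toy grammar and sentence are invented examples,
# not the PALS system described in the paper.

from itertools import product

# Binary rules: (right-hand nonterminal pair) -> set of left-hand sides.
GRAMMAR = {
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
LEXICON = {
    "the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "chased": {"V"},
}

def cky_recognize(words):
    n = len(words)
    # chart[i][j] holds the nonterminals that span words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEXICON.get(w, set()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for b, c in product(chart[i][k], chart[k][j]):
                    chart[i][j] |= GRAMMAR.get((b, c), set())
    return "S" in chart[0][n]

print(cky_recognize("the dog chased the cat".split()))  # True
```

The chart-filling loop here is inherently serial; the paper's contribution is a massively parallel alternative that also tunes itself from training examples.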
Self Organizing Neural Networks for the Identification Problem
Tenorio, Manoel Fernando, Lee, Wei-Tsih
This work introduces a new method, the Self-Organizing Neural Network (SONN) algorithm, and demonstrates its use on a system identification task. The algorithm constructs the network, chooses the neuron functions, and adjusts the weights. It is compared to the Back-Propagation algorithm on the identification of a chaotic time series. The results show that SONN constructs a simpler, more accurate model.
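SONN's defining feature, per the abstract, is that the algorithm itself chooses the network structure and node functions rather than only adjusting weights. The fragment below is a loose constructive-fitting sketch in that spirit, using an invented logistic-map series, an assumed candidate-function set, and a greedy least-squares selection rule; it is not the SONN algorithm from the paper.

```python
# A loose sketch of a constructive ("grow the network") fit to a chaotic
# time series. NOT the SONN algorithm: the candidate functions, the
# logistic-map data, and the greedy selection rule are assumptions chosen
# only to illustrate letting the algorithm pick structure and node functions.

import numpy as np

# Toy chaotic data: logistic map x_{t+1} = 4 x_t (1 - x_t).
x = np.empty(500)
x[0] = 0.3
for t in range(499):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
X, y = x[:-1], x[1:]                       # predict the next value from the current one

# Candidate neuron functions the builder may choose from.
candidates = {
    "linear":    lambda u: u,
    "quadratic": lambda u: u * u,
    "sigmoid":   lambda u: 1.0 / (1.0 + np.exp(-u)),
}

residual, model = y.copy(), []
for step in range(3):                      # grow up to three units
    best = None
    for name, f in candidates.items():
        phi = f(X)
        w = np.dot(phi, residual) / np.dot(phi, phi)   # least-squares weight
        err = np.mean((residual - w * phi) ** 2)
        if best is None or err < best[0]:
            best = (err, name, w, phi)
    err, name, w, phi = best
    model.append((name, w))
    residual = residual - w * phi
    print(f"added {name:9s} unit, weight {w:+.3f}, residual MSE {err:.5f}")
```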
Programmable Analog Pulse-Firing Neural Networks
Hamilton, Alister, Murray, Alan F., Tarassenko, Lionel
ABSTRACT We describe pulse-stream firing integrated circuits that implement asynchronous analog neural networks. Synaptic weights are stored dynamically, and weighting uses time-division of the neural pulses from a signalling neuron to a receiving neuron. MOS transistors in their "ON" state act as variable resistors to control a capacitive discharge, and time-division is thus achieved by a small synapse circuit cell. The VLSI chip set design uses 2.5 µm … INTRODUCTION Neural network implementations fall into two broad classes, digital [1,2] and analog (e.g. …). The strengths of a digital approach include the ability to use well-proven design techniques, high noise immunity, and the ability to implement programmable networks.
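The weighting scheme described above is analog and circuit-level, so any software rendering is only a caricature. The sketch below treats the "ON"-state MOS transistor as a weight-dependent resistance discharging a capacitor, and reports what fraction of each incoming pulse would be gated through; all component values and the resistance-versus-weight mapping are assumptions, not the published circuit.

```python
# Idealized numeric sketch of "time-division" synaptic weighting: a stored
# weight sets the ON resistance of a MOS transistor, which controls how fast
# a capacitor discharges; the discharge time gates what fraction of each
# incoming neural pulse reaches the receiving neuron. Component values and
# the R(weight) mapping are assumptions for illustration only.

import math

C = 1e-12            # synapse capacitance (F), assumed
V0, V_TH = 5.0, 2.5  # initial and threshold voltages (V), assumed
PULSE_WIDTH = 1e-6   # width of an incoming neural pulse (s), assumed

def on_resistance(weight):
    # Assumed mapping: larger weight -> larger effective resistance ->
    # slower discharge -> a longer slice of the pulse is passed on.
    return 2e6 * weight

def transmitted_fraction(weight):
    r = on_resistance(weight)
    # Time for V(t) = V0 * exp(-t / RC) to fall to the threshold V_TH.
    t_cross = r * C * math.log(V0 / V_TH)
    return min(t_cross, PULSE_WIDTH) / PULSE_WIDTH

for w in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"weight {w:4.2f} -> {transmitted_fraction(w):.2f} of each pulse passed on")
```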
Constraints on Adaptive Networks for Modeling Human Generalization
Gluck, Mark A., Pavel, M., Henkle, Van
ABSTRACT The potential of adaptive networks to learn categorization rules and to model human performance is studied by comparing how natural and artificial systems respond to new inputs, i.e., how they generalize. Like humans, networks can learn a deterministic categorization task by a variety of alternative individual solutions. An analysis of the constraints imposed by using networks with the minimal number of hidden units shows that this "minimal configuration" constraint is not sufficient. A further analysis of human and network generalizations indicates that initial conditions may provide important constraints on generalization. A new technique, which we call "reversed learning", is described for finding appropriate initial conditions. INTRODUCTION We are investigating the potential of adaptive networks to learn categorization tasks and to model human performance.
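The "reversed learning" technique itself is not detailed in the abstract, so it is not reproduced here. The sketch below only illustrates the observation that motivates it: identically trained networks that differ solely in their initial weights can all fit the same deterministic categorization task yet respond differently to held-out inputs. The task, architecture, and training settings are assumptions for illustration.

```python
# Illustration (not the paper's "reversed learning" technique) of how initial
# conditions can shape generalization: several networks are trained from
# different random seeds on the same deterministic rule, then queried on
# patterns they never saw. Task, architecture, and hyperparameters are assumed.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Deterministic rule: category = XOR of the first two bits; two patterns held out.
X_train = np.array([[0,0,0],[0,1,1],[1,0,1],[1,1,0],[1,1,1],[0,0,1]], float)
y_train = np.logical_xor(X_train[:, 0], X_train[:, 1]).astype(float)
X_test  = np.array([[1,0,0],[0,1,0]], float)      # never seen during training

def train_and_generalize(seed, hidden=3, lr=0.5, epochs=10000):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (3, hidden))
    W2 = rng.normal(0.0, 1.0, (hidden, 1))
    for _ in range(epochs):                        # plain batch back-propagation
        h = sigmoid(X_train @ W1)
        out = sigmoid(h @ W2)
        delta_out = (out - y_train[:, None]) * out * (1 - out)
        delta_h = (delta_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ delta_out
        W1 -= lr * X_train.T @ delta_h
    return sigmoid(sigmoid(X_test @ W1) @ W2).ravel()

# Outputs on the held-out patterns may differ from seed to seed even though
# every run fits the training set.
for seed in range(4):
    print(f"seed {seed}: held-out outputs = {train_and_generalize(seed).round(2)}")
```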
Heterogeneous Neural Networks for Adaptive Behavior in Dynamic Environments
Beer, Randall D., Chiel, Hillel J., Sterling, Leon S.
This heterogeneity is crucial to the flexible generation of behavior, which is essential for survival in a complex, dynamic environment. It may also provide powerful insights into the design of artificial neural networks. In this paper, we describe a heterogeneous neural network for controlling the walking of a simulated insect. This controller is inspired by the neuroethological and neurobiological literature on insect locomotion. It exhibits a variety of statically stable gaits at different speeds simply by varying the tonic activity of a single cell. It can also adapt to perturbations as a natural consequence of its design. INTRODUCTION Even very simple animals exhibit a dazzling variety of complex behaviors which they continuously adapt to the changing circumstances of their environment. Nervous systems evolved in order to generate appropriate behavior in dynamic, uncertain situations and thus ensure the survival of the organisms containing them.
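The abstract's key claim is that varying the tonic activity of a single cell is enough to move the simulated insect through a range of statically stable gaits. The toy model below is a deliberately crude caricature of that idea: six leg oscillators with fixed phase offsets whose stepping frequency and stance duration are tied to one command value. The numbers and coupling scheme are assumptions and do not reproduce the neuroethologically inspired controller in the paper.

```python
# Caricature (not the paper's controller) of gait selection by a single
# tonic command signal: each leg is a phase oscillator whose stepping
# frequency rises with the command level, with fixed offsets between legs.
# Offsets, frequencies, and the stance threshold are assumptions only.

import math

LEG_PHASE_OFFSET = [0.0, 0.5, 0.0, 0.5, 0.0, 0.5]   # alternating tripod pattern
STANCE_FRACTION_AT_LOW_SPEED = 0.8                   # slow gait: long stance

def gait_snapshot(command, t):
    """Return which of the six legs are in swing at time t for a given
    tonic command level in (0, 1]."""
    freq = 0.5 + 2.0 * command                       # stepping frequency (Hz)
    stance_frac = STANCE_FRACTION_AT_LOW_SPEED - 0.3 * command
    swing = []
    for leg, offset in enumerate(LEG_PHASE_OFFSET):
        phase = (freq * t + offset) % 1.0
        swing.append(phase > stance_frac)            # last part of the cycle = swing
    return swing

for command in (0.2, 1.0):
    pattern = ["swing" if s else "stance" for s in gait_snapshot(command, t=0.9)]
    print(f"command {command:.1f}: {pattern}")
```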
Neural Analog Diffusion-Enhancement Layer and Spatio-Temporal Grouping in Early Vision
Waxman, Allen M., Seibert, Michael, Cunningham, Robert K., Wu, Jian
A new class of neural network aimed at early visual processing is described; we call it a Neural Analog Diffusion-Enhancement Layer or "NADEL." The network consists of two levels which are coupled through feedforward and shunted feedback connections. The lower level is a two-dimensional diffusion map which accepts visual features as input and spreads activity over larger scales as a function of time. The upper layer is periodically fed the activity from the diffusion layer and locates local maxima in it (an extreme form of contrast enhancement) using a network of local comparators. These local maxima are fed back to the diffusion layer using an on-center/off-surround shunting anatomy. The maxima are also available as output of the network. The network dynamics serve to cluster features on multiple scales as a function of time, and can be used in a variety of early visual processing tasks such as: extraction of corners and high curvature points along edge contours, line end detection, gap filling in contours, generation of fixation points, perceptual grouping on multiple scales, correspondence and path impletion in long-range apparent motion, and building 2-D shape representations that are invariant to location, orientation, scale, and small deformation in the visual field.
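A rough numeric sketch of the two-level interaction described above: a diffusion layer spreads feature activity over time, an upper layer marks local maxima, and the maxima are fed back into the diffusion layer. The grid size, diffusion rate, maxima test, and feedback gain are all assumptions for illustration; the actual NADEL uses shunted on-center/off-surround feedback dynamics not modeled here.

```python
# Highly simplified sketch of a diffusion layer plus a local-maxima layer
# with feedback. Grid size, diffusion rate, neighborhood, and feedback gain
# are assumptions for illustration; these are not the NADEL equations.

import numpy as np

def diffuse(a, rate=0.2):
    """One explicit diffusion step on a 2-D activity map (zero-flux borders)."""
    padded = np.pad(a, 1, mode="edge")
    laplacian = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:] - 4 * a)
    return a + rate * laplacian

def local_maxima(a):
    """Mark cells that strictly dominate their 8-neighborhood (crude contrast enhancement)."""
    padded = np.pad(a, 1, mode="constant", constant_values=-np.inf)
    neighbors = np.stack([padded[i:i + a.shape[0], j:j + a.shape[1]]
                          for i in range(3) for j in range(3) if (i, j) != (1, 1)])
    return (a > neighbors.max(axis=0)) & (a > 0)

# Two isolated "features"; over time their activity spreads, and the maxima
# layer tracks cluster centers at progressively larger scales.
activity = np.zeros((15, 15))
activity[4, 4] = activity[10, 10] = 1.0
for step in range(30):
    activity = diffuse(activity)
    maxima = local_maxima(activity)
    activity += 0.05 * maxima            # feedback of detected maxima (assumed gain)
print("maxima after 30 steps at:", list(zip(*np.nonzero(maxima))))
```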