
Phasor Neural Networks

Neural Information Processing Systems

ABSTRACT A novel network type is introduced which uses unit-length 2-vectors for local variables. As an example of its applications, associative memory nets are defined and their performance analyzed. Real systems corresponding to such 'phasor' models can be e.g. INTRODUCTION Most neural network models use either binary local variables or scalars combined with sigmoidal nonlinearities. Rather awkward coding schemes have to be invoked if one wants to maintain linear relations between the local signals being processed in e.g.


Speech Recognition Experiments with Perceptrons

Neural Information Processing Systems

ABSTRACT Artificial neural networks (ANNs) are capable of accurate recognition of simple speech vocabularies such as isolated digits [1]. This paper looks at two more difficult vocabularies, the alphabetic E-set and a set of polysyllabic words. The E-set is difficult because it contains weak discriminants and polysyllables are difficult because of timing variation. Polysyllabic word recognition is aided by a time pre-alignment technique based on dynamic programming, and E-set recognition is improved by focusing attention. Recognition accuracies are better than 98% for both vocabularies when implemented with a single-layer perceptron.
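The "time pre-alignment technique based on dynamic programming" is in the family of dynamic time warping. A minimal sketch of the underlying recurrence (not the authors' exact pre-alignment procedure, which aligns inputs before the perceptron sees them):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-programming alignment cost between two sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

template = [0, 1, 2, 3, 2, 1, 0]
warped   = [0, 0, 1, 2, 2, 3, 2, 1, 1, 0]  # same shape, stretched in time
other    = [3, 0, 3, 0, 3, 0, 3]
# The warped copy aligns at zero cost; the unrelated sequence does not.
print(dtw_distance(template, warped), dtw_distance(template, other))
```

Because the warp path can repeat elements of either sequence, timing variation in polysyllabic words stops dominating the distance.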



Phase Transitions in Neural Networks

Neural Information Processing Systems

For related finite array models, classical phase transitions (which describe steady-state behavior) may not.


How Neural Nets Work

Neural Information Processing Systems

How Neural Nets Work Alan Lapedes Robert Farber Theoretical Division Los Alamos National Laboratory Los Alamos, NM 87545 Abstract: There is presently great interest in the abilities of neural networks to mimic "qualitative reasoning" by manipulating neural encodings of symbols. Less work has been performed on using neural networks to process floating point numbers, and it is sometimes stated that neural networks are somehow inherently inaccurate and therefore best suited for "fuzzy" qualitative reasoning. Nevertheless, the potential speed of massively parallel operations makes neural net "number crunching" an interesting topic to explore. In this paper we discuss some of our work in which we demonstrate that for certain applications neural networks can achieve significantly higher numerical accuracy than more conventional techniques. In particular, prediction of future values of a chaotic time series can be performed with exceptionally high accuracy. We analyze how a neural net is able to do this, and in the process show that a large class of functions from R^n to R^m may be accurately approximated by a backpropagation neural net with just two "hidden" layers. The network uses this functional approximation to perform either interpolation (signal processing applications) or extrapolation (symbol processing applications).


Invariant Object Recognition Using a Distributed Associative Memory

Neural Information Processing Systems

This paper describes an approach to 2-dimensional object recognition. Complex-log conformal mapping is combined with a distributed associative memory to create a system which recognizes objects regardless of changes in rotation or scale. Recalled information from the memorized database is used to classify an object, reconstruct the memorized version of the object, and estimate the magnitude of changes in scale or rotation. The system response is resistant to moderate amounts of noise and occlusion. Several experiments, using real, gray scale images, are presented to show the feasibility of our approach. Introduction The challenge of the visual recognition problem stems from the fact that the projection of an object onto an image can be confounded by several dimensions of variability such as uncertain perspective, changing orientation and scale, sensor noise, occlusion, and nonuniform illumination.
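The complex-log (log-polar) conformal mapping converts rotation and scaling about the image center into translations along the angle and log-radius axes. A minimal nearest-neighbour sketch of that property alone (hypothetical helper name; the paper's distributed associative memory stage is omitted):

```python
import numpy as np

def complex_log_map(img, n_r=32, n_theta=32):
    """Nearest-neighbour resampling onto a log-polar grid about the image centre."""
    h, w = img.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    r = np.exp(np.linspace(0, np.log(min(cy, cx)), n_r))[:, None]
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    ys = np.clip((cy + r * np.sin(theta)).round().astype(int), 0, h - 1)
    xs = np.clip((cx + r * np.cos(theta)).round().astype(int), 0, w - 1)
    return img[ys, xs]

img = np.random.default_rng(0).random((65, 65))  # odd size: exact centre pixel
m0 = complex_log_map(img)
m90 = complex_log_map(np.rot90(img))

# A 90-degree rotation shows up as a cyclic shift of 8 of the 32 theta bins.
match = float(np.mean(m90 == np.roll(m0, -8, axis=1)))
```

Scaling the image by a factor s would analogously shift the map by log(s) along the radius axis, which is what lets a shift-tolerant memory recognize rotated or rescaled objects.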


Time-Sequential Self-Organization of Hierarchical Neural Networks

Neural Information Processing Systems

TIME-SEQUENTIAL SELF-ORGANIZATION OF HIERARCHICAL NEURAL NETWORKS Ronald H. Silverman Cornell University Medical College, New York, NY 10021 Andrew S. Noetzel Polytechnic University, Brooklyn, NY 11201 ABSTRACT Self-organization of multi-layered networks can be realized by time-sequential organization of successive neural layers. Lateral inhibition operating in the surround of firing cells in each layer provides for unsupervised capture of excitation patterns presented by the previous layer. By presenting patterns of increasing complexity, in coordination with network self-organization, higher levels of the hierarchy capture concepts implicit in the pattern set. INTRODUCTION A fundamental difficulty in self-organization of hierarchical, multi-layered networks of simple neuron-like cells is the determination of the direction of adjustment of synaptic link weights between neural layers not directly connected to input or output patterns. Several different approaches have been used to address this problem.
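The layer-wise mechanism described — lateral inhibition letting the best-matched cell capture a presented pattern — is essentially competitive (winner-take-all) learning. A one-layer sketch under simple assumptions (cells seeded near early samples, a standard trick to avoid dead units; not the authors' exact rule):

```python
import numpy as np

rng = np.random.default_rng(0)

# Three "excitation pattern" prototypes on the unit sphere in 8 dimensions.
protos = rng.normal(size=(3, 8))
protos /= np.linalg.norm(protos, axis=1, keepdims=True)

def sample(k=None):
    k = rng.integers(3) if k is None else k
    x = protos[k] + 0.05 * rng.normal(size=8)
    return x / np.linalg.norm(x)

# Seed one competing cell near each prototype (avoids dead units in this toy).
W = np.array([sample(k) for k in range(3)])

lr = 0.1
for _ in range(500):
    x = sample()
    winner = np.argmax(W @ x)           # lateral inhibition: only the best match fires
    W[winner] += lr * (x - W[winner])   # the firing cell moves toward the pattern
    W[winner] /= np.linalg.norm(W[winner])

# Each prototype ends up captured by some cell (cosine similarity near 1).
capture = (W @ protos.T).max(axis=0)
```

Stacking such layers and training them one at a time, on patterns of increasing complexity, is the time-sequential scheme of the abstract.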


Encoding Geometric Invariances in Higher-Order Neural Networks

Neural Information Processing Systems

ENCODING GEOMETRIC INVARIANCES IN HIGHER-ORDER NEURAL NETWORKS C.L. Giles Air Force Office of Scientific Research, Bolling AFB, DC 20332 R.D. Griffin Naval Research Laboratory, Washington, DC 20375-5000 T. Maxwell Sachs-Freeman Associates, Landover, MD 20785 ABSTRACT We describe a method of constructing higher-order neural networks that respond invariantly under geometric transformations on the input space. By requiring each unit to satisfy a set of constraints on the interconnection weights, a particular structure is imposed on the network. A network built using such an architecture maintains its invariant performance independent of the values the weights assume, of the learning rules used, and of the form of the nonlinearities in the network. The invariance exhibited by a first-order network is usually of a trivial sort, e.g., responding only to the average input in the case of translation invariance, whereas higher-order networks can perform useful functions and still exhibit the invariance. We derive the weight constraints for translation, rotation, scale, and several combinations of these transformations, and report results of simulation studies.
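For cyclic translations, the constraint idea reduces, for a second-order unit, to weights that depend only on the index difference: w[j, k] = v[(j - k) mod N]. The output is then invariant to circular shifts of the input regardless of the values in v or the choice of nonlinearity, which a few lines verify (a sketch of the constraint alone, not the paper's full derivation):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16

# Constrained second-order weights: w[j, k] = v[(j - k) mod N].
v = rng.normal(size=N)
j, k = np.indices((N, N))
W = v[(j - k) % N]

def unit(x, weights):
    return np.tanh(x @ weights @ x)  # example nonlinearity; any f would do

x = rng.normal(size=N)
# Constant across all cyclic shifts, whatever random values v holds.
outs = np.array([unit(np.roll(x, s), W) for s in range(N)])

# An unconstrained random weight matrix has no such invariance:
Wu = rng.normal(size=(N, N))
outs_u = np.array([np.roll(x, s) @ Wu @ np.roll(x, s) for s in range(N)])
```

The invariance follows from reindexing the double sum: shifting x permutes the terms but (j - k) mod N, and hence each weight, is unchanged.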


Basins of Attraction for Electronic Neural Networks

Neural Information Processing Systems

In a useful associative memory, an initial state should lead reliably to the "closest" memory. This requirement suggests that a well-behaved basin of attraction should evenly surround its attractor and have a smooth and regular shape. One-dimensional basin maps plotting "pull-in" probability against Hamming distance from an attractor do not reveal the shape of the basin in the high-dimensional space of initial states [9].
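A one-dimensional basin map of the kind mentioned can be produced for a small Hopfield-style net: flip d bits of a stored memory and measure how often the dynamics pull the state back. A sketch under standard assumptions (Hebbian couplings, synchronous sign updates; the paper's networks are electronic, not simulated):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 3

patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns).astype(float)  # Hebbian couplings
np.fill_diagonal(W, 0)

def settle(s, steps=10):
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)    # synchronous sign update
    return s

def pull_in_prob(d, trials=50):
    """Fraction of states at Hamming distance d that return to the memory."""
    hits = 0
    for _ in range(trials):
        s = patterns[0].copy()
        s[rng.choice(N, size=d, replace=False)] *= -1
        hits += np.array_equal(settle(s), patterns[0])
    return hits / trials

# Pull-in probability versus Hamming distance from the attractor.
probs = [pull_in_prob(d) for d in (2, 16, 31)]
```

As the passage notes, such a curve averages over all directions at each distance, so it cannot show whether the basin is lopsided in the full N-dimensional state space.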


Neuromorphic Networks Based on Sparse Optical Orthogonal Codes

Neural Information Processing Systems

Synthetic neural nets [1,2] represent an active and growing research field. Fundamental issues, as well as practical implementations with electronic and optical devices, are being studied. In addition, several learning algorithms have been studied, for example stochastically adaptive systems [3] based on many-body physics optimization concepts [4,5]. Signal processing in the optical domain has also been an active field of research. A wide variety of nonlinear all-optical devices are being studied, directed towards applications both in optical computing and in optical switching.
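A sparse optical orthogonal code is a set of 0/1 sequences whose off-peak cyclic auto-correlations and cyclic cross-correlations are bounded, here by 1. A toy check on a (13, 3, 1) pair, with codewords chosen by hand so that their cyclic difference sets do not collide (not taken from the paper):

```python
import numpy as np

def max_offpeak_auto(c):
    return max(int(c @ np.roll(c, s)) for s in range(1, len(c)))

def max_cross(a, b):
    return max(int(a @ np.roll(b, s)) for s in range(len(a)))

n = 13
c1 = np.zeros(n, int); c1[[0, 1, 4]] = 1
c2 = np.zeros(n, int); c2[[0, 2, 7]] = 1

# Off-peak auto- and cross-correlations never exceed 1, while each codeword
# correlates with itself at its full weight (3) only when exactly in phase.
print(max_offpeak_auto(c1), max_offpeak_auto(c2), max_cross(c1, c2))  # prints: 1 1 1
```

That sharp in-phase peak against a flat background of at most 1 is what makes such codes attractive for addressing neurons optically.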