Analysis and Comparison of Different Learning Algorithms for Pattern Association Problems
J. Bernasconi Brown Boveri Research Center, CH-5405 Baden, Switzerland ABSTRACT We investigate the behavior of different learning algorithms for networks of neuron-like units. As test cases we use simple pattern association problems, such as the XOR-problem and symmetry detection problems. The algorithms considered are either versions of the Boltzmann machine learning rule or based on the backpropagation of errors. We also propose and analyze a generalized delta rule for linear threshold units. We find that the performance of a given learning algorithm depends strongly on the type of units used.
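The XOR pattern association task mentioned above can be illustrated with a minimal two-layer network of sigmoid units trained by the standard delta rule with backpropagated errors. This is an illustrative sketch only; the paper's proposed generalized delta rule for linear threshold units is a different algorithm, and the network size and learning rate here are arbitrary choices.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR: the classic pattern association task a single-layer net cannot solve
patterns = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
            ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

n_hid = 3
w_hid = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(n_hid)]  # [w1, w2, bias]
w_out = [random.uniform(-1, 1) for _ in range(n_hid + 1)]                  # [..., bias]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hid]
    y = sigmoid(sum(w_out[i] * h[i] for i in range(n_hid)) + w_out[-1])
    return h, y

def total_error():
    return sum((t - forward(x)[1]) ** 2 for x, t in patterns)

eta = 0.5
err_before = total_error()
for _ in range(5000):
    for x, t in patterns:
        h, y = forward(x)
        d_out = (t - y) * y * (1.0 - y)                       # delta at the output unit
        for i in range(n_hid):
            d_h = d_out * w_out[i] * h[i] * (1.0 - h[i])      # backpropagated hidden delta
            w_hid[i][0] += eta * d_h * x[0]
            w_hid[i][1] += eta * d_h * x[1]
            w_hid[i][2] += eta * d_h
            w_out[i] += eta * d_out * h[i]
        w_out[-1] += eta * d_out
err_after = total_error()
```

The per-pattern squared error decreases over training; replacing the sigmoid with a hard threshold, as the paper considers, removes the derivative factors and requires a different update rule.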
Generalization of Backpropagation to Recurrent and Higher Order Neural Networks
Fernando J. Pineda Applied Physics Laboratory, Johns Hopkins University Johns Hopkins Rd., Laurel MD 20707 Abstract A general method for deriving backpropagation algorithms for networks with recurrent and higher order connections is introduced. The propagation of activation in these networks is determined by dissipative differential equations. The error signal is backpropagated by integrating an associated differential equation. The method is introduced by applying it to the recurrent generalization of the feedforward backpropagation network. The method is extended to the case of higher order networks and to a constrained dynamical system for training a content addressable memory. The essential feature of the adaptive algorithms is that the adaptive equations have a simple outer product form.
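The recurrent case can be sketched numerically (an illustrative reconstruction, not the paper's code; the network size, targets, and learning rate are arbitrary): relax the activations to the fixed point of the dissipative dynamics, relax the associated linear equation for the error signal, then apply the outer-product weight update.

```python
import numpy as np

rng = np.random.default_rng(1)

def g(u):
    return np.tanh(u)

def g_prime(u):
    return 1.0 - np.tanh(u) ** 2

n = 5
W = rng.normal(scale=0.1, size=(n, n))   # recurrent weights (possibly asymmetric)
I = rng.normal(scale=0.2, size=n)        # external input
out = [3, 4]                             # units with specified targets
target = np.array([0.5, -0.5])

def settle(step, iters=200):
    """Iterate a relaxation map to an approximate fixed point."""
    v = np.zeros(n)
    for _ in range(iters):
        v = step(v)
    return v

def forward():
    # fixed point of the dissipative dynamics dx/dt = -x + g(Wx) + I
    return settle(lambda x: g(W @ x) + I)

def output_error():
    x = forward()
    return float(np.sum((target - x[out]) ** 2))

eta = 0.1
err_before = output_error()
for _ in range(500):
    x = forward()
    u = W @ x
    e = np.zeros(n)
    e[out] = target - x[out]
    # fixed point of the associated error-propagation equation:
    # y_i = sum_j w_ji g'(u_j) y_j + e_i
    y = settle(lambda y: W.T @ (g_prime(u) * y) + e)
    W += eta * np.outer(g_prime(u) * y, x)   # simple outer product form
err_after = output_error()
```

Note that the backward relaxation is linear in y, so it is the adjoint of the linearized forward dynamics, and the resulting update has the outer-product structure the abstract emphasizes.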
PARTITIONING OF SENSORY DATA BY A CORTICAL NETWORK
Granger, Richard, Ambros-Ingerson, Jose, Henry, Howard, Lynch, Gary
SUMMARY To process sensory data, sensory brain areas must preserve information about both the similarities and differences among learned cues: without the latter, acuity would be lost, whereas without the former, degraded versions of a cue would be erroneously thought to be distinct cues, and would not be recognized. We have constructed a model of piriform cortex incorporating a large number of biophysical, anatomical and physiological parameters, such as two-step excitatory firing thresholds, necessary and sufficient conditions for long-term potentiation (LTP) of synapses, three distinct types of inhibitory currents (short IPSPs, long hyperpolarizing currents (LHP) and long cell-specific afterhyperpolarization (AHP)), sparse connectivity between bulb and layer-II cortex, caudally-flowing excitatory collateral fibers, nonlinear dendritic summation, etc. We have tested the model for its ability to learn similarity- and difference-preserving encodings of incoming sensory cues; the biological characteristics of the model enable it to produce multiple encodings of each input cue in such a way that different readouts of the cell firing activity of the model preserve both similarity and difference information. In particular, probabilistic quantal transmitter-release properties of piriform synapses give rise to probabilistic postsynaptic voltage levels which, in combination with the activity of local patches of inhibitory interneurons in layer II, differentially select bursting vs. single-pulsing layer-II cells. Time-locked firing to the theta rhythm (Larson and Lynch, 1986) enables distinct spatial patterns to be read out against a relatively quiescent background firing rate. Training trials using the physiological rules for induction of LTP yield stable layer-II-cell spatial firing patterns for learned cues.
Multiple simulated olfactory input patterns (i.e., those that share many chemical features) will give rise to strongly-overlapping bulb firing patterns, activating many shared lateral olfactory tract (LOT) axons innervating layer Ia of piriform cortex, which in turn yields highly overlapping layer-II-cell excitatory potentials, enabling this spatial layer-II-cell encoding to preserve the overlap (similarity) among similar inputs. At the same time, those synapses that are enhanced by the learning process cause stronger cell firing, yielding strong, cell-specific afterhyperpolarizing (AHP) currents. Local inhibitory interneurons effectively select alternate cells to fire once strongly-firing cells have undergone AHP. These alternate cells then activate their caudally-flowing recurrent collaterals, activating distinct populations of synapses in caudal layer Ib.
Learning on a General Network
The network model considered consists of interconnected groups of neurons, where each group could be fully interconnected (it could have feedback connections, with possibly asymmetric weights), but no loops between the groups are allowed. A stochastic descent algorithm is applied, under a certain inequality constraint on each intragroup weight matrix which ensures that the network possesses a unique equilibrium state for every input. Introduction It has been shown in the last few years that large networks of interconnected "neuron"-like elements
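One standard inequality of the kind alluded to above (not necessarily the paper's exact constraint) is a contraction condition on the intragroup weight matrix, which guarantees a unique equilibrium via the Banach fixed-point theorem. A minimal sketch:

```python
import numpy as np

def has_unique_equilibrium(W, max_slope=1.0):
    """Sufficient (not necessarily the paper's exact) condition for a
    unique equilibrium: if the largest singular value of the intragroup
    weight matrix W, times the maximum slope of the activation function,
    is below 1, then x -> g(W x + input) is a contraction, so by the
    Banach fixed-point theorem the group settles to a unique equilibrium
    state for every input."""
    return bool(float(np.linalg.svd(W, compute_uv=False)[0]) * max_slope < 1.0)

# Weak recurrent coupling satisfies the bound; strong coupling does not.
weak = has_unique_equilibrium(0.5 * np.eye(3))    # -> True
strong = has_unique_equilibrium(2.0 * np.eye(3))  # -> False
```

The condition is only sufficient: a matrix violating it may still yield a unique equilibrium, which is why sharper constraints are of interest.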
A Dynamical Approach to Temporal Pattern Processing
Stornetta, W. Scott, Hogg, Tad, Huberman, Bernardo A.
W. Scott Stornetta Stanford University, Physics Department, Stanford, Ca., 94305 Tad Hogg and B. A. Huberman Xerox Palo Alto Research Center, Palo Alto, Ca. 94304 ABSTRACT Recognizing patterns with temporal context is important for such tasks as speech recognition, motion detection and signature verification. We propose an architecture in which time serves as its own representation, and temporal context is encoded in the state of the nodes. We contrast this with the approach of replicating portions of the architecture to represent time. As one example of these ideas, we demonstrate an architecture with capacitive inputs serving as temporal feature detectors in an otherwise standard back propagation model. Experiments involving motion detection and word discrimination serve to illustrate novel features of the system.
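The capacitive-input idea can be sketched as a leaky integrator: each input node holds an exponentially decaying trace of its past inputs, so the node's state, rather than a replicated architecture, encodes temporal context. The decay constant and the two input sequences below are illustrative choices, not the paper's parameters.

```python
def capacitive_trace(seq, decay):
    """State of a leaky-integrator input node: each new input is added to
    an exponentially decaying trace of all past inputs."""
    s, trace = 0.0, []
    for x in seq:
        s = decay * s + x
        trace.append(s)
    return trace

decay = 0.6
early = [1, 0, 0, 0, 1]   # two pulses, long gap
late  = [1, 0, 0, 1, 0]   # same two pulses, different timing

t1 = capacitive_trace(early, decay)   # final state 1.1296
t2 = capacitive_trace(late, decay)    # final state 0.7296
```

Although both sequences contain the same pulses, the final node states differ, so a downstream static classifier can distinguish them: time serves as its own representation.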
Presynaptic Neural Information Processing
ABSTRACT The potential for presynaptic information processing within the arbor of a single axon will be discussed in this paper. Current knowledge about the activity dependence of the firing threshold, the conditions required for conduction failure, and the similarity of nodes along a single axon will be reviewed. An electronic circuit model for a site of low conduction safety in an axon will be presented. In response to single frequency stimulation the electronic circuit acts as a lowpass filter. I. INTRODUCTION The axon is often modeled as a wire which imposes a fixed delay on a propagating signal.
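A toy caricature of the low-pass behavior at a site of low conduction safety (not the electronic circuit model itself, and with an arbitrary recovery time): a spike fails to conduct if it arrives before the site has recovered from the previous successful spike, so high-frequency trains are thinned while low-frequency trains pass unchanged.

```python
def propagate(spike_times, recovery=2.0):
    """Toy model of a low-conduction-safety site: a spike conducts only if
    at least `recovery` time units have passed since the last spike that
    successfully conducted. This thins high-frequency trains (conduction
    failure) while passing low-frequency trains intact."""
    passed, last = [], None
    for t in spike_times:
        if last is None or t - last >= recovery:
            passed.append(t)
            last = t
    return passed

slow = [0, 3, 6, 9]           # below the cutoff: every spike conducts
fast = [0, 1, 2, 3, 4, 5, 6]  # above the cutoff: every other spike fails

a = propagate(slow)   # -> [0, 3, 6, 9]
b = propagate(fast)   # -> [0, 2, 4, 6]
```

In response to single-frequency stimulation this behaves as a low-pass filter on spike rate, which is the qualitative behavior the circuit model exhibits.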
Neural Net and Traditional Classifiers
Huang, William Y., Lippmann, Richard P.
Previous work on nets with continuous-valued inputs led to generative procedures to construct convex decision regions with two-layer perceptrons (one hidden layer) and arbitrary decision regions with three-layer perceptrons (two hidden layers). Here we demonstrate that two-layer perceptron classifiers trained with back propagation can form both convex and disjoint decision regions. Such classifiers are robust, train rapidly, and provide good performance with simple decision regions. When complex decision regions are required, however, convergence time can be excessively long and performance is often no better than that of k-nearest neighbor classifiers. Three neural net classifiers are presented that provide more rapid training under such situations. Two use fixed weights in the first one or two layers and are similar to classifiers that estimate probability density functions using histograms. A third "feature map classifier" uses both unsupervised and supervised training. It provides good performance with little supervised training in situations such as speech recognition where much unlabeled training data is available. The architecture of this classifier can be used to implement a neural net k-nearest neighbor classifier.
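The k-nearest-neighbor baseline mentioned above reduces to a few lines; this sketch (with made-up training points) also shows the disjoint decision regions at issue, since k-NN handles them without any convergence problem, while the paper's neural-net classifier realizes the same decision rule with a different architecture.

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """Majority vote among the k training points nearest to `query`
    (squared Euclidean distance)."""
    nearest = sorted(
        train,
        key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)),
    )[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Class 1 occupies two disjoint regions; class 0 sits between them.
train = [((0, 0), 0), ((0, 1), 0),
         ((5, 5), 1), ((5, 6), 1),
         ((-5, -5), 1), ((-5, -6), 1)]

pred_a = knn_classify(train, (5, 5.5))   # -> 1 (inside one class-1 region)
pred_b = knn_classify(train, (0, 0.5))   # -> 0
```

No training phase is needed, which is why k-NN is a natural reference point when backpropagation convergence times grow long on complex decision regions.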