Correlational Strength and Computational Algebra of Synaptic Connections Between Neurons
Eberhard E. Fetz, Department of Physiology & Biophysics, University of Washington, Seattle, WA 98195. ABSTRACT: Intracellular recordings in spinal cord motoneurons and cerebral cortex neurons have provided new evidence on the correlational strength of monosynaptic connections, and on the relation between the shapes of postsynaptic potentials and the associated increase in firing probability. In these cells, excitatory postsynaptic potentials (EPSPs) produce cross-correlogram peaks which largely resemble the derivative of the EPSP. Additional synaptic noise broadens the peak, but the peak area -- i.e., the number of above-chance firings triggered per EPSP -- remains proportional to the EPSP amplitude. The consequences of these data for information processing by polysynaptic connections are discussed. The effects of sequential polysynaptic links can be calculated by convolving the effects of the underlying monosynaptic connections.
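The closing claim lends itself to a direct numerical check. Below is a minimal sketch (Python; all kernel shapes, time constants, and scalings are illustrative assumptions, not values from the paper) of the convolution rule: the correlogram effect of a disynaptic A -> B -> C path is approximated by convolving the two monosynaptic correlogram kernels, and the kernel areas (above-chance firings per trigger spike) multiply.

import numpy as np

dt = 0.1                        # bin width (ms)
t = np.arange(0, 10, dt)        # 10 ms correlogram window

def correlogram_kernel(tau_rise, tau_fall):
    # Above-chance firing rate added per presynaptic spike, modeled as the
    # positive lobe of the derivative of a difference-of-exponentials EPSP.
    epsp = np.exp(-t / tau_fall) - np.exp(-t / tau_rise)
    k = np.clip(np.gradient(epsp, dt), 0, None)
    return k / 10.0             # arbitrary scaling so the kernel area is < 1

k_ab = correlogram_kernel(0.5, 3.0)     # monosynaptic A -> B
k_bc = correlogram_kernel(0.8, 5.0)     # monosynaptic B -> C

k_ac = np.convolve(k_ab, k_bc) * dt     # disynaptic A -> C effect

# Kernel areas multiply: expected above-chance C spikes per A spike.
print(np.sum(k_ab) * dt * np.sum(k_bc) * dt, np.sum(k_ac) * dt)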
Learning by State Recurrence Detection
Rosen, Bruce E., Goodwin, James M., Vidal, Jacques J.
Recurrence learning is a new nonlinear reward-penalty algorithm. It exploits information found during learning trials to reinforce decisions that result in the recurrence of nonfailing states. Recurrence learning applies positive reinforcement during exploration of the search space, whereas the BOXES and ASE algorithms apply only negative weight reinforcement, and then only on failure. The approach is applied both to Michie and Chambers' BOXES algorithm and to Barto, Sutton, and Anderson's extension, the ASE/ACE system, and has significantly improved the convergence rate of these stochastically based learning automata. Simulation results show that the added information from recurrence learning increases the learning rate.
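A hedged sketch of the recurrence-learning idea follows (Python; the environment interface, constants, and update form are assumptions for illustration, not the authors' code). The point is the asymmetry named above: decisions along a state loop that recurs without failure are positively reinforced during exploration, while the conventional BOXES/ASE-style penalty is applied only on failure.

import math, random

N_STATES, BETA, PENALTY = 162, 0.1, 0.5   # 162 boxes, as in pole balancing
weights = [0.0] * N_STATES                # one action preference per box

def choose_action(state):
    # Stochastic binary action, biased by the box weight.
    p = 1.0 / (1.0 + math.exp(-weights[state]))
    return 1 if random.random() < p else 0

def run_trial(env):
    last_seen = {}      # state -> index of its most recent visit in trace
    trace = []          # (state, action) decisions made this trial
    state, failed = env.reset(), False
    while not failed:
        action = choose_action(state)
        trace.append((state, action))
        if state in last_seen:
            # Recurrence: the state was revisited without an intervening
            # failure, so reward the decisions made around the loop.
            for s, a in trace[last_seen[state]:]:
                weights[s] += BETA * (1 if a == 1 else -1)
        last_seen[state] = len(trace) - 1
        state, failed = env.step(action)
    # Conventional negative reinforcement, applied only on failure.
    for s, a in trace:
        weights[s] -= PENALTY * (1 if a == 1 else -1)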
Encoding Geometric Invariances in Higher-Order Neural Networks
Giles, C. Lee, Griffin, R. D., Maxwell, T.
By requiring each unit to satisfy a set of constraints on the interconnection weights, a particular structure is imposed on the network. A network built with such an architecture maintains its invariant performance independent of the values the weights assume, of the learning rules used, and of the form of the nonlinearities in the network. The invariance exhibited by a first-order network is usually of a trivial sort, e.g., responding only to the average input in the case of translation invariance, whereas higher-order networks can perform useful functions and still exhibit the invariance. We derive the weight constraints for translation, rotation, scale, and several combinations of these transformations, and report results of simulation studies.

INTRODUCTION: A persistent difficulty for pattern recognition systems is the requirement that patterns or objects be recognized independent of irrelevant parameters or distortions such as orientation (position, rotation, aspect), scale or size, background or context, Doppler shift, time of occurrence, or signal duration.
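For translation, the constraint on a second-order unit takes a simple form: the weight linking input pair (j, k) may depend only on their relative displacement. The sketch below (Python; a one-dimensional unit with circular shifts, with sizes and names chosen for illustration) verifies that any such unit gives the same output for an input and its translate, whatever values the free weights take.

import numpy as np

n = 8
rng = np.random.default_rng(0)
w_rel = rng.normal(size=n)    # one free weight per displacement (j - k) mod n

def second_order_unit(x):
    # y = sum_{j,k} w((j - k) mod n) * x[j] * x[k]
    total = 0.0
    for j in range(n):
        for k in range(n):
            total += w_rel[(j - k) % n] * x[j] * x[k]
    return total

x = rng.normal(size=n)
# Same output (up to float rounding) for the input and its circular shift.
print(second_order_unit(x), second_order_unit(np.roll(x, 3)))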
Invariant Object Recognition Using a Distributed Associative Memory
Wechsler, Harry, Zimmerman, George Lee
Harry Wechsler and George Lee Zimmerman, Department of Electrical Engineering, University of Minnesota, Minneapolis, MN 55455. Abstract: This paper describes an approach to 2-dimensional object recognition. Complex-log conformal mapping is combined with a distributed associative memory to create a system which recognizes objects regardless of changes in rotation or scale. Recalled information from the memorized database is used to classify an object, reconstruct the memorized version of the object, and estimate the magnitude of changes in scale or rotation. The system response is resistant to moderate amounts of noise and occlusion. Several experiments, using real, gray-scale images, are presented to show the feasibility of our approach.

Introduction: The challenge of the visual recognition problem stems from the fact that the projection of an object onto an image can be confounded by several dimensions of variability, such as uncertain perspective, changing orientation and scale, sensor noise, occlusion, and nonuniform illumination.
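The rotation/scale tolerance rests on the complex-log (log-polar) mapping: rotation about the image center becomes a translation along the angular axis, and a change of scale becomes a translation along the log-radius axis, which the distributed memory can then absorb. A minimal sketch of such a mapping (Python; the grid sizes and nearest-neighbor sampling are implementation assumptions, not the paper's):

import numpy as np

def complex_log_map(img, n_r=64, n_theta=64):
    # Resample img onto a (log r, theta) grid centered on the image.
    # Scaling the object shifts rows; rotating it shifts columns.
    h, w = img.shape
    cy, cx = h / 2.0, w / 2.0
    log_r = np.linspace(0.0, np.log(min(cy, cx)), n_r)
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(np.exp(log_r), theta, indexing="ij")
    ys = np.clip((cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip((cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return img[ys, xs]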
The Capacity of the Kanerva Associative Memory is Exponential
Chou, Philip A.
ABSTRACT: The capacity of an associative memory is defined as the maximum number of words that can be stored and retrieved reliably by an address within a given sphere of attraction. It is shown by sphere-packing arguments that the capacity grows exponentially as the address length increases. This exponential growth in capacity can actually be achieved by the Kanerva associative memory if its parameters are set optimally; formulas for these optimal values are provided. The exponential growth in capacity for the Kanerva associative memory contrasts sharply with the sub-linear growth in capacity for the Hopfield associative memory.
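As an illustration of the sphere-packing argument (the standard bound, not necessarily the paper's exact statement): if the 2^n addresses of length n are partitioned into disjoint spheres of attraction of Hamming radius δn, the number of stored words M obeys

\[
M \le \frac{2^n}{V(n,\delta n)}, \qquad
V(n,\delta n) = \sum_{i=0}^{\lfloor \delta n \rfloor} \binom{n}{i} \approx 2^{\,n h(\delta)},
\]

where h(δ) = -δ log₂ δ - (1 - δ) log₂(1 - δ) is the binary entropy function, so M can grow as fast as 2^{n(1 - h(δ))}, i.e., exponentially in the address length.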
New Hardware for Massive Neural Networks
Coon, Darryl D., Perera, A. G. Unil
ABSTRACT: Transient phenomena associated with forward-biased silicon p⁺-n-n⁺ structures at 4.2 K show remarkable similarities with biological neurons. The devices play a role similar to the two-terminal switching elements in Hodgkin-Huxley equivalent circuit diagrams, and they provide simpler and more realistic neuron emulation than transistors or op-amps. They have such low power and current requirements that they could be used in massive neural networks. Some observed properties of simple circuits containing the devices include action potentials, refractory periods, threshold behavior, excitation, inhibition, summation over synaptic inputs, synaptic weights, temporal integration, memory, network connectivity modification based on experience, pacemaker activity, firing thresholds, coupling to sensors with graded signal outputs, and the dependence of firing rate on input current.
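For readers who want a feel for the listed behaviors in software, here is a deliberately crude leaky integrate-and-fire sketch (Python; this models none of the device physics, and every constant is arbitrary) exhibiting threshold behavior, a refractory period, temporal integration, and a firing rate that rises with input current:

def firing_rate(i_in, v_th=1.0, tau=10.0, t_ref=5.0, dt=0.1, t_end=1000.0):
    v, refractory, spikes = 0.0, 0.0, 0
    for _ in range(int(t_end / dt)):
        if refractory > 0:
            refractory -= dt          # no integration during refractory period
            continue
        v += dt * (-v / tau + i_in)   # leaky temporal integration
        if v >= v_th:                 # threshold behavior
            spikes += 1
            v, refractory = 0.0, t_ref
    return spikes / t_end             # spikes per unit time

for i_in in (0.05, 0.2, 0.5):         # subthreshold, moderate, strong drive
    print(i_in, firing_rate(i_in))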
A Computer Simulation of Olfactory Cortex with Functional Implications for Storage and Retrieval of Olfactory Information
Bower, James M., Wilson, Matthew A.
Using a simple Hebb-type learning rule in conjunction with the cortical dynamics which emerge from the anatomical and physiological organization of the model, the simulations are capable of establishing cortical representations for different input patterns. The basis of these representations lies in the interaction of sparsely distributed, highly divergent/convergent interconnections between modeled neurons. We have shown that different representations can be stored with minimal interference.
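The abstract's mechanism invites a minimal sketch (Python; the model's actual equations and connectivity statistics are not given here, so the generic "co-active units strengthen their connection" form of a Hebb rule and a 10% connection density are assumptions):

import numpy as np

rng = np.random.default_rng(1)
n, eta = 100, 0.01
W = np.zeros((n, n))                  # modeled synaptic weights
mask = rng.random((n, n)) < 0.1       # sparse divergent/convergent wiring

def hebb_step(pre, post):
    # Hebb-type rule: strengthen W[i, j] where pre unit j and post unit i co-fire.
    global W
    W += eta * np.outer(post, pre) * mask

# Store two sparse activity patterns as cortical representations.
p1 = (rng.random(n) < 0.1).astype(float)
p2 = (rng.random(n) < 0.1).astype(float)
hebb_step(p1, p1)
hebb_step(p2, p2)

# Self-recall exceeds cross-talk: the representations interfere minimally.
print((W @ p1) @ p1, (W @ p1) @ p2)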