Analysis and Comparison of Different Learning Algorithms for Pattern Association Problems

Neural Information Processing Systems

J. Bernasconi, Brown Boveri Research Center, CH-5405 Baden, Switzerland

ABSTRACT We investigate the behavior of different learning algorithms for networks of neuron-like units. As test cases we use simple pattern association problems, such as the XOR problem and symmetry detection problems. The algorithms considered are either versions of the Boltzmann machine learning rule or based on the backpropagation of errors. We also propose and analyze a generalized delta rule for linear threshold units. We find that the performance of a given learning algorithm depends strongly on the type of units used.
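The abstract does not restate the generalized rule itself; as a point of reference, here is a minimal sketch of the classical delta rule for a single layer of linear threshold units that such a rule extends, applied to the XOR association (all constants and names are illustrative, not the paper's):

```python
import numpy as np

def threshold(x):
    """Linear threshold unit: outputs 1 if the input exceeds 0, else 0."""
    return (x > 0).astype(float)

def delta_rule(patterns, targets, lr=0.1, epochs=100, seed=0):
    """Plain delta rule for one layer of linear threshold units.

    Per-pattern update: dw = lr * (t - y) * x, db = lr * (t - y).
    """
    rng = np.random.default_rng(seed)
    n_in, n_out = patterns.shape[1], targets.shape[1]
    w = rng.normal(0.0, 0.1, size=(n_out, n_in))
    b = np.zeros(n_out)
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            y = threshold(w @ x + b)
            w += lr * np.outer(t - y, x)
            b += lr * (t - y)
    return w, b

# XOR as a pattern-association problem. A single layer of threshold
# units cannot represent it, which is why hidden units are studied.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
w, b = delta_rule(X, T)
print(threshold(X @ w.T + b).ravel())  # will not match [0, 1, 1, 0]
```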


On the Power of Neural Networks for Solving Hard Problems

Neural Information Processing Systems

The neural network model is a discrete-time system that can be represented by a weighted, undirected graph, with a weight attached to each edge and a threshold value attached to each node (neuron).
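This model admits a compact simulation. The sketch below assumes a Hopfield-style reading (states in {−1, +1}, synchronous updates); the paper's exact state alphabet and update discipline may differ:

```python
import numpy as np

def step(weights, thresholds, state):
    """One synchronous update: each neuron fires (+1) iff its weighted
    input from its graph neighbours exceeds its threshold."""
    return np.where(weights @ state > thresholds, 1, -1)

# Toy 4-node network; the symmetric matrix encodes the undirected graph.
W = np.array([[ 0.,  1., -1.,  0.],
              [ 1.,  0.,  0., -1.],
              [-1.,  0.,  0.,  1.],
              [ 0., -1.,  1.,  0.]])
theta = np.zeros(4)
s = np.array([1, -1, 1, -1])
# Synchronous dynamics may settle into a short cycle rather than a
# fixed point, as this example does.
for _ in range(5):
    s = step(W, theta, s)
print(s)
```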


Presynaptic Neural Information Processing

Neural Information Processing Systems

ABSTRACT The potential for presynaptic information processing within the arbor of a single axon will be discussed in this paper. Current knowledge about the activity dependence of the firing threshold, the conditions required for conduction failure, and the similarity of nodes along a single axon will be reviewed. An electronic circuit model for a site of low conduction safety in an axon will be presented. In response to single-frequency stimulation, the electronic circuit acts as a low-pass filter. I. INTRODUCTION The axon is often modeled as a wire which imposes a fixed delay on a propagating signal.
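As a rough illustration of the low-pass behavior attributed to the circuit, the sketch below evaluates the amplitude response of a generic first-order low-pass filter; the filter order, cutoff, and test frequencies are assumptions, not the paper's measured values:

```python
import numpy as np

def first_order_lowpass_gain(f, f_c):
    """Amplitude response |H(f)| of a first-order low-pass filter with
    cutoff f_c. Shown only as the qualitative behaviour the abstract
    attributes to the low-safety conduction site, not its circuit."""
    return 1.0 / np.sqrt(1.0 + (f / f_c) ** 2)

for f in (10.0, 100.0, 1000.0):   # stimulation frequencies in Hz (illustrative)
    print(f, first_order_lowpass_gain(f, f_c=100.0))
```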


New Hardware for Massive Neural Networks

Neural Information Processing Systems

ABSTRACT Transient phenomena associated with forward-biased silicon p-n-n structures at 4.2 K show remarkable similarities with biological neurons. The devices play a role similar to the two-terminal switching elements in Hodgkin-Huxley equivalent circuit diagrams. The devices provide simpler and more realistic neuron emulation than transistors or op-amps. They have such low power and current requirements that they could be used in massive neural networks. Some observed properties of simple circuits containing the devices include action potentials, refractory periods, threshold behavior, excitation, inhibition, summation over synaptic inputs, synaptic weights, temporal integration, memory, network connectivity modification based on experience, pacemaker activity, firing thresholds, coupling to sensors with graded signal outputs, and the dependence of firing rate on input current. Transfer functions for simple artificial neurons with spike-train inputs and spike-train outputs have been measured and correlated with input coupling.


Network Generality, Training Required, and Precision Required

Neural Information Processing Systems

We show how to estimate (1) the number of functions that a particular network architecture can implement, (2) how much analog precision is needed in the network's connections, and (3) the number of training examples the network must see before it can be expected to form reliable generalizations.
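A hedged sketch of the counting argument that typically underlies an estimate of type (3), assuming noise-free, binary-labelled examples (the paper's own derivation may be more refined):

```latex
% Assumption: noise-free, binary-labelled training examples.
% If the architecture can realize |F| distinct functions, singling one
% out requires about \log_2 |F| bits of information, and each labelled
% example supplies at most one bit, so the training-set size m satisfies
m \;\gtrsim\; \log_2 |F| .
```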


The Sigmoid Nonlinearity in Prepyriform Cortex

Neural Information Processing Systems

Frank H. Eeckman, University of California, Berkeley, CA 94720

ABSTRACT We report a study on the relationship between EEG amplitude values and unit spike output in the prepyriform cortex of awake and motivated rats. This relationship takes the form of a sigmoid curve that describes normalized pulse output for normalized wave input. The curve is fitted using nonlinear regression and is described by its slope and maximum value. Measurements were made for both excitatory and inhibitory neurons in the cortex. These neurons are known to form a monosynaptic negative feedback loop. Both classes of cells can be described by the same parameters.
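The abstract does not give the fitted equation; the sketch below fits a generic logistic curve by nonlinear regression, with `q_max` and `slope` standing in for the two descriptors reported. The functional form and the data are illustrative assumptions, not the paper's:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(v, q_max, slope, v0):
    """Generic logistic curve: normalized pulse output as a function of
    normalized wave (EEG) input. The paper's exact form may differ."""
    return q_max / (1.0 + np.exp(-slope * (v - v0)))

# Hypothetical normalized wave/pulse measurements, for illustration only.
v = np.linspace(-3.0, 3.0, 50)
rng = np.random.default_rng(1)
q = sigmoid(v, 2.0, 1.5, 0.0) + rng.normal(0.0, 0.05, v.size)

params, _ = curve_fit(sigmoid, v, q, p0=[1.0, 1.0, 0.0])
print("q_max, slope, v0 =", params)
```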


Temporal Patterns of Activity in Neural Networks

Neural Information Processing Systems

Patterns of activity over real neural structures are known to exhibit time-dependent behavior. The brain may be capable of utilizing the temporal behavior of activity in neural networks to perform functions that cannot otherwise be easily implemented. These might include the origination of sequential behavior and the recognition of time-dependent stimuli. A model is presented here which uses neuronal populations with recurrent feedback connections in an attempt to observe and describe the resulting time-dependent behavior. Shortcomings and problems inherent in this model are discussed. Current models by other researchers are reviewed, and their similarities and differences are discussed.
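As an illustration of how recurrent feedback between neuronal populations yields time-dependent activity, here is a Wilson-Cowan-style sketch; the equations, gains, and rate function are assumptions for illustration, not the model analyzed in the paper:

```python
import numpy as np

def simulate(steps=200, dt=0.1):
    """Two coupled populations with recurrent feedback: E excites I,
    I inhibits E, which readily produces oscillatory (time-dependent)
    activity. All coupling constants are illustrative."""
    g = lambda x: 1.0 / (1.0 + np.exp(-x))   # population activation
    E, I = 0.1, 0.0
    trace = []
    for _ in range(steps):
        dE = (-E + g(1.5 * E - 1.0 * I + 0.5)) * dt
        dI = (-I + g(1.2 * E - 0.2 * I)) * dt
        E, I = E + dE, I + dI
        trace.append((E, I))
    return np.array(trace)

print(simulate()[-5:])   # late-time activity of both populations
```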


Minkowski-r Back-Propagation: Learning in Connectionist Models with Non-Euclidian Error Signals

Neural Information Processing Systems

It can be shown that neural-like networks containing a single hidden layer of nonlinear activation units can learn to do a piecewise-linear partitioning of a feature space [2]. One result of such a partitioning is a complex gradient surface on which decisions about new input stimuli will be made. The generalization, categorization, and clustering properties of the network are therefore determined by the mapping of input stimuli onto this gradient surface in the output space. This gradient surface is a function of the conditional probability distributions of the output vectors given the input feature vectors, as well as of the error relating the teacher signal and the output.
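The Minkowski-r error named in the title generalizes the usual squared error to E = (1/r) Σ |y − t|^r, with r = 2 recovering the Euclidean case. A minimal sketch of the loss and of the output-gradient term that replaces (y − t) in standard back-propagation (the array values are illustrative):

```python
import numpy as np

def minkowski_r_loss(y, t, r):
    """Minkowski-r error: (1/r) * sum |y - t|^r.
    r = 2 gives the usual squared (Euclidean) error."""
    return np.sum(np.abs(y - t) ** r) / r

def minkowski_r_grad(y, t, r):
    """Gradient of the loss w.r.t. the outputs y:
    dE/dy = |y - t|^(r-1) * sign(y - t).
    r < 2 de-emphasizes large deviations (outliers); r > 2 stresses them."""
    d = y - t
    return np.abs(d) ** (r - 1) * np.sign(d)

y = np.array([0.9, 0.2, 0.4])
t = np.array([1.0, 0.0, 0.0])
for r in (1.5, 2.0, 3.0):
    print(r, minkowski_r_loss(y, t, r), minkowski_r_grad(y, t, r))
```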


Distributed Neural Information Processing in the Vestibulo-Ocular System

Neural Information Processing Systems

Clifford Lau, Office of Naval Research Detachment, Pasadena, CA 91106; Vicente Honrubia*, UCLA Division of Head and Neck Surgery, Los Angeles, CA 90024

ABSTRACT A new distributed neural information-processing model is proposed to explain the response characteristics of the vestibulo-ocular system and to reflect more accurately the latest anatomical and neurophysiological data on the vestibular afferent fibers and vestibular nuclei. In this model, head motion is sensed topographically by hair cells in the semicircular canals. Hair cell signals are then processed by multiple synapses in the primary afferent neurons, which exhibit a continuum of varying dynamics. The model is an application of the concept of "multilayered" neural networks to the description of findings in the bullfrog vestibular nerve, and it allows us to formulate mathematically the behavior of an assembly of neurons whose physiological characteristics vary according to their anatomical properties. INTRODUCTION Traditionally, the physiological properties of individual vestibular afferent neurons have been modeled as a linear time-invariant system based on Steinhausen's description of cupular motion.
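One conventional way to write down such an assembly, shown only as an illustration: each afferent is treated as a first-order linear system and the assembly response as a weighted sum over the continuum of dynamics. The symbols w_k, g_k, and τ_k are assumed notation, not the paper's:

```latex
% Illustrative formulation (w_k, g_k, \tau_k are assumptions):
% afferent k as a first-order linear system, assembly response as a
% weighted sum over the continuum of afferent dynamics.
H_k(s) \;=\; \frac{g_k \, s \tau_k}{1 + s \tau_k},
\qquad
R(s) \;=\; \sum_k w_k \, H_k(s).
```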