Distributed Neural Information Processing in the Vestibulo-Ocular System

Neural Information Processing Systems

Clifford Lau, Office of Naval Research Detachment, Pasadena, CA 91106; Vicente Honrubia*, UCLA Division of Head and Neck Surgery, Los Angeles, CA 90024

ABSTRACT A new distributed neural information-processing model is proposed to explain the response characteristics of the vestibulo-ocular system and to reflect more accurately the latest anatomical and neurophysiological data on the vestibular afferent fibers and vestibular nuclei. In this model, head motion is sensed topographically by hair cells in the semicircular canals. Hair cell signals are then processed by multiple synapses in the primary afferent neurons, which exhibit a continuum of varying dynamics. The model is an application of the concept of "multilayered" neural networks to the description of findings in the bullfrog vestibular nerve, and allows us to formulate mathematically the behavior of an assembly of neurons whose physiological characteristics vary according to their anatomical properties.

INTRODUCTION Traditionally, the physiological properties of individual vestibular afferent neurons have been modeled as a linear time-invariant system based on Steinhausen's description of cupular motion.
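The linear time-invariant description mentioned above is commonly written as a torsion-pendulum transfer function. The sketch below illustrates that idea under stated assumptions: the transfer function form and the time constants (`tau1`, `tau2`) are illustrative textbook-style values, not the paper's model or fitted parameters.

```python
import math

def cupula_gain(freq_hz, tau1=6.0, tau2=0.005):
    """Magnitude of the torsion-pendulum transfer function
    H(s) = tau1*tau2*s / ((tau1*s + 1)*(tau2*s + 1)),
    normalized by its mid-band plateau (= tau2).
    Time constants are illustrative, not fitted values."""
    w = 2.0 * math.pi * freq_hz
    mag = (tau1 * tau2 * w) / (
        math.sqrt((tau1 * w) ** 2 + 1.0) * math.sqrt((tau2 * w) ** 2 + 1.0))
    return mag / tau2

# Gain is near-flat at mid frequencies and rolls off at both extremes.
print(cupula_gain(0.01), cupula_gain(1.0), cupula_gain(100.0))
```

A single such linear filter cannot capture a continuum of afferent dynamics, which is the motivation for the distributed model.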


Presynaptic Neural Information Processing

Neural Information Processing Systems

ABSTRACT The potential for presynaptic information processing within the arbor of a single axon will be discussed in this paper. Current knowledge about the activity dependence of the firing threshold, the conditions required for conduction failure, and the similarity of nodes along a single axon will be reviewed. An electronic circuit model for a site of low conduction safety in an axon will be presented. In response to single-frequency stimulation the electronic circuit acts as a lowpass filter.

I. INTRODUCTION The axon is often modeled as a wire which imposes a fixed delay on a propagating signal.
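The lowpass behavior described above can be sketched with a first-order filter's magnitude response. This is a minimal illustration, not the paper's electronic circuit model; the cutoff frequency is a made-up parameter.

```python
import math

def lowpass_gain(f_hz, f_cutoff_hz):
    """Magnitude response of a first-order lowpass filter to a
    single-frequency input. Illustrative only: the paper's circuit
    model is more detailed, and the cutoff here is assumed."""
    return 1.0 / math.sqrt(1.0 + (f_hz / f_cutoff_hz) ** 2)

# Low-frequency stimulation is passed; high-frequency stimulation is
# attenuated, as at a site of low conduction safety.
print(lowpass_gain(10.0, 100.0))    # close to 1.0
print(lowpass_gain(1000.0, 100.0))  # strongly attenuated
```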


Experimental Demonstrations of Optical Neural Computers

Neural Information Processing Systems

The high interconnectivity required by neural computers can be simply implemented in optics because channels for optical signals may be superimposed in three dimensions with little or no cross coupling. Since these channels may be formed holographically, optical neural systems can be designed to create and maintain interconnections very simply. Thus the optical system designer can to a large extent avoid the analytical and topological problems of determining individual interconnections for a given neural system and constructing physical paths for these interconnections. An archetypical design for a single layer of an optical neural computer is shown in Figure 1. Nonlinear thresholding elements, neurons, are arranged on two-dimensional planes which are interconnected via the third dimension by holographic elements. The key concerns in implementing this design involve the need for suitable nonlinearities for the neural planes and high-capacity, easily modifiable holographic elements. While it is possible to implement the neural function using entirely optical nonlinearities, for example using etalon arrays, optoelectronic two-dimensional spatial light modulators (2D SLMs) suitable for this purpose are more readily available.


Bit-Serial Neural Networks

Neural Information Processing Systems

This arises from the parallelism and distributed knowledge representation which gives rise to gentle degradation as faults appear. These properties make neural networks attractive for implementation in VLSI and WSI. For example, the natural fault tolerance could be useful in silicon wafers with imperfect yield, where the network degradation is approximately proportional to the non-functioning silicon area. To cast neural networks in engineering language, a neuron is a state machine that is either "on" or "off", but in general assumes intermediate states as it switches smoothly between these extrema. The synapses weight the signals from a transmitting neuron such that it is more or less excitatory or inhibitory to the receiving neuron. The set of synaptic weights determines the stable states and represents the learned information in a system. The neural state, V_i, is related to the total neural activity stimulated by inputs to the neuron through an activation function, F. Neural activity is the level of excitation of the neuron, and the activation function describes how the neural state responds to a change in activity.
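The state/activity/activation relationship described above can be sketched as follows. This is a generic software illustration with an assumed sigmoid for F, not the paper's bit-serial VLSI implementation.

```python
import math

def activation(x):
    # Sigmoid activation function F: a smooth switch between the
    # "off" (0) and "on" (1) extrema, passing through intermediate states.
    return 1.0 / (1.0 + math.exp(-x))

def neural_state(weights, inputs):
    # Neural activity: weighted sum of incoming signals. Positive
    # weights are excitatory, negative weights inhibitory.
    activity = sum(w * v for w, v in zip(weights, inputs))
    return activation(activity)

# A neuron receiving one excitatory and one weaker inhibitory signal
# settles into an intermediate state between the extrema.
print(neural_state([0.5, -0.3], [1.0, 1.0]))
```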



High Order Neural Networks for Efficient Associative Memory Design

Neural Information Processing Systems

We propose learning rules for recurrent neural networks with high-order interactions between some or all neurons. The designed networks exhibit the desired associative memory function: perfect storage and retrieval of pieces of information and/or sequences of information of any complexity.
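The flavor of high-order interactions can be sketched with a third-order generalization of the familiar Hopfield outer-product rule. This is a minimal sketch in that spirit, not the learning rules proposed in the paper; the storage and recall functions here are assumptions for illustration.

```python
import itertools

def store(patterns, n):
    # Generalized Hebb/outer-product rule with third-order weights
    # T[i, j, k], summed over the +/-1 patterns to be stored.
    T = {}
    for p in patterns:
        for i, j, k in itertools.product(range(n), repeat=3):
            T[(i, j, k)] = T.get((i, j, k), 0) + p[i] * p[j] * p[k]
    return T

def recall(T, state, n, steps=3):
    # Each neuron repeatedly aligns with the sign of its high-order
    # local field, sum_jk T[i,j,k] * s_j * s_k.
    s = list(state)
    for _ in range(steps):
        for i in range(n):
            h = sum(T[(i, j, k)] * s[j] * s[k]
                    for j in range(n) for k in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

pattern = [1, -1, 1, -1, 1]
T = store([pattern], 5)
noisy = [1, -1, 1, -1, -1]   # one bit flipped
print(recall(T, noisy, 5))   # recovers the stored pattern
```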



A Computer Simulation of Olfactory Cortex with Functional Implications for Storage and Retrieval of Olfactory Information

Neural Information Processing Systems

Matthew A. Wilson and James M. Bower, Computation and Neural Systems Program, Division of Biology, California Institute of Technology, Pasadena, CA 91125

ABSTRACT Based on anatomical and physiological data, we have developed a computer simulation of piriform (olfactory) cortex which is capable of reproducing spatial and temporal patterns of actual cortical activity under a variety of conditions. Using a simple Hebb-type learning rule in conjunction with the cortical dynamics which emerge from the anatomical and physiological organization of the model, the simulations are capable of establishing cortical representations for different input patterns. The basis of these representations lies in the interaction of sparsely distributed, highly divergent/convergent interconnections between modeled neurons. We have shown that different representations can be stored with minimal interference. Further, we have demonstrated that the degree of overlap of cortical representations for different stimuli can also be modulated. Both features are presumably important in classifying olfactory stimuli.
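A Hebb-type rule of the kind mentioned can be sketched in a few lines. This is an illustrative toy only; the simulation described couples such a rule to detailed cortical anatomy and dynamics not modeled here, and the learning rate is an assumed parameter.

```python
def hebb_update(weights, pre, post, rate=0.1):
    # Simple Hebb-type rule: a weight grows when presynaptic and
    # postsynaptic activity coincide. weights[i][j] connects
    # presynaptic unit j to postsynaptic unit i.
    return [[w + rate * pre[j] * post[i] for j, w in enumerate(row)]
            for i, row in enumerate(weights)]

w = [[0.0, 0.0],
     [0.0, 0.0]]
w = hebb_update(w, pre=[1.0, 0.0], post=[0.0, 1.0])
print(w)  # only the co-active pre/post pair is strengthened
```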


A Mean Field Theory of Layer IV of Visual Cortex and Its Application to Artificial Neural Networks

Neural Information Processing Systems

ABSTRACT A single-cell theory for the development of selectivity and ocular dominance in visual cortex has been presented previously by Bienenstock, Cooper and Munro [1]. This has been extended to a network applicable to layer IV of visual cortex [2]. In this paper we present a mean field approximation that captures in a fairly transparent manner the qualitative, and many of the quantitative, results of the network theory. Finally, we consider the application of this theory to artificial neural networks and show that a significant reduction in architectural complexity is possible.

A SINGLE LAYER NETWORK AND THE MEAN FIELD APPROXIMATION We consider a single layer network of ideal neurons which receive signals from outside of the layer and from cells within the layer (Figure 1). The activity of the ith cell in the network is c_i = m_i · d + Σ_j L_ij c_j, where d is a vector of afferent signals to the network. Each cell receives input from n fibers outside of the cortical network through the matrix of synapses m_i. Intra-layer input to each cell is then transmitted through the matrix of cortico-cortical synapses L. Light circles are the LGN-cortical synapses.
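The activity relation described, afferent drive through the synapses m plus intra-layer input through the cortico-cortical matrix L, can be sketched as a fixed-point iteration. The dimensions and numerical values below are toy assumptions, not the paper's parameters.

```python
def network_activity(m, L, d, iters=100):
    # Iterate c_i = m_i . d + sum_j L_ij c_j to a fixed point.
    # Converges when the cortico-cortical feedback L is a contraction.
    n = len(m)
    c = [0.0] * n
    for _ in range(iters):
        c = [sum(mik * dk for mik, dk in zip(m[i], d))
             + sum(L[i][j] * c[j] for j in range(n))
             for i in range(n)]
    return c

m = [[1.0, 0.0], [0.0, 1.0]]    # afferent (LGN-cortical) synapses
L = [[0.0, -0.2], [-0.2, 0.0]]  # weak mutual cortico-cortical inhibition
print(network_activity(m, L, [1.0, 0.5]))
```

The mean field idea in the paper replaces the detailed intra-layer sum with its average, which is what permits the reduction in architectural complexity.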


Scaling Properties of Coarse-Coded Symbol Memories

Neural Information Processing Systems

DCPS's memory scheme is a modified version of the Random Receptors method [5]. The symbol space is the set of all triples over a 25-letter alphabet. Units have fixed-size receptive fields organized as 6 x 6 x 6 subspaces. Patterns are manipulated to minimize the variance in pattern size across symbols.
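The receptive-field scheme described can be sketched as follows. Field construction here is purely random for illustration; as noted above, DCPS additionally manipulates the patterns to equalize pattern sizes across symbols, which this sketch omits.

```python
import random

# Symbols are triples over a 25-letter alphabet; each unit's receptive
# field is a 6 x 6 x 6 subspace (one 6-letter subset per position).
ALPHABET = [chr(ord('A') + i) for i in range(25)]

def make_unit(rng):
    return tuple(frozenset(rng.sample(ALPHABET, 6)) for _ in range(3))

def responds(unit, triple):
    # A unit fires iff each letter of the triple lies in the matching
    # 6-letter subspace of its receptive field.
    return all(letter in field for letter, field in zip(triple, unit))

rng = random.Random(0)
units = [make_unit(rng) for _ in range(1000)]
pattern = {i for i, u in enumerate(units) if responds(u, ('C', 'A', 'T'))}
print(len(pattern))  # on average about 1000 * (6/25)**3, i.e. ~14 units
```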