A Computer Simulation of Olfactory Cortex with Functional Implications for Storage and Retrieval of Olfactory Information
Bower, James M., Wilson, Matthew A.
Computation and Neural Systems Program, Division of Biology, California Institute of Technology, Pasadena, CA 91125. ABSTRACT Based on anatomical and physiological data, we have developed a computer simulation of piriform (olfactory) cortex which is capable of reproducing spatial and temporal patterns of actual cortical activity under a variety of conditions. Using a simple Hebb-type learning rule in conjunction with the cortical dynamics which emerge from the anatomical and physiological organization of the model, the simulations are capable of establishing cortical representations for different input patterns. The basis of these representations lies in the interaction of sparsely distributed, highly divergent/convergent interconnections between modeled neurons. We have shown that different representations can be stored with minimal interference. Further, we have demonstrated that the degree of overlap of cortical representations for different stimuli can also be modulated. Both features are presumably important in classifying olfactory stimuli.
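The learning scheme the abstract describes can be sketched in a few lines of Python. Everything below is a hypothetical stand-in, not a value from the model: the layer sizes, the connection density, the learning rate, and the top-k winner readout are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_cortex, k = 20, 50, 5    # hypothetical sizes, not taken from the model
p_connect = 0.2                  # sparse, highly divergent/convergent connectivity

# Fixed sparse afferent connectivity mask: which synapses exist at all.
mask = rng.random((n_cortex, n_in)) < p_connect
W = np.zeros((n_cortex, n_in))

def present(pattern, learn=True, rate=0.5):
    """One stimulus presentation: drive the cortex, let the k most active
    cells fire, then apply a Hebb-type update on existing synapses only."""
    drive = (W + 0.1 * mask) @ pattern       # weak baseline drive through the mask
    out = np.zeros(n_cortex)
    out[np.argsort(drive)[-k:]] = 1.0        # the k winners form the representation
    if learn:
        W[:] += rate * mask * np.outer(out, pattern)
    return out

# Two odor-like inputs acquire their own cortical representations.
a = (rng.random(n_in) < 0.3).astype(float)
b = (rng.random(n_in) < 0.3).astype(float)
for _ in range(5):
    present(a)
    present(b)
rep_a = present(a, learn=False)
rep_b = present(b, learn=False)
```

Because the Hebb rule only strengthens synapses that exist in the sparse mask, each input pattern recruits its own subset of cells, which is the mechanism the abstract points to for storing representations with minimal interference.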
A Mean Field Theory of Layer IV of Visual Cortex and Its Application to Artificial Neural Networks
ABSTRACT A single cell theory for the development of selectivity and ocular dominance in visual cortex has been presented previously by Bienenstock, Cooper and Munro [1]. This has been extended to a network applicable to layer IV of visual cortex [2]. In this paper we present a mean field approximation that captures in a fairly transparent manner the qualitative, and many of the quantitative, results of the network theory. Finally, we consider the application of this theory to artificial neural networks and show that a significant reduction in architectural complexity is possible. A SINGLE LAYER NETWORK AND THE MEAN FIELD APPROXIMATION We consider a single layer network of ideal neurons which receive signals from outside of the layer and from cells within the layer (Figure 1). The activity of the ith cell in the network is c_i = m_i · d, where d is a vector of afferent signals to the network. Each cell receives input from n fibers outside of the cortical network through the matrix of synapses m_i. Intra-layer input to each cell is then transmitted through the matrix of cortico-cortical synapses L. (In Figure 1, light circles are the LGN-cortical synapses.)
Scaling Properties of Coarse-Coded Symbol Memories
Rosenfeld, Ronald, Touretzky, David S.
DCPS' memory scheme is a modified version of the Random Receptors method [5]. The symbol space is the set of all triples over a 25 letter alphabet. Units have fixed-size receptive fields organized as 6 x 6 x 6 subspaces. Patterns are manipulated to minimize the variance in pattern size across symbols.
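The receptive-field scheme can be sketched as follows. The pool size, the independent random choice of fields, and the example symbol are illustrative assumptions; DCPS constrains its receptors more carefully than this.

```python
import random

random.seed(0)
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXY"   # a 25-letter alphabet
N_UNITS = 200                            # hypothetical pool size

# Each unit's receptive field is a 6 x 6 x 6 subspace of the triple space:
# six allowed letters per position, chosen independently here for simplicity.
units = [tuple(frozenset(random.sample(ALPHABET, 6)) for _ in range(3))
         for _ in range(N_UNITS)]

def pattern(symbol):
    """The coarse code for a triple: all units whose field contains it."""
    return {i for i, (f1, f2, f3) in enumerate(units)
            if symbol[0] in f1 and symbol[1] in f2 and symbol[2] in f3}

p = pattern(("C", "A", "B"))
# Expected pattern size is N_UNITS * (6/25)**3, i.e. about 2.8 units here.
```

With purely random fields the pattern size varies from symbol to symbol; the manipulation mentioned above exists precisely to reduce that variance.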
Generalization of Back-Propagation to Recurrent and Higher Order Neural Networks
Fernando J. Pineda, Applied Physics Laboratory, Johns Hopkins University, Johns Hopkins Rd., Laurel MD 20707. Abstract A general method for deriving backpropagation algorithms for recurrent and higher order networks is introduced. The propagation of activation in these networks is determined by dissipative differential equations. The error signal is backpropagated by integrating an associated differential equation. The method is introduced by applying it to the recurrent generalization of the feedforward backpropagation network. It is then extended to the case of higher order networks and to a constrained dynamical system for training a content addressable memory. The essential feature of the adaptive algorithms is that the adaptive equation has a simple outer product form.
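The procedure can be sketched for a small first-order recurrent network: relax the dissipative dynamics to a fixed point, integrate the associated linear system to get the backpropagated error, and apply the outer-product update. The network size, output units, targets, step sizes, and iteration counts below are arbitrary choices for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5                               # hypothetical network size
W = 0.1 * rng.normal(size=(n, n))   # recurrent weights
I = rng.normal(size=n)              # fixed external input
out = [0, 1]                        # arbitrary output units
target = np.array([0.2, -0.3])      # arbitrary targets for g(x) on those units

g = np.tanh
def gp(x):
    return 1.0 - np.tanh(x) ** 2

def relax(f, x0, steps=400, dt=0.1):
    """Euler-integrate the dissipative system dx/dt = f(x) to a fixed point."""
    x = x0
    for _ in range(steps):
        x = x + dt * f(x)
    return x

def settle():
    """Forward pass: fixed point of dx/dt = -x + W g(x) + I."""
    return relax(lambda x: -x + W @ g(x) + I, np.zeros(n))

def loss(x):
    return 0.5 * np.sum((target - g(x)[out]) ** 2)

e0 = loss(settle())
for _ in range(300):
    x_star = settle()
    e = np.zeros(n)
    e[out] = (target - g(x_star)[out]) * gp(x_star)[out]
    # Error is backpropagated by integrating the associated linear system
    # dy/dt = -y + g'(x*) (W^T y) + e to its own fixed point.
    y = relax(lambda v: -v + gp(x_star) * (W.T @ v) + e, np.zeros(n))
    W += 0.1 * np.outer(y, g(x_star))   # the simple outer-product update
e1 = loss(settle())
```

Note that the adjoint relaxation has the same dissipative form as the forward dynamics, which is the point of the construction: no explicit matrix inversion or unrolling through time is required.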
Capacity for Patterns and Sequences in Kanerva's SDM as Compared to Other Associative Memory Models
ABSTRACT The information capacity of Kanerva's Sparse Distributed Memory (SDM) and Hopfield-type neural networks is investigated. Under the approximations used here, it is shown that the total information stored in these systems is proportional to the number of connections in the network. The proportionality constant is the same for the SDM and Hopfield-type models, independent of the particular model or the order of the model. The approximations are checked numerically. The same analysis can be used to show that the SDM can store sequences of spatiotemporal patterns, and that the addition of time-delayed connections allows the retrieval of context-dependent temporal patterns. A minor modification of the SDM can be used to store correlated patterns. INTRODUCTION Many different models of memory and thought have been proposed by scientists over the years. The learning rule considered here uses the outer product of patterns of 1s and -1s.
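The outer-product learning rule on patterns of 1s and -1s can be sketched with a small Hopfield-type network; the network size, the number of stored patterns, and the amount of probe noise below are illustrative choices well inside the capacity the abstract analyzes.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 64                                       # neurons (hypothetical size)
patterns = rng.choice([-1, 1], size=(3, n))  # a few +-1 patterns, far below capacity

# Outer-product learning rule: T = sum_p x_p x_p^T, with zero diagonal.
T = sum(np.outer(p, p) for p in patterns)
np.fill_diagonal(T, 0)

def recall(probe, steps=5):
    """Synchronous retrieval: repeatedly threshold the weighted sums."""
    s = probe.copy()
    for _ in range(steps):
        s = np.where(T @ s >= 0, 1, -1)
    return s

noisy = patterns[0].copy()
noisy[:5] *= -1                    # corrupt 5 of the 64 bits
recovered = recall(noisy)
```

With only a few stored patterns the signal term (of order n) dominates the crosstalk (of order sqrt(n)), so stored patterns are stable and moderately corrupted probes fall back to them.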
Simulations Suggest Information Processing Roles for the Diverse Currents in Hippocampal Neurons
ABSTRACT A computer model of the hippocampal pyramidal cell (HPC) is described which integrates data from a variety of sources in order to develop a consistent description for this cell type. The model presently includes descriptions of eleven nonlinear somatic currents of the HPC, and the electrotonic structure of the neuron is modelled with a soma/short-cable approximation. Model simulations qualitatively or quantitatively reproduce a wide range of somatic electrical behavior in HPCs, and demonstrate possible roles for the various currents in information processing. There are several substrates for neuronal computation, including connectivity, synapses, morphometrics of dendritic trees, linear parameters of cell membrane, as well as nonlinear, time-varying membrane conductances, also referred to as currents or channels. In the classical description of neuronal function, the contribution of membrane channels is constrained to that of generating the action potential, setting the firing threshold, and establishing the relationship between (steady-state) stimulus intensity and firing frequency.
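The classical stimulus-intensity/firing-frequency relationship mentioned above can be sketched with a deliberately minimal soma: a single leaky compartment with a hard spike threshold standing in for the model's eleven nonlinear currents. All parameter values and units below are illustrative, not values from the HPC model.

```python
def firing_rate(i_stim, t_max=500.0, dt=0.05):
    """Steady-state firing rate (Hz) of a one-compartment soma with a leak
    current and a hard spike threshold -- a crude stand-in for the model's
    nonlinear currents. Parameters are illustrative (ms, mV, arbitrary
    conductance/current units)."""
    c_m, g_leak, e_leak = 1.0, 0.1, -70.0
    v_th, v_reset = -55.0, -70.0
    v, spikes = e_leak, 0
    for _ in range(int(t_max / dt)):
        v += dt * (g_leak * (e_leak - v) + i_stim) / c_m   # Euler step
        if v >= v_th:
            v, spikes = v_reset, spikes + 1
    return 1000.0 * spikes / t_max

# The (steady-state) f-I curve: rate as a function of stimulus intensity.
rates = [firing_rate(i) for i in (0.0, 2.0, 4.0, 8.0)]
```

Below rheobase (here i_stim < 1.5, where the steady-state voltage stays under threshold) the cell is silent; above it, firing frequency grows monotonically with stimulus intensity, which is exactly the classical relationship the additional currents in the full model then modulate.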
Programmable Synaptic Chip for Electronic Neural Networks
Moopenn, Alexander, Langenbacher, H., Thakoor, A. P., Khanna, S. K.
Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA 91109. ABSTRACT A binary synaptic matrix chip has been developed for electronic neural networks. The matrix chip contains a programmable 32x32 array of "long channel" NMOSFET binary connection elements implemented in a 3-um bulk CMOS process. Since the neurons are kept off-chip, the synaptic chip serves as a "cascadable" building block for a multi-chip synaptic network as large as 512x512 in size. As an alternative to the programmable NMOSFET (long channel) connection elements, tailored thin film resistors are deposited, in series with FET switches, on some CMOS test chips to obtain the weak synaptic connections. Although deposition and patterning of the resistors require additional processing steps, they promise substantial savings in silicon area.
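The cascading scheme can be sketched in software: binary 32x32 blocks tile into a larger synaptic matrix, and the off-chip neurons reduce to a threshold on the summed synaptic currents. The block contents, network state, and neuron threshold below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
BLOCK = 32                         # one chip: a programmable 32x32 binary array

def program_chip():
    """A binary connection block (1 = NMOSFET connection switched on)."""
    return rng.integers(0, 2, size=(BLOCK, BLOCK))

# Cascade a 2x2 grid of chips into a 64x64 synaptic matrix; the same
# scheme scales up to the 512x512 network mentioned in the abstract.
W = np.block([[program_chip(), program_chip()],
              [program_chip(), program_chip()]])

def neuron_update(state, threshold=16):
    """Off-chip neurons: sum the synaptic currents flowing through the
    binary matrix and threshold (threshold value is illustrative)."""
    return (W @ state > threshold).astype(int)

state = rng.integers(0, 2, size=2 * BLOCK)
state = neuron_update(state)
```

Keeping the neurons off-chip is what makes the blocks composable: the matrix only routes and sums currents, so matrices from several chips can feed the same neuron inputs.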
The Capacity of the Kanerva Associative Memory is Exponential
P. A. Chou, CA 94305. ABSTRACT The capacity of an associative memory is defined as the maximum number of words that can be stored and retrieved reliably by an address within a given sphere of attraction. It is shown by sphere packing arguments that the capacity grows exponentially as the address length increases. This exponential growth in capacity can actually be achieved by the Kanerva associative memory. Formulas for these optimal values are provided. The exponential growth in capacity for the Kanerva associative memory contrasts sharply with the sub-linear growth in capacity for the Hopfield associative memory.
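The sphere-packing argument can be sketched numerically: count the addresses inside a Hamming sphere of attraction and bound the number of storable words by how many disjoint spheres fit in the address space. The fixed radius fraction rho = 0.1 below is an arbitrary illustrative choice.

```python
from math import comb, log2

def ball(n, r):
    """Number of addresses within Hamming distance r (a sphere of attraction)."""
    return sum(comb(n, k) for k in range(r + 1))

def packing_bound(n, r):
    """Sphere-packing upper bound on the number of storable words:
    disjoint radius-r spheres partition at most 2^n addresses."""
    return 2 ** n // ball(n, r)

# Hold the attraction radius at a fixed fraction of the address length n.
# The bound's exponent rate stays bounded away from zero, i.e. capacity
# is exponential in n (for rho = 0.1 the rate approaches 1 - H(0.1)).
ns = (50, 100, 200, 400)
rates = [log2(packing_bound(n, n // 10)) / n for n in ns]
```

The contrast with the Hopfield memory is visible in this framing: there the number of reliably storable words grows only sub-linearly in the address (pattern) length, not exponentially.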