Neural Information Processing Systems


Simulations Suggest Information Processing Roles for the Diverse Currents in Hippocampal Neurons

Neural Information Processing Systems

Lyle J. Borg-Graham, Harvard-MIT Division of Health Sciences and Technology and Center for Biological Information Processing, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139. ABSTRACT A computer model of the hippocampal pyramidal cell (HPC) is described which integrates data from a variety of sources in order to develop a consistent description for this cell type. The model presently includes descriptions of eleven nonlinear somatic currents of the HPC, and the electrotonic structure of the neuron is modelled with a soma/short-cable approximation. Model simulations qualitatively or quantitatively reproduce a wide range of somatic electrical behavior in HPCs, and demonstrate possible roles for the various currents in information processing. There are several substrates for neuronal computation, including connectivity, synapses, morphometrics of dendritic trees, and linear parameters of the cell membrane, as well as nonlinear, time-varying membrane conductances, also referred to as currents or channels. In the classical description of neuronal function, the contribution of membrane channels is constrained to generating the action potential, setting the firing threshold, and establishing the relationship between (steady-state) stimulus intensity and firing frequency.
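The mechanics of simulating nonlinear membrane currents of this kind can be sketched in a few lines. Below is a minimal single-compartment, Hodgkin-Huxley-style forward-Euler integration in Python; the rate functions and parameters are the textbook squid-axon values, not the paper's eleven HPC currents, so this illustrates the simulation technique rather than the actual model.

```python
import numpy as np

# Minimal sketch: one somatic compartment with Hodgkin-Huxley-style
# currents, integrated with forward Euler. Textbook parameters, not
# the paper's; the structure, not the numbers, is the point.
C_m, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3      # uF/cm^2, mS/cm^2
E_na, E_k, E_l = 50.0, -77.0, -54.4              # mV

def alpha_m(v): return 0.1 * (v + 40) / (1 - np.exp(-(v + 40) / 10))
def beta_m(v):  return 4.0 * np.exp(-(v + 65) / 18)
def alpha_h(v): return 0.07 * np.exp(-(v + 65) / 20)
def beta_h(v):  return 1.0 / (1 + np.exp(-(v + 35) / 10))
def alpha_n(v): return 0.01 * (v + 55) / (1 - np.exp(-(v + 55) / 10))
def beta_n(v):  return 0.125 * np.exp(-(v + 65) / 80)

dt, t_end, i_inj = 0.01, 50.0, 10.0              # ms, ms, uA/cm^2
v, m, h, n = -65.0, 0.05, 0.6, 0.32
trace = []
for _ in range(int(t_end / dt)):
    # Gate kinetics: dx/dt = alpha(v)(1-x) - beta(v)x
    m += dt * (alpha_m(v) * (1 - m) - beta_m(v) * m)
    h += dt * (alpha_h(v) * (1 - h) - beta_h(v) * h)
    n += dt * (alpha_n(v) * (1 - n) - beta_n(v) * n)
    # Membrane equation: injected current minus sum of ionic currents
    i_ion = (g_na * m**3 * h * (v - E_na)
             + g_k * n**4 * (v - E_k)
             + g_l * (v - E_l))
    v += dt * (i_inj - i_ion) / C_m
    trace.append(v)
print(f"peak somatic voltage: {max(trace):.1f} mV")
```

Each additional current in the full model would contribute one more term to `i_ion`, with its own gating variables and kinetics.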


Performance Measures for Associative Memories that Learn and Forget

Neural Information Processing Systems

The McCulloch/Pitts model discussed in [1] was one of the earliest neural network models to be analyzed. Some computational properties of what we call a Hopfield Associative Memory Network (HAMN), similar to the McCulloch/Pitts model, were discussed by Hopfield in [2]. The HAMN can be measured quantitatively by defining and evaluating its information capacity, as [2-6] have shown, but owing to its simplified structure this network fails to exhibit the more complex computational capabilities that neural networks possess. The HAMN belongs to a class of networks which we call static. In static networks the learning and recall procedures are separate.
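For readers unfamiliar with the HAMN, a minimal sketch of Hebbian (outer-product) storage and asynchronous recall is given below; the network size, number of patterns, and corruption level are illustrative choices, not values from the papers cited.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                                  # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product storage with zero self-connections.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(probe, sweeps=10):
    """Asynchronous recall: update units one at a time."""
    s = probe.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern in 10% of its bits, then recall it.
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
overlap = recall(probe) @ patterns[0] / N
print(f"overlap with stored pattern after recall: {overlap:.2f}")
```

Note that storage and recall are separate phases here, which is exactly the "static" property the abstract refers to.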


A 'Neural' Network that Learns to Play Backgammon

Neural Information Processing Systems

QUALITATIVE RESULTS Analysis of the weights produced by training a network is an exceedingly difficult problem, which we have only been able to approach qualitatively. In Figure 1 we present a diagram showing the connection strengths in a network with 651 input units and no hidden units.


Synchronization in Neural Nets

Neural Information Processing Systems

This type of communication contrasts with that assumed in most other models, which are typically continuous or discrete value-passing networks. Limiting the messages received by each processing unit to time markers that signal the firing of other units presents significant implementation advantages.
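A minimal sketch of what such time-marker communication might look like follows, assuming a toy ring of integrate-to-threshold units and a global event queue; the connectivity, delay, and threshold values are invented for illustration and do not come from the paper.

```python
import heapq

# Sketch of "time-marker" communication: units exchange only the times
# at which they fire, via a global event queue, rather than passing
# continuous values. All parameters below are illustrative assumptions.
N = 5
weights = {(i, (i + 1) % N): 1.0 for i in range(N)}   # a simple ring
delay, threshold, t_end = 1.0, 1.0, 20.0

potential = [0.0] * N
events = [(0.0, 0)]                                   # seed: unit 0 fires at t=0
while events:
    t, src = heapq.heappop(events)
    if t > t_end:
        break
    print(f"t={t:4.1f}: unit {src} fires")
    potential[src] = 0.0                              # reset after firing
    for (i, j), w in weights.items():
        if i == src:
            potential[j] += w                         # deliver the marker
            if potential[j] >= threshold:
                heapq.heappush(events, (t + delay, j))
```

The implementation advantage the abstract mentions is visible even in this toy: nothing is transmitted between firing times, so communication is sparse.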


PARTITIONING OF SENSORY DATA BY A CORTICAL NETWORK

Neural Information Processing Systems

Two hundred layer II cells are used with 100 input (LOT) lines and 200 collateral axons; both the LOT and collateral axons flow caudally. LOT axons connect with rostral dendrites with a probability of 0.2, which decreases linearly to 0.05 by the caudal end of the model. The connectivity is arranged randomly, subject to the constraint that the number of contacts for axons and dendrites is fixed within certain narrow boundaries (in the most severe case, each axon forms 20 synapses and each dendrite receives 20 contacts). The resulting matrix is thus hypergeometric in both dimensions. There are 20 simulated inhibitory interneurons, such that the layer II cells are arranged in 20 overlapping patches, each within the influence of one such inhibitory cell.
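This connectivity construction can be approximated in a few lines. The sketch below draws each LOT axon's 20 contacts with probability proportional to the rostro-caudal gradient described above; it does not enforce the narrow boundaries on per-dendrite contact counts, so it is a simplification rather than the exact doubly-constrained (hypergeometric) construction.

```python
import numpy as np

rng = np.random.default_rng(1)
n_lot, n_cells, syn_per_axon = 100, 200, 20

# Rostro-caudal gradient of contact probability: 0.2 at the rostral end
# falling linearly to 0.05 caudally, normalized for sampling.
p = np.linspace(0.2, 0.05, n_cells)
p = p / p.sum()

# Each axon forms exactly 20 synapses, drawn without replacement.
C = np.zeros((n_lot, n_cells), dtype=int)
for axon in range(n_lot):
    targets = rng.choice(n_cells, size=syn_per_axon, replace=False, p=p)
    C[axon, targets] = 1

print("contacts per axon:", C.sum(axis=1)[:5])
print("mean contacts per dendrite, rostral vs caudal:",
      C[:, :50].sum() / 50, C[:, -50:].sum() / 50)
```

The printout shows the intended anisotropy: rostral dendrites receive several times more LOT contacts than caudal ones.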


Spatial Organization of Neural Networks: A Probabilistic Modeling Approach

Neural Information Processing Systems

ABSTRACT The aim of this paper is to explore the spatial organization of neural networks under Markovian assumptions concerning the behaviour of individual cells and the interconnection mechanism. Spatial organizational properties of neural nets are very relevant in image modeling and pattern analysis, where spatial computations on stochastic two-dimensional image fields are involved. As a first approach we develop a random neural network model, based upon simple probabilistic assumptions, whose organization is studied by means of discrete-event simulation. We then investigate the possibility of approximating the random network's behaviour by using an analytical approach originating from the theory of general product-form queueing networks. The neural network is described by an open network of nodes, in which customers moving from node to node represent stimulations, and connections between nodes are expressed in terms of suitably selected routing probabilities. We obtain the solution of the model under different disciplines affecting the time spent by a stimulation at each node visited.
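The product-form analysis alluded to can be illustrated with the standard traffic equations of an open (Jackson) queueing network; the arrival rates, routing matrix, and service rates below are invented test values, and the paper's specific service disciplines are not modelled.

```python
import numpy as np

# Open-network view: stimulations arrive from outside at rate gamma[i],
# are routed between nodes with probabilities R[i][j], and leave with
# the residual probability. Solving lambda = gamma + R^T lambda gives
# each node's throughput; for a product-form network the M/M/1
# marginals then follow directly. All numbers are illustrative.
gamma = np.array([1.0, 0.5, 0.0])          # external arrival rates
R = np.array([[0.0, 0.6, 0.2],             # routing probabilities
              [0.1, 0.0, 0.5],
              [0.3, 0.3, 0.0]])
mu = np.array([4.0, 3.0, 2.0])             # service rates per node

lam = np.linalg.solve(np.eye(3) - R.T, gamma)   # traffic equations
rho = lam / mu                                  # utilisation per node
assert (rho < 1).all(), "network must be stable"
print("throughputs:", lam.round(3))
print("mean stimulations per node (M/M/1):", (rho / (1 - rho)).round(3))
```

The appeal of the product-form approach, as the abstract suggests, is that these stationary quantities come from a small linear system rather than a full discrete-event simulation.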


Computing Motion Using Resistive Networks

Neural Information Processing Systems

We open our eyes and we "see" the world in all its color, brightness, and movement. Yet we have great difficulty when trying to endow our machines with similar abilities. In this paper we shall describe recent developments in the theory of early vision which lead from the formulation of the motion problem as an ill-posed one to its solution by minimizing certain "cost" functions. These cost or energy functions can be mapped onto simple analog and digital resistive networks. Thus, we shall see how the optical flow can be computed by injecting currents into resistive networks and recording the resulting stationary voltage distribution at each node. These networks can be implemented in CMOS VLSI circuits and represent plausible candidates for biological vision systems. APERTURE PROBLEM AND SMOOTHNESS ASSUMPTION In this study, we use intensity-based schemes for recovering motion.
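The mapping from cost function to resistive network can be illustrated in one dimension. The sketch below minimizes a data-plus-smoothness cost by Jacobi relaxation, which plays the role of the network settling to its stationary voltage distribution; the image gradients and smoothness weight are made-up test data, not the paper's formulation.

```python
import numpy as np

# Minimize  sum_i (Ix[i]*u[i] + It[i])^2 + lam * sum_i (u[i+1]-u[i])^2.
# The data term acts like a conductance pulling each node toward the
# "battery" voltage -It/Ix; the smoothness term acts like resistors
# between neighbouring nodes. Relaxation mimics the network settling.
n, lam = 50, 10.0
rng = np.random.default_rng(2)
Ix = rng.uniform(0.5, 1.5, n)              # spatial image gradient
It = -Ix * 2.0                             # temporal gradient for true u = 2

u = np.zeros(n)
for _ in range(2000):
    left = np.roll(u, 1);  left[0] = u[0]      # reflecting boundaries
    right = np.roll(u, -1); right[-1] = u[-1]
    # Stationarity: (Ix^2 + 2*lam) u = -Ix*It + lam*(left + right)
    u = (-Ix * It + lam * (left + right)) / (Ix**2 + 2 * lam)
print(f"recovered flow (should be near 2): {u.mean():.3f}")
```

In the analog implementation this iteration is free: the physics of the resistive grid performs the minimization, and one simply reads off the node voltages.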


A Mean Field Theory of Layer IV of Visual Cortex and Its Application to Artificial Neural Networks

Neural Information Processing Systems

ABSTRACT A single cell theory for the development of selectivity and ocular dominance in visual cortex has been presented previously by Bienenstock, Cooper and Munro [1]. This has been extended to a network applicable to layer IV of visual cortex [2]. In this paper we present a mean field approximation that captures in a fairly transparent manner the qualitative, and many of the quantitative, results of the network theory. Finally, we consider the application of this theory to artificial neural networks and show that a significant reduction in architectural complexity is possible. A SINGLE LAYER NETWORK AND THE MEAN FIELD APPROXIMATION We consider a single layer network in which cells receive signals from outside the layer (Figure 1).
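The underlying single-cell rule can be sketched as follows, assuming the standard BCM form with a sliding threshold that tracks the average squared output; the input patterns, learning rate, and averaging constant are illustrative, and the mean-field network reduction itself is not implemented here.

```python
import numpy as np

# Sketch of the BCM modification rule: output c = m.d, weight change
# dm = eta * phi(c, theta) * d with phi(c, theta) = c*(c - theta), and
# a sliding threshold theta tracking the running average of c^2.
rng = np.random.default_rng(3)
patterns = np.eye(4) + 0.1 * rng.random((4, 4))   # four input "directions"
m = rng.random(4) * 0.1                           # synaptic weights
theta, eta, tau = 0.0, 0.02, 0.995

for step in range(20000):
    d = patterns[rng.integers(4)]
    c = m @ d                                     # cell's output
    phi = c * (c - theta)                         # BCM phi-function
    m += eta * phi * d
    theta = tau * theta + (1 - tau) * c**2        # sliding threshold

print("responses after learning:", (patterns @ m).round(2))
```

Run long enough, the cell tends toward a selective state, responding strongly to one pattern and weakly to the rest, which is the selectivity result the network and mean field theories generalize.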


Speech Recognition Experiments with Perceptrons

Neural Information Processing Systems

This paper looks at two more difficult vocabularies: the alphabetic E-set and a set of polysyllabic words. The E-set is difficult because it contains weak discriminants, and polysyllables are difficult because of timing variation. Polysyllabic word recognition is aided by a time pre-alignment technique based on dynamic programming, and E-set recognition is improved by focusing attention. Recognition accuracies are better than 98% for both vocabularies when implemented with a single layer perceptron. INTRODUCTION Artificial neural networks perform well on simple pattern recognition tasks.
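The dynamic-programming pre-alignment is presumably in the family of dynamic time warping; a minimal DTW sketch on toy 1-D feature sequences is shown below, with the sequences and local distance chosen purely for illustration.

```python
import numpy as np

def dtw(a, b):
    """Classic dynamic time warping distance between two sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

template = np.sin(np.linspace(0, 3, 30))
utterance = np.sin(np.linspace(0, 3, 45))   # "same word", spoken slower
other = np.cos(np.linspace(0, 3, 45))       # a different word
print("distance to same word:  ", round(dtw(template, utterance), 3))
print("distance to other word: ", round(dtw(template, other), 3))
```

The point of pre-alignment is that, once timing variation is warped out, a fixed-length input can be handed to the single layer perceptron.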


Mathematical Analysis of Learning Behavior of Neuronal Models

Neural Information Processing Systems

John Y. Cheung and Massoud Omidvar, School of Electrical Engineering and Computer Science. ABSTRACT In this paper, we wish to analyze the convergence behavior of a number of neuronal plasticity models. Recent neurophysiological research suggests that neuronal behavior is adaptive. In particular, memory stored within a neuron is associated with the synaptic weights, which are varied or adjusted to achieve learning. A number of adaptive neuronal models have been proposed in the literature. Three specific models will be analyzed in this paper: the Hebb model, the Sutton-Barto model, and the most recent trace model.
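For concreteness, a sketch of the first two of these update rules, in their standard literature forms, follows; the pairing protocol, rates, and trace constants are illustrative assumptions, and the trace model is omitted.

```python
import numpy as np

# Two plasticity rules in their standard forms:
#   Hebb:         dw = eta * x * y
#   Sutton-Barto: dw = eta * x_bar * (y - y_bar), with an eligibility
#                 trace x_bar and a prediction trace y_bar.
rng = np.random.default_rng(4)
T, eta, alpha = 500, 0.05, 0.9
x = (rng.random(T) < 0.2).astype(float)       # presynaptic events
y = np.roll(x, 1)                             # postsynaptic follows by one step

w_hebb, w_sb = 0.0, 0.0
x_bar, y_bar = 0.0, 0.0
for t in range(T):
    w_hebb += eta * x[t] * y[t]
    w_sb += eta * x_bar * (y[t] - y_bar)
    x_bar = alpha * x_bar + x[t]              # eligibility trace
    y_bar = alpha * y_bar + (1 - alpha) * y[t]
print(f"Hebb weight: {w_hebb:.3f}, Sutton-Barto weight: {w_sb:.3f}")
```

Convergence analysis of rules like these amounts to studying the fixed points and stability of the weight dynamics under a given input statistics, which is the subject of the paper.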