Cholinergic Modulation May Enhance Cortical Associative Memory Function

Neural Information Processing Systems

James M. Bower, Computation and Neural Systems, Caltech 216-76, Pasadena, CA 91125

Combining neuropharmacological experiments with computational modeling, we have shown that cholinergic modulation may enhance associative memory function in piriform (olfactory) cortex. Our experiments show that the acetylcholine analogue carbachol selectively suppresses synaptic transmission between cells within piriform cortex, while leaving input connections unaffected. When tested in a computational model of piriform cortex, this selective suppression, applied during learning, enhances associative memory performance.
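The learning regime described above can be illustrated with a minimal Hopfield-style associative memory: patterns are imprinted by Hebbian learning from the afferent input alone (as if recurrent transmission were suppressed), and recurrent dynamics are used only at recall. This is a generic sketch, not the authors' piriform cortex model; all sizes and parameters below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_patterns = 100, 3
patterns = rng.choice([-1, 1], size=(n_patterns, n))

# Hebbian learning on the intrinsic (recurrent) weights. Suppressing
# recurrent transmission during learning means each pattern is imprinted
# from the clamped afferent input alone, not from the network's own
# recalled state, avoiding interference from spurious recall.
W = np.zeros((n, n))
for p in patterns:
    W += np.outer(p, p) / n
np.fill_diagonal(W, 0.0)

def recall(cue, steps=10):
    """Iterate the recurrent dynamics (active only at recall time)."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

# Corrupt a stored pattern, then recover it through recurrent recall.
cue = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
cue[flip] *= -1
out = recall(cue)
print(np.mean(out == patterns[0]))  # fraction of bits recovered
```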


Qualitative structure from motion

Neural Information Processing Systems

I have presented a qualitative approach to the problem of recovering object structure from motion information and discussed some of its computational, psychophysical, and implementational aspects. The computation of qualitative shape, as represented by the sign of the Gaussian curvature, can be performed by a field of simple operators, in parallel over the entire image. The performance of a qualitative shape detection module, implemented by an artificial neural network, appears to be similar to the performance of human subjects in an identical task.
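The sign of the Gaussian curvature mentioned above is indeed computable by purely local operators: for a depth map z(x, y), K = (z_xx z_yy − z_xy²) / (1 + z_x² + z_y²)², and since the denominator is positive, sign(K) depends only on the Hessian determinant, which finite differences recover locally. A small sketch (assuming a depth map as input, which is a simplification of the paper's motion-based setting):

```python
import numpy as np

def gaussian_curvature_sign(z):
    """Sign of Gaussian curvature for a depth map z, via local
    finite-difference operators (uniform grid spacing; only the
    sign is needed, so the positive scale factors cancel)."""
    zy, zx = np.gradient(z)          # first derivatives (rows = y, cols = x)
    zxy, zxx = np.gradient(zx)       # second derivatives of zx
    zyy, _ = np.gradient(zy)
    return np.sign(zxx * zyy - zxy**2)

x, y = np.meshgrid(np.linspace(-1, 1, 32), np.linspace(-1, 1, 32))
bowl = x**2 + y**2      # elliptic surface: K > 0
saddle = x**2 - y**2    # hyperbolic surface: K < 0
print(gaussian_curvature_sign(bowl)[16, 16],
      gaussian_curvature_sign(saddle)[16, 16])
```

Because each output pixel depends only on a small neighbourhood, the same computation maps directly onto a parallel field of identical operators, as the abstract describes.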


RecNorm: Simultaneous Normalisation and Classification applied to Speech Recognition

Neural Information Processing Systems

A particular form of neural network is described, which has terminals for acoustic patterns, class labels and speaker parameters. A method of training this network to "tune in" the speaker parameters to a particular speaker is outlined, based on a trick for converting a supervised network to an unsupervised mode. We describe experiments using this approach in isolated word recognition based on whole-word hidden Markov models. The results indicate an improvement over speaker-independent performance and, for unlabelled data, a performance close to that achieved on labelled data.

1 INTRODUCTION

We are concerned with emulating some aspects of perception: in particular, the way that a stimulus which is ambiguous, perhaps because of unknown lighting conditions, can become unambiguous in the context of other such stimuli. The fact that they are all subject to the same unknown conditions gives our perceptual apparatus enough constraints to solve the problem.


VLSI Implementation of TInMANN

Neural Information Processing Systems

A massively parallel, all-digital, stochastic architecture, TInMANN, is described which performs competitive and Kohonen types of learning. A VLSI design is shown for a TInMANN neuron which fits within a small, inexpensive MOSIS TinyChip frame, yet which can be used to build larger networks of several hundred neurons. The neuron operates at a speed of 15 MHz, which allows the network to process 290,000 training examples per second. Use of level-sensitive scan logic provides the chip with 100% fault coverage, permitting very reliable neural systems to be built.
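The quoted throughput figures are consistent with each other: at a 15 MHz clock and 290,000 examples per second, each training example takes roughly 52 clock cycles. A quick check:

```python
# Sanity-check the TInMANN throughput figures from the abstract.
clock_hz = 15e6
examples_per_s = 290_000
cycles_per_example = clock_hz / examples_per_s
print(round(cycles_per_example, 1))  # roughly 52 cycles per training example
```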


A Model of Distributed Sensorimotor Control in the Cockroach Escape Turn

Neural Information Processing Systems

In response to a puff of wind, the American cockroach turns away and runs. The circuit underlying the initial turn of this escape response consists of three populations of individually identifiable nerve cells and appears to employ distributed representations in its operation. We have reconstructed several neuronal and behavioral properties of this system using simplified neural network models and the backpropagation learning algorithm constrained by known structural characteristics of the circuitry. In order to test and refine the model, we have also compared the model's responses to various lesions with the insect's responses to similar lesions.


How Receptive Field Parameters Affect Neural Learning

Neural Information Processing Systems

Omohundro, ICSI, 1947 Center St., Suite 600, Berkeley, CA 94704

We identify the three principal factors affecting the performance of learning by networks with localized units: unit noise, sample density, and the structure of the target function. We then analyze the effect of unit receptive field parameters on these factors and use this analysis to propose a new learning algorithm which dynamically alters receptive field properties during learning.


Analog Computation at a Critical Point: A Novel Function for Neuronal Oscillations?

Neural Information Processing Systems

Static correlations among spike trains obtained from simulations of large arrays of cells are in agreement with the predictions from these Hamiltonians, and dynamic correlations display



EMPATH: Face, Emotion, and Gender Recognition Using Holons

Neural Information Processing Systems

The network is trained simply to reproduce its input, and so can be seen as a nonlinear version of Kohonen's (1977) auto-associator. However, it must pass the input through a narrow channel of hidden units, so it must extract regularities from the input during learning. Empirical analysis of the trained network showed that the hidden units span the principal subspace of the image vectors, with some noise on the components due to network nonlinearity (Cottrell & Munro, 1988).
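The principal-subspace property described above can be demonstrated on toy data. In the linear case, an auto-associator with a k-unit bottleneck trained to reproduce its input converges to the top-k principal subspace; the sketch below takes the SVD shortcut to that solution rather than running gradient descent through a trained nonlinear network as in the paper, and all dimensions are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "images": 200 vectors lying near a 5-dimensional subspace of R^50.
basis = rng.normal(size=(5, 50))
X = rng.normal(size=(200, 5)) @ basis + 0.01 * rng.normal(size=(200, 50))
X -= X.mean(axis=0)

# A linear auto-associator with a 5-unit bottleneck converges to the
# principal subspace; the SVD gives that subspace directly.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
principal = Vt[:5]                    # top 5 principal directions

# Reconstruction through the 5-dimensional "narrow channel".
X_hat = X @ principal.T @ principal
err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(err)  # small: the bottleneck captures almost all of the variance
```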