A Method for the Associative Storage of Analog Vectors

Neural Information Processing Systems

A method for storing analog vectors in Hopfield's continuous feedback model is proposed. By analog vectors we mean vectors whose components are real-valued. The vectors to be stored are set as equilibria of the network. The network model consists of one layer of visible neurons and one layer of hidden neurons. We propose a learning algorithm that adjusts the positions of the equilibria and guarantees their stability.
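
As a toy illustration of the storage principle described above (not the paper's learning algorithm), the sketch below pins an arbitrary real-valued vector as a stable equilibrium of continuous Hopfield dynamics dx/dt = -x + W tanh(x) + b by solving for the bias term. The network size, weight scale, and integration scheme are all illustrative assumptions.

```python
# Minimal sketch, assuming weak random coupling: store an analog vector as a
# stable equilibrium of continuous Hopfield dynamics. Not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
n = 8
target = rng.uniform(-0.5, 0.5, n)          # the analog (real-valued) vector to store

W = 0.1 * rng.standard_normal((n, n))       # weak coupling keeps the equilibrium stable
b = target - W @ np.tanh(target)            # solve for bias so target is an equilibrium

x = target + 0.2 * rng.standard_normal(n)   # start from a corrupted version
dt = 0.01
for _ in range(20000):                      # Euler-integrate dx/dt = -x + W*tanh(x) + b
    x += dt * (-x + W @ np.tanh(x) + b)

print(np.max(np.abs(x - target)))           # converges back to the stored vector
```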


Neural Network Analysis of Distributed Representations of Dynamical Sensory-Motor Transformations in the Leech

Neural Information Processing Systems

Shawn R. Lockery, Yan Fang, and Terrence J. Sejnowski, Computational Neurobiology Laboratory, Salk Institute for Biological Studies, Box 85800, San Diego, CA 92138. ABSTRACT: Interneurons in leech ganglia receive multiple sensory inputs and make synaptic contacts with many motor neurons. These "hidden" units coordinate several different behaviors. We used physiological and anatomical constraints to construct a model of the local bending reflex. Dynamical networks were trained on experimentally derived input-output patterns using recurrent back-propagation. Units in the model were modified to include electrical synapses and multiple synaptic time constants.
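
A minimal sketch of the kind of unit such dynamical networks are built from: a leaky unit whose input arrives through synapses with two different time constants, as the abstract mentions. The function name, weights, and time constants here are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch: one unit driven through a fast and a slow synaptic filter.
import numpy as np

def simulate(inputs, w_fast, w_slow, tau_fast=5.0, tau_slow=50.0, dt=1.0):
    """One unit; each input sample is filtered by a fast and a slow synapse."""
    s_fast = s_slow = 0.0
    trace = []
    for u in inputs:
        s_fast += dt / tau_fast * (u - s_fast)   # fast synaptic time constant
        s_slow += dt / tau_slow * (u - s_slow)   # slow synaptic time constant
        trace.append(np.tanh(w_fast * s_fast + w_slow * s_slow))
    return np.array(trace)

pulse = np.r_[np.ones(20), np.zeros(180)]        # brief sensory input
out = simulate(pulse, w_fast=2.0, w_slow=-1.5)   # fast excitation, slow opposition
print(out.argmax(), out.min())                   # early transient peak, later undershoot
```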


Associative Memory in a Simple Model of Oscillating Cortex

Neural Information Processing Systems

A generic model of oscillating cortex, which assumes "minimal" coupling justified by known anatomy, is shown to function as an associative memory, using previously developed theory. The network has explicit excitatory neurons with local inhibitory interneuron feedback that forms a set of nonlinear oscillators coupled only by long-range excitatory connections. Using a local Hebb-like learning rule for primary and higher-order synapses at the ends of the long-range connections, the system learns to store the kinds of oscillation amplitude patterns observed in olfactory and visual cortex. This rule is derived from a more general "projection algorithm" for recurrent analog networks that analytically guarantees content-addressable memory storage of continuous periodic sequences - capacity: N/2 Fourier components for an N-node network - with no "spurious" attractors.

1 Introduction

This is a sketch of recent results stemming from work which is discussed completely in [1, 2, 3]. Patterns of 40 to 80 Hz oscillation have been observed in the large-scale activity of olfactory cortex [4] and visual neocortex [5], and shown to predict the olfactory and visual pattern recognition responses of a trained animal.
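
The sketch below conveys the flavor of such a model (it is not the paper's projection algorithm): a set of coupled nonlinear oscillators whose long-range coupling is a Hebb-like outer product of a stored amplitude pattern, so that the settled oscillation amplitudes track the stored pattern. All parameters are assumptions for illustration.

```python
# Illustrative sketch: Stuart-Landau oscillators with Hebb-like coupling store
# an oscillation-amplitude pattern. Not the paper's projection algorithm.
import numpy as np

rng = np.random.default_rng(1)
n = 16
pattern = rng.uniform(0.5, 1.5, n)        # amplitude pattern to store
J = np.outer(pattern, pattern) / n        # Hebb-like long-range coupling
np.fill_diagonal(J, 0.0)

z = 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))  # oscillator states
omega, dt = 2 * np.pi * 0.05, 0.1
for _ in range(5000):
    # each oscillator relaxes onto a limit cycle, biased by the Hebbian coupling
    dz = (1j * omega + 1.0 - np.abs(z) ** 2) * z + J @ z
    z += dt * dz

print(np.corrcoef(np.abs(z), pattern)[0, 1])   # amplitudes correlate with the pattern
```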


Non-Boltzmann Dynamics in Networks of Spiking Neurons

Neural Information Processing Systems

We study networks of spiking neurons in which spikes are fired as a Poisson process. The state of a cell is determined by the instantaneous firing rate, and in the limit of high firing rates our model reduces to that studied by Hopfield. We find that the inclusion of spiking results in several new features, such as a noise-induced asymmetry between "on" and "off" states of the cells and probability currents which destroy the usual description of network dynamics in terms of energy surfaces. Taking account of spikes also allows us to calibrate network parameters such as "synaptic weights" against experiments on real synapses. Realistic forms of the postsynaptic response alter the network dynamics, which suggests a novel dynamical learning mechanism.
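
A minimal simulation of the setup described above, under assumed parameters: cells emit spikes as Bernoulli-approximated Poisson events at a rate set by their synaptic input, and the spikes, rather than the smooth rates, drive the recurrence, which is the source of the extra fluctuations; in expectation the update recovers smooth rate dynamics.

```python
# Illustrative sketch: Poisson-spiking network whose mean dynamics match a
# rate model. All parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n, dt, steps = 50, 1e-3, 2000          # 50 cells, 1 ms steps, 2 s of activity
tau, r_max = 0.02, 100.0               # synaptic time constant (s), peak rate (Hz)
W = rng.standard_normal((n, n)) / np.sqrt(n)
np.fill_diagonal(W, 0.0)

u = np.zeros(n)                        # synaptic input state of each cell
for _ in range(steps):
    rate = r_max / (1.0 + np.exp(-u))                   # instantaneous firing rate
    spikes = (rng.random(n) < rate * dt).astype(float)  # Poisson spikes in window dt
    # spikes, not smooth rates, drive the recurrence: the source of the noise
    u += dt / tau * (-u) + (W @ spikes) / (tau * r_max)

print(rate.mean(), rate.std())         # population rate and its spread
```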


A Neural Network for Feature Extraction

Neural Information Processing Systems

The paper suggests a statistical framework for the parameter estimation problem associated with unsupervised learning in a neural network, leading to an exploratory projection pursuit network that performs feature extraction, or dimensionality reduction.
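
A hedged sketch of exploratory projection pursuit in this spirit (the paper's projection index may differ): a single unit performs gradient ascent on an excess-kurtosis-style index of its projection, so its weight vector rotates toward the most non-Gaussian direction in the data.

```python
# Illustrative sketch: one unit finds a non-Gaussian ("interesting") projection
# by gradient ascent on a kurtosis-based index. Not the paper's exact index.
import numpy as np

rng = np.random.default_rng(3)
n, d = 5000, 10
X = rng.standard_normal((n, d))        # Gaussian background directions
X[:, 0] = rng.laplace(size=n)          # one heavy-tailed, "interesting" direction
X = (X - X.mean(0)) / X.std(0)         # standardize each coordinate

w = rng.standard_normal(d)
w /= np.linalg.norm(w)
lr = 0.1
for _ in range(500):
    y = X @ w
    # gradient of the index E[y^4] - 3 E[y^2]^2 (an excess-kurtosis measure)
    g = 4 * (X * (y ** 3)[:, None]).mean(0) \
        - 12 * (y ** 2).mean() * (X * y[:, None]).mean(0)
    w += lr * g
    w /= np.linalg.norm(w)             # keep the projection on the unit sphere

print(abs(w[0]))                       # weight mass concentrates on axis 0
```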


The Cocktail Party Problem: Speech/Data Signal Separation Comparison between Backpropagation and SONN

Neural Information Processing Systems

Parallel Distributed Structures Laboratory, School of Electrical Engineering, Purdue University, W. Lafayette, IN 47907. Christoph Schaefers. ABSTRACT: This work introduces a new method called the Self Organizing Neural Network (SONN) algorithm and compares its performance with Back Propagation in a signal separation application. The problem is to separate two signals, a modem data signal and a male speech signal, added and transmitted through a 4 kHz channel. The signals are sampled at 8 kHz, and using supervised learning, an attempt is made to reconstruct them. The SONN is an algorithm that constructs its own network topology during training, which is shown to be much smaller than the BP network, faster to train, and free from the trial-and-error network design that characterizes BP.

1. INTRODUCTION

The research in Neural Networks has witnessed major changes in algorithm design focus, motivated by the limitations perceived in the algorithms available at the time. With the extensive work performed in the last few years using multilayered networks, it was soon discovered that these networks present limitations: (a) it is difficult to determine problem complexity a priori, and thus to design a network of the correct size; (b) training not only takes prohibitively long, but also requires a large number of samples and fine parameter adjustment, without guarantee of convergence; (c) such networks do not handle the system identification task efficiently for systems whose time-varying structure changes radically; and (d) the trained network is little more than a black box of weights and connections, revealing little about the problem structure; it is hard to justify the weights chosen or to explain the output decisions for a given input vector.
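
To make the task concrete, here is a small stand-in experiment (synthetic signals and a plain back-propagation network, not the SONN): a mixture of a speech-like modulated tone and a modem-like square wave sampled at 8 kHz is windowed, and the network is trained to reconstruct the current speech sample from the mixed window. Signal definitions and network sizes are assumptions.

```python
# Illustrative sketch of the separation task with synthetic stand-in signals
# and a tiny backprop network (not the SONN).
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(8000) / 8000.0                              # one second at 8 kHz
speech = np.sin(2 * np.pi * 3 * t) * np.sin(2 * np.pi * 200 * t)  # modulated tone
modem = np.sign(np.sin(2 * np.pi * 1200 * t + 1.0))       # square-wave "data" carrier
mixed = speech + modem

win = 16                                                   # input window length
X = np.lib.stride_tricks.sliding_window_view(mixed, win)  # (7985, 16) windows
y = speech[win - 1:]                                       # target: current speech sample

W1 = 0.1 * rng.standard_normal((win, 20)); b1 = np.zeros(20)
W2 = 0.1 * rng.standard_normal(20); b2 = 0.0
lr = 0.01
for _ in range(50):                                        # plain stochastic backprop
    for i in rng.permutation(len(X))[:2000]:
        h = np.tanh(X[i] @ W1 + b1)
        err = h @ W2 + b2 - y[i]                           # prediction error
        gh = err * W2 * (1 - h ** 2)                       # backprop through tanh
        W2 -= lr * err * h; b2 -= lr * err
        W1 -= lr * np.outer(X[i], gh); b1 -= lr * gh

pred = np.tanh(X @ W1 + b1) @ W2 + b2
print(np.mean((pred - y) ** 2))                            # speech reconstruction MSE
```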


Performance of Connectionist Learning Algorithms on 2-D SIMD Processor Arrays

Neural Information Processing Systems

The mapping of the back-propagation and mean field theory learning algorithms onto a generic 2-D SIMD computer is described. This architecture proves to be well suited to these applications, since efficiencies close to the optimum can be attained. Expressions to find the learning rates are given and then particularized to the DAP array processor.
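
The heart of such a mapping is doing the weight-matrix arithmetic block-parallel on the processor grid. The sketch below simulates, in NumPy, a forward-pass matrix-vector product laid out one block per processor, with a column broadcast of activations and a row-wise reduction of partial sums; the grid size and block shapes are illustrative, not the DAP's actual dimensions.

```python
# Illustrative sketch: block-parallel matvec on a simulated P x P SIMD grid.
import numpy as np

P = 4                                         # P x P processor grid
n = 16                                        # layer width (divisible by P)
rng = np.random.default_rng(5)
W = rng.standard_normal((n, n))
x = rng.standard_normal(n)

blocks = W.reshape(P, n // P, P, n // P).transpose(0, 2, 1, 3)  # blocks[r][c]
xb = x.reshape(P, n // P)                     # activation slice broadcast to column c

partial = np.zeros((P, P, n // P))
for r in range(P):
    for c in range(P):                        # all processors work in lockstep
        partial[r, c] = blocks[r, c] @ xb[c]  # purely local block matvec

y = partial.sum(axis=1).reshape(n)            # row-wise reduction across the grid
print(np.allclose(y, W @ x))                  # matches the serial computation
```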




A Reconfigurable Analog VLSI Neural Network Chip

Neural Information Processing Systems

The distributed-neuron synapses are arranged in blocks of 16, which we call '4 x 4 tiles'. Switch matrices are interleaved between each of these tiles to provide programmability of interconnections. With a small area overhead (15%), the 1024 units of the network can be rearranged in various configurations. Some of the possible configurations are a 12-32-12 network, a 16-12-12-16 network, two 12-32 networks, etc. (the numbers separated by dashes indicate the number of units per layer, including the input layer). Weights are stored in analog form on MOS capacitors.
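
A hypothetical helper (not a tool provided with the chip) that checks whether a layered configuration fits within the 1024 units when resources are counted in whole 16-unit tiles; the assumption that one unit serves each connection between adjacent layers, and the tile-granularity rounding, are ours for illustration.

```python
# Hypothetical configuration check under assumed resource-counting rules.
import math

TILE = 16          # units per 4 x 4 tile
TOTAL = 1024       # units on the chip

def fits(*layers):
    """layers: units per layer, input layer included, e.g. fits(12, 32, 12)."""
    # assumption: one unit per connection between adjacent layers
    needed = sum(a * b for a, b in zip(layers, layers[1:]))
    tiles = math.ceil(needed / TILE)          # switches route at tile granularity
    return needed, tiles, tiles * TILE <= TOTAL

for cfg in [(12, 32, 12), (16, 12, 12, 16), (12, 32)]:
    print(cfg, fits(*cfg))
```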