Note on Development of Modularity in Simple Cortical Models
Chernjavsky, Alex, Moody, John E.
We show that localized activity patterns in a layer of cells, collective excitations, can induce the formation of modular structures in the anatomical connections via a Hebbian learning mechanism. The networks are spatially homogeneous before learning, but the spontaneous emergence of localized collective excitations and subsequently modularity in the connection patterns breaks translational symmetry. This spontaneous symmetry breaking phenomenon is similar to those which drive pattern formation in reaction-diffusion systems. We have identified requirements on the patterns of lateral connections and on the gains of internal units which are essential for the development of modularity. These essential requirements will most likely remain operative when more complicated (and biologically realistic) models are considered.
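As a rough illustration of the mechanism described above (a minimal sketch, not the authors' model; the bump centres, width, and learning rate are assumptions), the following Python fragment applies a Hebbian update driven by localized activity bumps to an initially homogeneous weight matrix and lets block-like modules emerge:

# Minimal illustrative sketch: Hebbian updates driven by localized activity
# bumps. The bump centres are drawn from a small fixed set, standing in for
# the preferred locations that the collective excitations settle on after
# symmetry breaking; correlated local activity then carves the initially
# uniform weights into block-like modules.
import numpy as np

rng = np.random.default_rng(0)
n_cells, width, eta = 64, 3.0, 0.01
positions = np.arange(n_cells)
W = np.ones((n_cells, n_cells)) / n_cells     # spatially homogeneous start

bump_centres = [8, 24, 40, 56]                # assumed post-symmetry-breaking sites
for _ in range(2000):
    c = rng.choice(bump_centres)
    act = np.exp(-0.5 * ((positions - c) / width) ** 2)   # localized excitation
    W += eta * np.outer(act, act)                         # Hebb: co-active cells bind
    W /= W.sum(axis=1, keepdims=True)                     # keep rows normalized

# Cells belonging to the same bump now concentrate their weight on each other,
# i.e. the connection matrix has developed a modular block structure.
print((W > 2.0 / n_cells).sum(), "strong connections out of", n_cells * n_cells)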
Neural Implementation of Motivated Behavior: Feeding in an Artificial Insect
Beer, Randall D., Chiel, Hillel J.
Most complex behaviors appear to be governed by internal motivational states or drives that modify an animal's responses to its environment. It is therefore of considerable interest to understand the neural basis of these motivational states. Drawing upon work on the neural basis of feeding in the marine mollusc Aplysia, we have developed a heterogeneous artificial neural network for controlling the feeding behavior of a simulated insect. We demonstrate that feeding in this artificial insect shares many characteristics with the motivated behavior of natural animals. 1 INTRODUCTION While an animal's external environment certainly plays an extremely important role in shaping its actions, the behavior of even simpler animals is by no means solely reactive. The response of an animal to food, for example, cannot be explained only in terms of the physical stimuli involved. On two different occasions, the very same animal may behave in completely different ways when presented with seemingly identical pieces of food (e.g.
Optimal Brain Damage
LeCun, Yann, Denker, John S., Solla, Sara A.
We have used information-theoretic ideas to derive a class of practical and nearly optimal schemes for adapting the size of a neural network. By removing unimportant weights from a network, several improvements can be expected: better generalization, fewer training examples required, and improved speed of learning and/or classification. The basic idea is to use second-derivative information to make a tradeoff between network complexity and training set error. Experiments confirm the usefulness of the methods on a real-world application. 1 INTRODUCTION Most successful applications of neural network learning to real-world problems have been achieved using highly structured networks of rather large size [for example (Waibel, 1989; Le Cun et al., 1990a)]. As applications become more complex, the networks will presumably become even larger and more structured.
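A hedged sketch of the pruning idea: rank each weight by the saliency 0.5 * H_kk * w_k**2 (diagonal Hessian times squared weight) and delete the least salient ones. To stay self-contained, the example below uses a linear least-squares "network" whose Hessian is exactly X.T @ X; the paper itself applies the same ranking to multilayer nets with a back-propagated diagonal Hessian estimate, and the data and pruning fraction here are made up for illustration.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
true_w = np.zeros(10)
true_w[[0, 3, 7]] = [2.0, -1.5, 0.8]          # only a few weights matter
y = X @ true_w + 0.1 * rng.normal(size=200)

w = np.linalg.lstsq(X, y, rcond=None)[0]       # "trained" weights
H_diag = np.sum(X * X, axis=0)                 # diagonal of the Hessian X.T @ X
saliency = 0.5 * H_diag * w ** 2               # second-derivative saliency per weight

keep = saliency >= np.sort(saliency)[len(w) // 2]   # prune the least salient half
w_pruned = np.where(keep, w, 0.0)
print("pruned weights:", np.flatnonzero(~keep))
print("loss before/after pruning:",
      np.mean((y - X @ w) ** 2), np.mean((y - X @ w_pruned) ** 2))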
Effects of Firing Synchrony on Signal Propagation in Layered Networks
Kenyon, G. T., Fetz, Eberhard E., Puff, R. D.
Spiking neurons which integrate to threshold and fire were used to study the transmission of frequency modulated (FM) signals through layered networks. Firing correlations between cells in the input layer were found to modulate the transmission of FM signals under certain dynamical conditions. A tonic level of activity was maintained by providing each cell with a source of Poisson-distributed synaptic input. When the average membrane depolarization produced by the synaptic input was sufficiently below threshold, the firing correlations between cells in the input layer could greatly amplify the signal present in subsequent layers. When the depolarization was sufficiently close to threshold, however, the firing synchrony between cells in the initial layers could no longer affect the propagation of FM signals. In this latter case, integrate-and-fire neurons could be effectively modeled by simpler analog elements governed by a linear input-output relation.
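The following minimal sketch (parameters are assumptions, not the paper's) simulates one layer of integrate-and-fire cells that receive Poisson background input plus a sinusoidally modulated rate signal, with the mean depolarization held somewhat below threshold:

import numpy as np

rng = np.random.default_rng(2)
dt, T = 1e-3, 2.0                      # 1 ms steps, 2 s of simulation
steps = int(T / dt)
n_cells, tau, v_th = 50, 0.02, 1.0     # 20 ms membrane time constant, unit threshold

t = np.arange(steps) * dt
bg_rate = 800.0                                        # background Poisson rate (Hz)
signal_rate = 200.0 * (1 + np.sin(2 * np.pi * 4 * t))  # modulated drive (FM-style)

v = np.zeros(n_cells)
spikes = np.zeros((steps, n_cells), dtype=bool)
w_syn = 0.045                          # depolarization per input spike (mean drive below threshold)
for i in range(steps):
    n_in = rng.poisson((bg_rate + signal_rate[i]) * dt, size=n_cells)
    v += dt / tau * (-v) + w_syn * n_in               # leaky integration of input spikes
    fired = v >= v_th
    spikes[i] = fired
    v[fired] = 0.0                                    # reset after a spike

print("mean firing rate (Hz):", spikes.sum() / (n_cells * T))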
Computer Simulation of Oscillatory Behavior in Cerebral Cortical Networks
Wilson, Matthew A., Bower, James M.
It has been known for many years that specific regions of the working cerebral cortex display periodic variations in correlated cellular activity. While the olfactory system has been the focus of much of this work, similar behavior has recently been observed in primary visual cortex. We have developed models of both the olfactory and visual cortex which replicate the observed oscillatory properties of these networks. Using these models we have examined the dependence of oscillatory behavior on single cell properties and network architectures. We discuss the idea that the oscillatory events recorded from cerebral cortex may be intrinsic to the architecture of cerebral cortex as a whole, and that these rhythmic patterns may be important in coordinating neuronal activity during sensory processing.
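As a loose stand-in for the detailed simulations described above (a single excitatory-inhibitory rate pair rather than the authors' cortical network models; all constants are illustrative assumptions), the sketch below shows how mutual excitatory-inhibitory feedback alone can produce sustained network oscillations:

import numpy as np

def f(x):
    return 1.0 / (1.0 + np.exp(-x))   # sigmoidal firing-rate function

dt, steps = 1e-3, 3000
tau_e, tau_i = 0.010, 0.020           # inhibition slower than excitation
w_ee, w_ei, w_ie, p, q = 10.0, 10.0, 10.0, 0.0, -5.0   # assumed couplings and drives

e, inh = 0.6, 0.5                     # start slightly off the fixed point
trace = []
for _ in range(steps):
    de = (-e + f(w_ee * e - w_ei * inh + p)) / tau_e
    di = (-inh + f(w_ie * e + q)) / tau_i
    e, inh = e + dt * de, inh + dt * di
    trace.append(e)

# The excitatory rate does not settle to a constant value but keeps cycling.
late = trace[-1000:]
print("excitatory rate swings between", round(min(late), 3), "and", round(max(late), 3))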
Real-Time Computer Vision and Robotics Using Analog VLSI Circuits
Koch, Christof, Bair, Wyeth, Harris, John G., Horiuchi, Timothy K., Hsu, Andrew, Luo, Jin
The long-term goal of our laboratory is the development of analog resistive network-based VLSI implementations of early and intermediate vision algorithms. We demonstrate an experimental circuit for smoothing and segmenting noisy and sparse depth data using the resistive fuse and a 1-D edge-detection circuit for computing zero-crossings using two resistive grids with different space constants. To demonstrate the robustness of our algorithms and of the fabricated analog CMOS VLSI chips, we are mounting these circuits onto small mobile vehicles operating in a real-time, laboratory environment.
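A software stand-in for the 1-D edge-detection scheme, not the chip itself: a resistive grid's steady-state response falls off roughly exponentially with its space constant, so the sketch below smooths a noisy depth profile with two exponential kernels of different space constants and marks zero crossings of their difference (kernel widths and data are assumptions):

import numpy as np

def grid_smooth(signal, space_const):
    """Approximate resistive-grid smoothing by exponential-kernel convolution."""
    x = np.arange(-5 * space_const, 5 * space_const + 1)
    kernel = np.exp(-np.abs(x) / space_const)
    kernel /= kernel.sum()
    return np.convolve(signal, kernel, mode="same")

rng = np.random.default_rng(3)
depth = np.concatenate([np.full(50, 1.0), np.full(50, 3.0), np.full(50, 2.0)])
noisy = depth + 0.1 * rng.normal(size=depth.size)

diff = grid_smooth(noisy, 2.0) - grid_smooth(noisy, 8.0)   # two space constants
edges = np.flatnonzero(np.diff(np.sign(diff)) != 0)        # zero crossings of the difference
print("candidate edge locations:", edges)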
Neural Network Analysis of Distributed Representations of Dynamical Sensory-Motor Transformations in the Leech
Lockery, Shawn R., Fang, Yan, Sejnowski, Terrence J.
Interneurons in leech ganglia receive multiple sensory inputs and make synaptic contacts with many motor neurons. These "hidden" units coordinate several different behaviors. We used physiological and anatomical constraints to construct a model of the local bending reflex. Dynamical networks were trained on experimentally derived input-output patterns using recurrent back-propagation. Units in the model were modified to include electrical synapses and multiple synaptic time constants.
Associative Memory in a Simple Model of Oscillating Cortex
A generic model of oscillating cortex, which assumes "minimal" coupling justified by known anatomy, is shown to function as an associative memory, using previously developed theory. The network has explicit excitatory neurons with local inhibitory interneuron feedback that forms a set of nonlinear oscillators coupled only by long-range excitatory connections. Using a local Hebb-like learning rule for primary and higher order synapses at the ends of the long-range connections, the system learns to store the kinds of oscillation amplitude patterns observed in olfactory and visual cortex. This rule is derived from a more general "projection algorithm" for recurrent analog networks that analytically guarantees content addressable memory storage of continuous periodic sequences - capacity: N/2 Fourier components for an N node network - with no "spurious" attractors. 1 Introduction This is a sketch of recent results stemming from work which is discussed completely in [1, 2, 3]. Patterns of 40 to 80 Hz oscillation have been observed in the large scale activity of olfactory cortex [4] and visual neocortex [5], and shown to predict the olfactory and visual pattern recognition responses of a trained animal.
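A drastically simplified sketch of the storage idea, not the projection algorithm of the paper: oscillation amplitude patterns are reduced to high/low values and stored in the long-range couplings with a Hebbian outer-product rule, after which the network relaxes a corrupted pattern back to the nearest stored one (sizes and corruption level are assumptions; in the paper each unit is an excitatory-inhibitory oscillator with continuous dynamics):

import numpy as np

rng = np.random.default_rng(4)
n, n_patterns = 64, 4
patterns = rng.choice([-1.0, 1.0], size=(n_patterns, n))   # high/low amplitude patterns

W = (patterns.T @ patterns) / n               # Hebbian outer-product couplings
np.fill_diagonal(W, 0.0)

probe = patterns[0].copy()
flip = rng.choice(n, size=12, replace=False)  # corrupt 12 of the 64 amplitudes
probe[flip] *= -1

state = probe
for _ in range(10):
    state = np.sign(W @ state)                # synchronous relaxation toward a stored pattern
    state[state == 0] = 1.0

print("overlap with stored pattern:", float(state @ patterns[0]) / n)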
Non-Boltzmann Dynamics in Networks of Spiking Neurons
Crair, Michael C., Bialek, William
We study networks of spiking neurons in which spikes are fired as a Poisson process. The state of a cell is determined by the instantaneous firing rate, and in the limit of high firing rates our model reduces to that studied by Hopfield. We find that the inclusion of spiking results in several new features, such as a noise-induced asymmetry between "on" and "off" states of the cells and probability currents which destroy the usual description of network dynamics in terms of energy surfaces. Taking account of spikes also allows us to calibrate network parameters such as "synaptic weights" against experiments on real synapses. Realistic forms of the postsynaptic response alter the network dynamics, which suggests a novel dynamical learning mechanism.
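A hedged sketch of the basic setup with assumed parameters: each cell emits Poisson spikes at a rate set by its synaptic input, the input is a weighted sum of low-pass-filtered spike trains, and the couplings store a single pattern with a Hebbian rule; as the maximum firing rate grows, the filtered activity approaches the smooth rates of an analog Hopfield-style network:

import numpy as np

rng = np.random.default_rng(5)
n, dt, tau, r_max, beta = 100, 1e-3, 0.05, 400.0, 4.0   # assumed sizes and rates

xi = rng.choice([-1.0, 1.0], size=n)           # stored pattern
W = np.outer(xi, xi) / n                       # Hebbian couplings
np.fill_diagonal(W, 0.0)

probe = xi.copy()
probe[rng.choice(n, 25, replace=False)] *= -1  # corrupt a quarter of the cells
s = r_max * tau * (probe + 1) / 2              # filtered spike activity, init from the probe

for _ in range(2000):                          # 2 s of simulated time
    m = 2.0 * s / (r_max * tau) - 1.0          # activity recoded to [-1, 1]
    rates = r_max / (1.0 + np.exp(-beta * (W @ m)))   # input-dependent Poisson rates
    spikes = rng.poisson(rates * dt)           # Poisson spike counts this time step
    s += spikes - s * dt / tau                 # leaky integration of the spike trains

m = 2.0 * s / (r_max * tau) - 1.0
print("overlap with stored pattern:", float(np.sign(m) @ xi) / n)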