Spiral Waves in Integrate-and-Fire Neural Networks

Neural Information Processing Systems

The formation of propagating spiral waves is studied, using computer simulations, in a randomly connected neural network composed of integrate-and-fire neurons with a recovery period and excitatory connections. Network activity is initiated by periodic stimulation at a single point. The results suggest that spiral waves can arise in such a network via a sub-critical Hopf bifurcation.
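
A minimal simulation sketch of the kind of network described here, assuming a discrete-time leaky integrate-and-fire model on a sparse random excitatory graph with an absolute recovery period and periodic stimulation of a single neuron; all parameter names and values (n, p_connect, refractory_steps, stim_period, and so on) are illustrative rather than taken from the paper:

```python
import numpy as np

# Discrete-time integrate-and-fire network sketch (illustrative parameters).
rng = np.random.default_rng(0)
n, p_connect, w = 400, 0.05, 0.6          # network size, connection density, synaptic weight
threshold, leak, refractory_steps = 1.0, 0.95, 5
W = w * (rng.random((n, n)) < p_connect)  # random excitatory connectivity

v = np.zeros(n)                           # membrane potentials
refractory = np.zeros(n, dtype=int)       # remaining recovery steps per neuron
stim_site, stim_period = 0, 20            # periodic stimulation at a single neuron

for t in range(1000):
    spikes = (v >= threshold) & (refractory == 0)
    refractory[spikes] = refractory_steps # spiking neurons enter the recovery period
    v[spikes] = 0.0
    v = leak * v + W @ spikes             # leaky integration of excitatory input
    if t % stim_period == 0:
        v[stim_site] += 2.0               # periodic point stimulation
    refractory = np.maximum(refractory - 1, 0)
    v[refractory > 0] = 0.0               # clamp potentials during recovery
```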


Biologically Plausible Local Learning Rules for the Adaptation of the Vestibulo-Ocular Reflex

Neural Information Processing Systems

The vestibulo-ocular reflex (VOR) is a compensatory eye movement that stabilizes images on the retina during head turns. Its magnitude, or gain, can be modified by visual experience during head movements. Possible learning mechanisms for this adaptation have been explored in a model of the oculomotor system based on anatomical and physiological constraints. The local correlational learning rules in our model reproduce the adaptation and behavior of the VOR under certain parameter conditions. From these conditions, predictions for the time course of adaptation at the learning sites are made. The primate oculomotor system is capable of maintaining the image of an object on the fovea even when the head and object are moving simultaneously.
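
A toy illustration of a correlational gain-adaptation rule of the general kind discussed here; the variables (head_vel, retinal_slip, target_gain) and the specific update are assumptions for illustration and do not reproduce the paper's model of the oculomotor circuitry or its learning sites:

```python
import numpy as np

# Toy correlational adaptation of VOR gain (illustrative; not the paper's exact rule).
rng = np.random.default_rng(1)
gain, target_gain = 1.0, 1.6      # current VOR gain; gain demanded by the visual conditions
lr = 0.01                         # learning rate for the correlational update

for trial in range(500):
    head_vel = rng.normal(0.0, 1.0)                      # head velocity on this trial
    eye_vel = -gain * head_vel                           # compensatory eye movement
    retinal_slip = -(target_gain * head_vel) - eye_vel   # residual image motion on the retina
    # Correlational rule: change the gain in proportion to the correlation
    # between head velocity and retinal slip on the same trial.
    gain += lr * head_vel * (-retinal_slip) / (head_vel**2 + 1e-6)

print(f"adapted gain = {gain:.2f} (target {target_gain})")
```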


An Object-Oriented Framework for the Simulation of Neural Nets

Neural Information Processing Systems

The field of software simulators for neural networks has been expanding very rapidly in recent years, but their importance is still underestimated. Simulators must provide increasing levels of assistance for the design, simulation and analysis of neural networks. With our object-oriented framework (SESAME) we intend to show that very high degrees of transparency, manageability and flexibility for complex experiments can be obtained. SESAME's basic design philosophy is inspired by the natural way in which researchers explain their computational models. Experiments are performed with networks of building blocks, which can be extended very easily. Mechanisms have been integrated to facilitate the construction and analysis of very complex architectures.
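
A rough sketch of the building-block idea in Python; the class and method names (Block, connect, step) are invented for illustration and are not SESAME's actual interface:

```python
# Sketch of a building-block style simulator API (names are illustrative, not SESAME's).
class Block:
    """A building block with named inputs/outputs that can be wired to other blocks."""
    def __init__(self, name):
        self.name, self.inputs, self.outputs = name, {}, {}

    def connect(self, out_port, other, in_port):
        other.inputs[in_port] = (self, out_port)   # record the data-flow edge

    def step(self):
        raise NotImplementedError

class DataSource(Block):
    def __init__(self, name, data):
        super().__init__(name)
        self.data, self.t = data, 0
    def step(self):
        self.outputs["pattern"] = self.data[self.t % len(self.data)]
        self.t += 1

class Perceptron(Block):
    def __init__(self, name, weights):
        super().__init__(name)
        self.w = weights
    def step(self):
        src, port = self.inputs["pattern"]
        x = src.outputs[port]
        self.outputs["activation"] = sum(wi * xi for wi, xi in zip(self.w, x))

# Wire an experiment out of blocks and run it.
source = DataSource("patterns", data=[[1.0, 0.0], [0.0, 1.0]])
net = Perceptron("net", weights=[0.5, -0.25])
source.connect("pattern", net, "pattern")
for _ in range(4):
    source.step(); net.step()
    print(net.outputs["activation"])
```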


Probability Estimation from a Database Using a Gibbs Energy Model

Neural Information Processing Systems

We present an algorithm for creating a neural network that produces accurate probability estimates as outputs. The network implements a Gibbs probability distribution model of the training database. This model is created by a new transformation relating the joint probabilities of attributes in the database to the weights (Gibbs potentials) of the distributed network model. The theory of this transformation is presented together with experimental results. One advantage of this approach is that the network weights are prescribed without iterative gradient descent. Used as a classifier, the network tied or outperformed published results on a variety of databases.
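
The paper's exact transformation is not reproduced here; one common closed-form choice, assumed in the sketch below, is to set a pairwise potential to the log ratio of the empirical joint probability to the product of marginals, so that weights are prescribed directly from database counts without gradient descent:

```python
import numpy as np

# Sketch: prescribe pairwise "Gibbs potentials" from empirical attribute statistics
# (assumed log-ratio form; the paper's actual transformation may differ).
rng = np.random.default_rng(2)
data = (rng.random((1000, 4)) < 0.5).astype(int)   # toy binary attribute database
data[:, 1] = data[:, 0]                            # make attributes 0 and 1 correlated

n_attr = data.shape[1]
eps = 1e-6
p_single = data.mean(axis=0)                       # P(x_i = 1)
W = np.zeros((n_attr, n_attr))
for i in range(n_attr):
    for j in range(i + 1, n_attr):
        p_joint = np.mean((data[:, i] == 1) & (data[:, j] == 1))   # P(x_i=1, x_j=1)
        # Log-odds of joint vs. independence: positive weight = attributes co-occur.
        W[i, j] = W[j, i] = np.log((p_joint + eps) / (p_single[i] * p_single[j] + eps))

print(np.round(W, 2))   # strong positive potential between the duplicated attributes
```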


Intersecting regions: The key to combinatorial structure in hidden unit space

Neural Information Processing Systems

These results show how combinatorial structure can be based on the spatial nature of networks, and not just on their emulation of logical structure.


A Knowledge-Based Model of Geometry Learning

Neural Information Processing Systems

We propose a model of the development of geometric reasoning in children that explicitly involves learning. The model uses a neural network that is initialized with an understanding of geometry similar to that of second-grade children. Through the presentation of a series of examples, the model is shown to develop an understanding of geometry similar to that of fifth-grade children who were trained using similar materials.


Object-Based Analog VLSI Vision Circuits

Neural Information Processing Systems

We describe two successfully working analog VLSI vision circuits that move beyond pixel-based early vision algorithms. One circuit, implementing the dynamic wires model, provides for dedicated lines of communication among groups of pixels that share a common property. The chip uses the dynamic wires model to compute the arc length of visual contours. Another circuit labels all points inside a given contour with one voltage and all other points with another voltage. Its behavior is very robust, since small breaks in contours are automatically sealed, providing for figure-ground segregation in a noisy environment. Both chips are implemented using networks of resistors and switches and represent a step towards object-level processing, since a single voltage value encodes the property of an ensemble of pixels.
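
The chips themselves are analog resistor-and-switch networks; the following is only a digital software analogue of the figure-ground labelling behaviour described above, spreading one "voltage" inward from the border and leaving the contour and its interior at another:

```python
import numpy as np
from collections import deque

# Digital analogue of the figure-ground labelling chip: pixels outside a closed
# contour get one "voltage", pixels inside get another (a software sketch only,
# not a model of the analog resistor/switch implementation).
grid = np.zeros((7, 7), dtype=int)
grid[2, 2:5] = grid[4, 2:5] = 1      # a small closed contour
grid[2:5, 2] = grid[2:5, 4] = 1

label = np.full(grid.shape, 5.0)     # start everything at the "inside" voltage
queue = deque([(0, 0)])
label[0, 0] = 0.0                    # ground the border region
while queue:                         # spread the outside voltage, blocked by the contour
    r, c = queue.popleft()
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 7 and 0 <= nc < 7 and grid[nr, nc] == 0 and label[nr, nc] == 5.0:
            label[nr, nc] = 0.0
            queue.append((nr, nc))

print(label)   # 5.0 marks the contour and its interior, 0.0 the exterior
```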


Unsupervised Discrimination of Clustered Data via Optimization of Binary Information Gain

Neural Information Processing Systems

We present the information-theoretic derivation of a learning algorithm that clusters unlabelled data with linear discriminants. In contrast to methods that try to preserve information about the input patterns, we maximize the information gained from observing the output of robust binary discriminators implemented with sigmoid nodes. We derive a local weight adaptation rule via gradient ascent in this objective, demonstrate its dynamics on some simple data sets, relate our approach to previous work and suggest directions in which it may be extended.
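
A sketch of one way such an objective can be ascended; it assumes a surrogate binary information gain of the form H(mean output) minus the mean per-pattern output entropy for a single sigmoid discriminator, which is not necessarily the paper's exact objective or update rule:

```python
import numpy as np

# Gradient ascent on an assumed binary-information-gain style objective for one
# sigmoid discriminator: H(mean output) - mean H(output).
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-2, 0.5, (100, 2)), rng.normal(+2, 0.5, (100, 2))])  # two unlabelled clusters
X = np.hstack([X, np.ones((X.shape[0], 1))])    # bias column
w = rng.normal(0, 0.1, 3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(p):
    return np.log(p / (1.0 - p))

for _ in range(200):
    y = np.clip(sigmoid(X @ w), 1e-6, 1 - 1e-6)
    y_bar = y.mean()
    # Gradient of H(y_bar) - mean_k H(y_k) with respect to w.
    grad = (X * (y * (1 - y) * (logit(y) - logit(y_bar)))[:, None]).mean(axis=0)
    w += 0.5 * grad

y = sigmoid(X @ w)
print("fraction assigned to one side:", (y > 0.5).mean())   # near 0.5 for a balanced split
```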


A Note on Learning Vector Quantization

Neural Information Processing Systems

Vector Quantization is useful for data compression. Competitive Learning, which minimizes reconstruction error, is an appropriate algorithm for vector quantization of unlabelled data. Vector quantization of labelled data for classification has a different objective, to minimize the number of misclassifications, and a different algorithm is appropriate. We show that a variant of Kohonen's LVQ2.1 algorithm can be seen as a multiclass extension of an algorithm which, in a restricted two-class case, can be proven to converge to the Bayes optimal classification boundary. We compare the performance of the LVQ2.1 algorithm to that of a modified version having a decreasing window and normalized step size, on a ten-class vowel classification problem.
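
For reference, a sketch of the standard LVQ2.1 update with a fixed window; the modified variant compared in the paper (decreasing window, normalized step size) is not reproduced here:

```python
import numpy as np

# Standard LVQ2.1 update with a fixed window.
def lvq21_step(x, x_label, prototypes, proto_labels, alpha=0.05, window=0.3):
    d = np.linalg.norm(prototypes - x, axis=1)
    i, j = np.argsort(d)[:2]                     # two nearest prototypes
    di, dj = d[i], d[j]
    in_window = min(di / dj, dj / di) > (1 - window) / (1 + window)
    labels = (proto_labels[i], proto_labels[j])
    # Update only if exactly one of the two nearest prototypes has the sample's class
    # and the sample falls inside the window around the decision boundary.
    if in_window and (x_label in labels) and labels[0] != labels[1]:
        correct, wrong = (i, j) if proto_labels[i] == x_label else (j, i)
        prototypes[correct] += alpha * (x - prototypes[correct])   # attract the right class
        prototypes[wrong]   -= alpha * (x - prototypes[wrong])     # repel the wrong class

# Toy usage: two classes, two prototypes.
rng = np.random.default_rng(4)
protos = np.array([[-0.5, 0.0], [0.5, 0.0]])
proto_labels = np.array([0, 1])
for _ in range(1000):
    label = rng.integers(2)
    x = rng.normal([-1.0, 0.0] if label == 0 else [1.0, 0.0], 0.5)
    lvq21_step(x, label, protos, proto_labels)
print(protos)
```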


Directional-Unit Boltzmann Machines

Neural Information Processing Systems

We present a general formulation for a network of stochastic directional units. This formulation is an extension of the Boltzmann machine in which the units are not binary, but take on values in a cyclic range, between 0 and 2π radians. The conditional distribution of a unit's stochastic state is a circular version of the Gaussian probability distribution, known as the von Mises distribution. This combination of a value and a certainty provides additional representational power in a unit. Many kinds of information can naturally be represented in terms of angular, or directional, variables. A circular range forms a suitable representation for explicitly directional information, such as wind direction, as well as for information where the underlying range is periodic, such as days of the week or months of the year.
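
A small sketch of how such directional units can be sampled, assuming the conditional mean direction and concentration of each unit come from a complex-valued net input; this follows the general description in the abstract rather than the paper's exact parameterisation:

```python
import numpy as np

# Sketch of a directional-unit update: each unit holds an angle in [0, 2*pi), and its
# conditional distribution is von Mises with mean and concentration taken from a
# complex-valued net input (an assumed form, not necessarily the paper's equations).
rng = np.random.default_rng(5)
n = 8
W = rng.normal(0, 0.5, (n, n)) + 1j * rng.normal(0, 0.5, (n, n))
W = (W + W.conj().T) / 2                     # symmetric (Hermitian) complex couplings
np.fill_diagonal(W, 0)
theta = rng.uniform(0, 2 * np.pi, n)         # directional states of the units

for sweep in range(100):
    for i in range(n):
        net = W[i] @ np.exp(1j * theta)      # complex net input to unit i
        mu = np.angle(net)                   # conditional mean direction
        kappa = np.abs(net)                  # conditional concentration
        theta[i] = rng.vonmises(mu, kappa) % (2 * np.pi)   # stochastic circular state

print(np.round(theta, 2))
```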