Features as Sufficient Statistics

Neural Information Processing Systems

An image is often represented by a set of detected features. We get an enormous compression by representing images in this way. Furthermore, we get a representation which is little affected by small amounts of noise in the image. However, features are typically chosen in an ad hoc manner.


Hierarchical Non-linear Factor Analysis and Topographic Maps

Neural Information Processing Systems

We first describe a hierarchical, generative model that can be viewed as a nonlinear generalisation of factor analysis and can be implemented in a neural network. The model performs perceptual inference in a probabilistically consistent manner by using top-down, bottom-up and lateral connections. These connections can be learned using simple rules that require only locally available information. We then show how to incorporate lateral connections into the generative model. The model extracts a sparse, distributed, hierarchical representation of depth from simplified random-dot stereograms, and the localised disparity detectors in the first hidden layer form a topographic map. When presented with image patches from natural scenes, the model develops topographically organised local feature detectors.
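
As a point of reference for the generative direction of such a model, the sketch below draws a single observation by propagating top-level factors down through non-linear layers with additive Gaussian noise. The layer sizes, the sigmoid non-linearity and the noise levels are illustrative assumptions for this sketch; the paper's actual model, its lateral connections and its learning rules are not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative layer sizes: top-level factors -> hidden layer -> 16x16 image patch.
layer_sizes = [10, 50, 256]
weights = [rng.normal(scale=0.5, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def generate(noise_std=0.1):
    # Ancestral sampling: start from Gaussian top-level factors and propagate
    # them downwards through non-linear layers, adding noise at each stage.
    h = rng.normal(size=layer_sizes[0])
    for W in weights:
        h = sigmoid(h @ W) + rng.normal(scale=noise_std, size=W.shape[1])
    return h

patch = generate()          # one generated 256-dimensional "image patch"
print(patch.shape)

Perceptual inference runs in the opposite direction, which is where the top-down, bottom-up and lateral message passing described in the abstract comes in.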


Effects of Spike Timing Underlying Binocular Integration and Rivalry in a Neural Model of Early Visual Cortex

Neural Information Processing Systems

In normal vision, the inputs from the two eyes are integrated into a single percept. When dissimilar images are presented to the two eyes, however, perceptual integration gives way to alternation between monocular inputs, a phenomenon called binocular rivalry. Although recent evidence indicates that binocular rivalry involves a modulation of neuronal responses in extrastriate cortex, the basic mechanisms responsible for differential processing of conflicting


Mapping a Manifold of Perceptual Observations

Neural Information Processing Systems

Nonlinear dimensionality reduction is formulated here as the problem of finding a Euclidean feature-space embedding of a set of observations that preserves, as closely as possible, their intrinsic metric structure: the distances between points on the observation manifold as measured along geodesic paths.
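
To make the formulation concrete, here is a minimal sketch of that idea using the standard two-step recipe: approximate geodesic distances by shortest paths through a k-nearest-neighbour graph of the observations, then apply classical MDS so that Euclidean distances in the feature space match the geodesic ones. The Swiss-roll data, the value of k and the two-dimensional target space are illustrative choices, not the paper's experimental setup.

import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)

# Toy observation manifold: a 3-D "Swiss roll" with 2-D intrinsic structure.
n = 500
t = 3 * np.pi * (1 + 2 * rng.random(n)) / 2
X = np.column_stack([t * np.cos(t), 20 * rng.random(n), t * np.sin(t)])

# Step 1: geodesic distances ~ shortest paths in a k-nearest-neighbour graph
# (this assumes the neighbourhood graph is connected).
G = kneighbors_graph(X, n_neighbors=10, mode='distance')
D = shortest_path(G, method='D', directed=False)

# Step 2: classical MDS on the matrix of geodesic distances.
J = np.eye(n) - np.ones((n, n)) / n          # centring matrix
B = -0.5 * J @ (D ** 2) @ J                  # double-centred squared distances
eigvals, eigvecs = np.linalg.eigh(B)
top = np.argsort(eigvals)[::-1][:2]          # two leading components
Y = eigvecs[:, top] * np.sqrt(np.maximum(eigvals[top], 0.0))

print(Y.shape)                               # (500, 2) Euclidean embedding

Distances between rows of Y then approximate the geodesic distances on the original manifold, which is exactly the preservation criterion stated above.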


A 1,000-Neuron System with One Million 7-bit Physical Interconnections

Neural Information Processing Systems

An asynchronous PDM (Pulse-Density-Modulating) digital neural network system has been developed in our laboratory. It consists of one thousand neurons that are physically interconnected via one million 7-bit synapses. It can solve one thousand simultaneous nonlinear first-order differential equations in a fully parallel and continuous fashion. The performance of this system was measured on a winner-take-all network with one thousand neurons. Although the magnitudes of the input and the network parameters were identical for each competing neuron, one of them won in 6 milliseconds.
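
For readers without access to the hardware, the sketch below simulates the same kind of winner-take-all benchmark in software: a small network of coupled nonlinear first-order differential equations, integrated with Euler steps, in which identical inputs plus minute initial differences leave a single neuron active. The network size, the rectified-linear units and the gain parameters are illustrative assumptions and say nothing about the PDM circuit itself.

import numpy as np

rng = np.random.default_rng(0)

n = 50                                   # (the hardware runs 1,000 neurons)
alpha, beta = 0.5, 1.0                   # self-excitation, mutual inhibition
tau, dt, steps = 10e-3, 0.1e-3, 20000    # time constant, Euler step, 2 s simulated

u = 1e-3 * rng.normal(size=n)            # near-identical initial states
I = np.ones(n)                           # identical external input to each neuron

for _ in range(steps):
    y = np.maximum(u, 0.0)               # rectified non-linear output
    du = -u + I + alpha * y - beta * (y.sum() - y)   # inhibition from all others
    u = u + dt * du / tau

y = np.maximum(u, 0.0)
print("winner:", int(np.argmax(y)), "| neurons still active:", int((y > 0).sum()))

After the transient, only the neuron with the marginally largest initial state remains active; all the others are pushed below threshold by the shared inhibition.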


Blind Separation of Radio Signals in Fading Channels

Neural Information Processing Systems

We apply information maximization / maximum likelihood blind source separation [2, 6] to complex-valued signals mixed with complex-valued nonstationary matrices. This case arises in radio communications with baseband signals. We incorporate known source signal distributions in the adaptation, thus making the algorithms less "blind". This results in a drastic reduction in the amount of data needed for successful convergence. Adaptation to rapidly changing signal mixing conditions, such as fading in mobile communications, now becomes feasible, as demonstrated by simulations.

1 Introduction

In SDMA (spatial division multiple access) the purpose is to separate the radio signals of interfering users (either intentional or accidental) from each other on the basis of the spatial characteristics of the signals, using smart antennas, array processing, and beamforming [5, 8].
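
For orientation, the sketch below shows the simpler real-valued, stationary form of the information-maximization / maximum-likelihood separation rule that such work builds on: a natural-gradient update with a fixed tanh score function and Laplacian stand-in sources. The complex-valued signals, fading (nonstationary) mixing and known source constellations that make the paper's setting harder are deliberately not modelled here.

import numpy as np

rng = np.random.default_rng(0)

# Two super-Gaussian stand-in sources, mixed by an unknown (here fixed) matrix.
n, T = 2, 5000
S = rng.laplace(size=(n, T))
A = rng.normal(size=(n, n))
X = A @ S

W = np.eye(n)                            # unmixing matrix to be adapted
lr = 1e-3
for epoch in range(100):
    for t0 in range(0, T, 100):          # small blocks of samples
        Y = W @ X[:, t0:t0 + 100]
        G = np.tanh(Y)                   # score function for super-Gaussian sources
        # natural-gradient information-maximization / ML update
        W += lr * (np.eye(n) - G @ Y.T / Y.shape[1]) @ W

print(np.round(W @ A, 2))                # approximately a scaled permutation matrix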


An Analog VLSI Model of the Fly Elementary Motion Detector

Neural Information Processing Systems

Flies are capable of rapidly detecting and integrating visual motion information in behaviorally relevant ways. The first stage of visual motion processing in flies is a retinotopic array of functional units known as elementary motion detectors (EMDs). Several decades ago, Reichardt and colleagues developed a correlation-based model of motion detection that described the behavior of these neural circuits. We have implemented a variant of this model in a 2.0-µm analog CMOS VLSI process. The result is a low-power, continuous-time analog circuit with integrated photoreceptors that responds to motion in real time. The responses of the circuit to drifting sinusoidal gratings qualitatively resemble the temporal frequency response, spatial frequency response, and direction selectivity of motion-sensitive neurons observed in insects. In addition to its possible engineering applications, the circuit could potentially be used as a building block for constructing hardware models of higher-level insect motion integration.
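
As a software illustration of the underlying correlation model, the sketch below implements a single Reichardt-style elementary motion detector on two sampled photoreceptor signals, with a pure time delay standing in for the model's delay (low-pass) filters. The stimulus parameters, the delay length and the use of discrete sampled signals are illustrative assumptions; the chip itself is a continuous-time analog circuit.

import numpy as np

def reichardt_emd(left, right, delay):
    # Each half-detector correlates one photoreceptor signal with a delayed
    # copy of its neighbour; subtracting the two half-detector outputs gives a
    # direction-selective, opponent response.
    d_left = np.concatenate([np.zeros(delay), left[:-delay]])
    d_right = np.concatenate([np.zeros(delay), right[:-delay]])
    return d_left * right - left * d_right

# Two photoreceptors sampling a drifting sinusoidal grating at nearby locations.
fs = 1000.0                                  # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
f_temporal, phase = 5.0, 0.5                 # temporal frequency (Hz), spatial phase offset (rad)
left = np.sin(2 * np.pi * f_temporal * t)
right = np.sin(2 * np.pi * f_temporal * t - phase)   # grating drifts from left to right

out = reichardt_emd(left, right, delay=20)           # 20-sample (20 ms) delay
print("mean response, preferred direction:", out.mean())
print("mean response, opposite direction: ", reichardt_emd(right, left, delay=20).mean())

The mean response changes sign when the motion direction reverses, which is the basic direction selectivity referred to in the abstract.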


Factorizing Multivariate Function Classes

Neural Information Processing Systems

The mathematical framework for factorizing equivalence classes of multivariate functions is formulated in this paper. Independent component analysis is shown to be a special case of this decomposition.


Generalization in Decision Trees and DNF: Does Size Matter?

Neural Information Processing Systems

Recent theoretical results for pattern classification with thresholded real-valued functions (such as support vector machines, sigmoid networks, and boosting) give bounds on misclassification probability that do not depend on the size of the classifier, and hence can be considerably smaller than the bounds that follow from the VC theory. In this paper, we show that these techniques can be more widely applied, by representing other boolean functions as two-layer neural networks (thresholded convex combinations of boolean functions).
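
A small numerical illustration of that representation: a DNF formula is rewritten as a two-layer network whose output unit thresholds a convex combination of the +/-1 outputs of its terms, and every input is then classified with margin at least 1/k, where k is the number of terms. The particular four-variable DNF below is an arbitrary example chosen for this sketch.

import itertools

# Terms of a small DNF over four boolean variables:
# f(x) = (x0 AND NOT x1) OR (x2 AND x3); each term outputs +1 or -1.
terms = [
    lambda x: 1 if (x[0] and not x[1]) else -1,
    lambda x: 1 if (x[2] and x[3]) else -1,
]
k = len(terms)
theta = (1 - k) / k          # threshold midway between -1 (all terms false)
                             # and (2 - k) / k (at least one term true)

def dnf(x):
    # The target boolean function itself.
    return 1 if any(term(x) == 1 for term in terms) else -1

def two_layer(x):
    # Thresholded convex combination of the terms' outputs.
    h = sum(term(x) for term in terms) / k
    return (1 if h >= theta else -1), abs(h - theta)

margins = []
for x in itertools.product([0, 1], repeat=4):
    label, margin = two_layer(x)
    assert label == dnf(x)   # the two-layer network computes the DNF exactly
    margins.append(margin)

print("minimum margin:", min(margins), "  1/k:", 1.0 / k)

The margin-based bounds mentioned in the abstract then apply to this two-layer network, so the achievable margin rather than the size of the original formula is what enters the bound.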


Minimax and Hamiltonian Dynamics of Excitatory-Inhibitory Networks

Neural Information Processing Systems

A Lyapunov function for excitatory-inhibitory networks is constructed. The construction assumes symmetric interactions within excitatory and inhibitory populations of neurons, and antisymmetric interactions between populations. The Lyapunov function yields sufficient conditions for the global asymptotic stability of fixed points. If these conditions are violated, limit cycles may be stable. The relations of the Lyapunov function to optimization theory and classical mechanics are revealed by minimax and dissipative Hamiltonian forms of the network dynamics. The dynamics of a neural network with symmetric interactions provably converge to fixed points under very general assumptions [1, 2].
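
The sketch below sets up the interaction structure described above (symmetric weights within the excitatory and inhibitory populations, antisymmetric weights between them) and integrates additive network dynamics to a fixed point. The tanh non-linearity, the random weights and the global rescaling that makes the update a contraction are illustrative assumptions for this sketch; the contraction argument is a much cruder sufficient condition than the Lyapunov analysis in the paper.

import numpy as np

rng = np.random.default_rng(0)
n_e, n_i = 8, 4

# Symmetric interactions within each population...
A = rng.normal(size=(n_e, n_e)); A = (A + A.T) / 2
C = rng.normal(size=(n_i, n_i)); C = (C + C.T) / 2
# ...and antisymmetric interactions between populations: the I -> E block is
# minus the transpose of the E -> I block.
B = rng.normal(size=(n_i, n_e))

W = np.block([[A, -B.T],
              [B,  C]])
W *= 0.9 / np.linalg.norm(W, 2)            # rescale so the dynamics contract
b = rng.normal(scale=0.5, size=n_e + n_i)  # constant external inputs

v = rng.normal(size=n_e + n_i)             # network state
dt = 0.01
for _ in range(20000):
    v = v + dt * (-v + W @ np.tanh(v) + b)

residual = np.linalg.norm(-v + W @ np.tanh(v) + b)
print("fixed-point residual:", residual)   # ~0: the state has stopped moving

Strengthening the between-population blocks relative to the within-population blocks, so that the sufficient conditions fail, is the kind of regime in which, as the abstract notes, limit cycles may become stable.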