Goto

EM Algorithms for PCA and SPCA

Neural Information Processing Systems

I present an expectation-maximization (EM) algorithm for principal component analysis (PCA). The algorithm allows a few eigenvectors and eigenvalues to be extracted from large collections of high dimensional data. It is computationally very efficient in space and time.
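
A minimal numpy sketch of an EM iteration for PCA in the spirit of this abstract (the function name, iteration count, and initialization are my own assumptions). Each step costs O(dnk) for k components and never forms the d-by-d covariance matrix, which is why the approach stays efficient in space and time for high-dimensional data.

```python
import numpy as np

def em_pca(Y, k, n_iters=100):
    """EM for PCA (sketch): Y is (d, n) with zero-mean columns; returns k directions."""
    d, n = Y.shape
    rng = np.random.default_rng(0)
    C = rng.normal(size=(d, k))                  # initial guess for the loading matrix
    for _ in range(n_iters):
        # E-step: latent coordinates of the data in the current subspace
        X = np.linalg.solve(C.T @ C, C.T @ Y)    # (k, n)
        # M-step: new loadings given those latent coordinates
        C = Y @ X.T @ np.linalg.inv(X @ X.T)     # (d, k)
    Q, _ = np.linalg.qr(C)                       # orthonormalize the converged basis
    return Q

# Usage sketch: center the data first, e.g. Y = data - data.mean(axis=1, keepdims=True)
```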


An Annealed Self-Organizing Map for Source Channel Coding

Neural Information Processing Systems

It is especially suited for speech and image data, which in many applications have to be transmitted under low-bandwidth/high-noise conditions. Following the idea of (Farvardin, 1990) and (Luttrell, 1989) of jointly optimizing the codebook and the data representation w.r.t. a given channel noise, we apply a deterministic annealing scheme (Rose, 1990; Buhmann, 1997) to the problem and develop a soft topographic vector quantization algorithm (STVQ).
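
A hedged sketch of the soft topographic vector quantization idea: assignments are softened by an inverse temperature that is annealed upward, and distortion is averaged over a channel-noise confusion matrix. The schedule, distance, and update details below are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def stvq(X, n_codes, H, betas, n_inner=20):
    """Soft topographic VQ with deterministic annealing (sketch).

    X: (n, d) data; H: (n_codes, n_codes) channel-noise matrix where H[r, s] is
    the probability that transmitted index r is received as s; betas: increasing
    inverse temperatures forming the annealing schedule.
    """
    n, d = X.shape
    W = X[np.random.choice(n, n_codes, replace=False)].copy()    # initial codebook
    for beta in betas:
        for _ in range(n_inner):
            # expected distortion of assigning x to index r, averaged over channel noise
            D = np.array([[H[r] @ np.sum((x - W) ** 2, axis=1) for r in range(n_codes)]
                          for x in X])                            # (n, n_codes)
            A = np.exp(-0.5 * beta * (D - D.min(axis=1, keepdims=True)))
            P = A / A.sum(axis=1, keepdims=True)                  # soft assignments
            # codebook update: channel-noise-weighted soft centroids
            R = P @ H                                             # (n, n_codes)
            W = (R.T @ X) / R.sum(axis=0)[:, None]
    return W
```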


Learning to Order Things

Neural Information Processing Systems

Most previous work in inductive learning has concentrated on learning to classify. However, there are many applications in which it is desirable to order rather than classify instances. An example might be a personalized email filter that gives a priority ordering to unread mail. Here we will consider the problem of learning how to construct such orderings, given feedback in the form of preference judgments, i.e., statements that one instance should be ranked ahead of another. Such orderings could be constructed based on a learned classifier or regression model, and in fact often are.
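
Since the feedback here is pairwise, one simple way to turn such preference judgments into an ordering is the generic reduction below (not necessarily the authors' method): learn a linear scoring function from difference vectors of preferred versus non-preferred instances, then sort by score.

```python
import numpy as np

def fit_ranking_weights(pairs, lr=0.1, epochs=100):
    """Sketch: learn a linear scoring function w.x from preference judgments.

    pairs: list of (x_preferred, x_other) feature-vector pairs, each stating that
    the first instance should be ranked ahead of the second.  A perceptron-style
    update on the difference vectors pushes w.x_preferred above w.x_other.
    """
    w = np.zeros(len(pairs[0][0]))
    for _ in range(epochs):
        for x_pref, x_other in pairs:
            diff = np.asarray(x_pref) - np.asarray(x_other)
            if w @ diff <= 0:            # current scores violate this preference
                w += lr * diff           # nudge the preferred item's score upward
    return w

def order(instances, w):
    """Return instances sorted by learned score, highest priority first."""
    return sorted(instances, key=lambda x: -(w @ np.asarray(x)))
```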



Reinforcement Learning for Call Admission Control and Routing in Integrated Service Networks

Neural Information Processing Systems

We provide a model of the standard watermaze task, and of a more challenging task involving novel platform locations, in which rats exhibit one-trial learning after a few days of training. The model uses hippocampal place cells to support reinforcement learning, and also, in an integrated manner, to build and use allocentric coordinates.


Analytical Study of the Interplay between Architecture and Predictability

Neural Information Processing Systems

We study model feed-forward networks as time series predictors in the stationary limit. The focus is on complex, yet non-chaotic, behavior. The main question we address is whether the asymptotic behavior is governed by the architecture, regardless of the details of the weights. We find hierarchies among classes of architectures with respect to the attractor dimension of the long-term sequence they are capable of generating; a larger number of hidden units can generate higher-dimensional attractors. In the case of a perceptron, we develop the stationary solution for general weights and show that the flow is typically one dimensional.
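
To make the setup concrete, here is a small sketch (activation function, window length, and transient length are my assumptions) that runs a perceptron predictor in closed loop, feeding its own outputs back as inputs, and keeps the stationary part of the generated sequence whose attractor is being studied.

```python
import numpy as np

def generate_sequence(weights, n_steps=10000, transient=1000, g=np.tanh):
    """Sketch: a perceptron predictor iterated in closed loop as a sequence generator.

    weights: input weights of a single-layer predictor over a delay window; the
    network's own outputs are fed back, so the asymptotic sequence reveals the
    attractor it settles on.
    """
    N = len(weights)
    window = np.random.randn(N)          # arbitrary initial condition
    seq = []
    for t in range(n_steps + transient):
        x_next = g(weights @ window)     # perceptron output = next sequence value
        window = np.concatenate(([x_next], window[:-1]))
        if t >= transient:               # discard the transient, keep the stationary part
            seq.append(x_next)
    return np.array(seq)

# Plotting seq[t+1] against seq[t] gives a view of the (typically one-dimensional) flow.
```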


Task and Spatial Frequency Effects on Face Specialization

Neural Information Processing Systems

There is strong evidence that face processing is localized in the brain. The double dissociation between prosopagnosia, a face recognition deficit occurring after brain damage, and visual object agnosia, difficulty recognizing other kinds of complex objects, indicates that face and nonface object recognition may be served by partially independent mechanisms in the brain. Is neural specialization innate or learned? We suggest that this specialization could be the result of a competitive learning mechanism that, during development, devotes neural resources to the tasks they are best at performing. Further, we suggest that the specialization arises as an interaction between task requirements and developmental constraints. In this paper, we present a feed-forward computational model of visual processing, in which two modules compete to classify input stimuli. When one module receives low spatial frequency information and the other receives high spatial frequency information, and the task is to identify the faces while simply classifying the objects, the low frequency network shows a strong specialization for faces. No other combination of tasks and inputs shows this strong specialization. We take these results as support for the idea that an innately-specified face processing module is unnecessary.
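
A mixture-of-experts-style sketch of the competition described here: two modules receive differently filtered inputs and a gate shifts weight toward whichever module classifies better. The layer sizes, gate update, and filtering interface are illustrative assumptions rather than the paper's exact model.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

class CompetingModules:
    """Two hidden-layer modules compete, via a gate, to classify stimuli (sketch).

    Module 0 sees a low-pass version of the stimulus, module 1 a high-pass version;
    the gate learns to favor the module that predicts better, so neural resources
    are allocated to the task each module is best at performing.
    """
    def __init__(self, n_in, n_hidden, n_out, rng=np.random.default_rng(0)):
        self.W = [rng.normal(0, 0.1, (n_hidden, n_in)) for _ in range(2)]
        self.V = [rng.normal(0, 0.1, (n_out, n_hidden)) for _ in range(2)]
        self.gate = np.zeros(2)

    def forward(self, x_low, x_high):
        outs = [self.V[0] @ np.tanh(self.W[0] @ x_low),
                self.V[1] @ np.tanh(self.W[1] @ x_high)]
        g = softmax(self.gate)
        return g[0] * outs[0] + g[1] * outs[1], outs, g

    def update_gate(self, outs, target, lr=0.1):
        # competitive step: shift the gate toward the module with lower error
        errs = np.array([np.sum((o - target) ** 2) for o in outs])
        self.gate -= lr * (errs - errs.mean())
```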


Features as Sufficient Statistics

Neural Information Processing Systems

An image is often represented by a set of detected features. We get an enormous compression by representing images in this way. Furthermore, we get a representation which is little affected by small amounts of noise in the image. However, features are typically chosen in an ad hoc manner.


Shared Context Probabilistic Transducers

Neural Information Processing Systems

Recently, a model for supervised learning of probabilistic transducers represented by suffix trees was introduced. However, this algorithm tends to build very large trees, requiring very large amounts of computer memory. In this paper, we propose a new, more compact transducer model in which one shares the parameters of distributions associated with contexts yielding similar conditional output distributions. We illustrate the advantages of the proposed algorithm with comparative experiments on inducing a noun phrase recognizer.
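
A rough sketch of the sharing idea only: estimate a conditional output distribution per suffix context, then let contexts with similar distributions reuse one shared parameter set. The distance measure, threshold, and data format below are assumptions for illustration, not the paper's algorithm.

```python
from collections import defaultdict

def build_context_model(pairs, max_context=3):
    """Sketch: count outputs conditioned on input suffix contexts of bounded length."""
    counts = defaultdict(lambda: defaultdict(int))
    for xs, ys in pairs:                          # aligned input/output sequences
        for t, y in enumerate(ys):
            for k in range(1, max_context + 1):
                if t - k + 1 < 0:
                    break
                ctx = tuple(xs[t - k + 1: t + 1])  # suffix of length k ending at t
                counts[ctx][y] += 1
    return counts

def share_contexts(counts, threshold=0.2):
    """Merge contexts whose conditional output distributions are close (L1 distance),
    so they point to one shared parameter set instead of separate ones."""
    def dist(c):
        total = sum(c.values())
        return {y: n / total for y, n in c.items()}
    shared, reps = {}, []
    for ctx, c in counts.items():
        p = dist(c)
        for r in reps:
            q = dist(counts[r])
            l1 = sum(abs(p.get(y, 0) - q.get(y, 0)) for y in set(p) | set(q))
            if l1 < threshold:
                shared[ctx] = r              # ctx reuses the representative's parameters
                break
        else:
            shared[ctx] = ctx
            reps.append(ctx)
    return shared
```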


Ensemble and Modular Approaches for Face Detection: A Comparison

Neural Information Processing Systems

A new learning model based on autoassociative neural networks is developed and applied to face detection. To extend the detection ability in orientation and to decrease the number of false alarms, different combinations of networks are tested: ensemble, conditional ensemble and conditional mixture of networks. The use of a conditional mixture of networks makes it possible to obtain state-of-the-art results on different benchmark face databases. The set of all possible windows is E = V ∪ N, with V ∩ N = ∅. Since collecting a representative set of non-face examples is impossible, face detection by a statistical model is a difficult task. An autoassociative network, using five layers of neurons, is able to perform a nonlinear dimensionality reduction [Kramer, 1991].
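
A sketch of how a five-layer autoassociative network (input, mapping, bottleneck, demapping, output, as in Kramer, 1991) could serve as a detector: the network is trained (training not shown) to reconstruct face windows only, and a new window is accepted as a face when its reconstruction error is small. Layer sizes and the threshold below are illustrative assumptions.

```python
import numpy as np

class AutoassociatorFaceDetector:
    """Five-layer autoassociative network used for detection via reconstruction error."""

    def __init__(self, n_in=400, n_map=50, n_bottleneck=15,
                 rng=np.random.default_rng(0)):
        sizes = [n_in, n_map, n_bottleneck, n_map, n_in]
        self.Ws = [rng.normal(0, 0.1, (sizes[i + 1], sizes[i])) for i in range(4)]
        self.bs = [np.zeros(sizes[i + 1]) for i in range(4)]

    def reconstruct(self, x):
        h = x
        for i, (W, b) in enumerate(zip(self.Ws, self.bs)):
            h = W @ h + b
            if i < 3:                    # mapping, bottleneck, demapping layers are nonlinear
                h = np.tanh(h)
        return h                         # linear output layer reconstructs the input

    def is_face(self, window, threshold=0.05):
        x = window.ravel()
        err = np.mean((self.reconstruct(x) - x) ** 2)
        return err < threshold           # non-faces are reconstructed poorly
```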