
A New Model of Spatial Representation in Multimodal Brain Areas

Neural Information Processing Systems

Most models of spatial representation in the cortex assume cells with limited receptive fields that are defined in a particular egocentric frame of reference. However, cells outside of primary sensory cortex are either gain-modulated by postural input or partially shifting. We show that solving classical spatial tasks, like sensory prediction, multi-sensory integration, sensory-motor transformation and motor control, requires more complicated intermediate representations that are not invariant in one frame of reference. We present an iterative basis function map that performs these spatial tasks optimally with gain-modulated and partially shifting units, and test it against neurophysiological and neuropsychological data. In order to perform an action directed toward an object, it is necessary to have a representation of its spatial location.
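
The basis-function idea can be made concrete with a small sketch. The Python snippet below is illustrative only: the Gaussian tuning, sigmoid gain field, and all names and parameters are assumptions, not the paper's implementation. It shows units tuned to retinal position whose responses are multiplicatively gain-modulated by eye position, so no single unit is invariant in one frame of reference, yet a head-centered quantity remains linearly decodable from the population.

```python
import numpy as np

def basis_function_layer(retinal_pos, eye_pos, centers, sigma=1.0):
    """Hypothetical gain-modulated basis function units: Gaussian tuning
    to retinal position, multiplied by a sigmoidal gain field driven by
    eye position (postural input)."""
    tuning = np.exp(-(centers - retinal_pos) ** 2 / (2 * sigma ** 2))
    gain = 1.0 / (1.0 + np.exp(-(eye_pos - centers)))  # assumed gain field
    # The product is neither purely retinal nor purely head-centered.
    return tuning * gain

centers = np.linspace(-10, 10, 21)
r = basis_function_layer(retinal_pos=2.0, eye_pos=-3.0, centers=centers)
# A head-centered estimate (roughly retinal + eye position) can be read
# out linearly from such a layer, e.g. head_pos ~= w @ r for trained w.
```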


Generalizable Singular Value Decomposition for Ill-posed Datasets

Neural Information Processing Systems

Because the training examples in an ill-posed data set do not fully span the signal space, the observed training set variances in each basis vector will be too high compared to the average variance of the test set projections onto the same basis vectors. On the basis of this understanding we introduce the Generalizable Singular Value Decomposition (GenSVD) as a means to reduce this bias by re-estimation of the singular values obtained in a conventional Singular Value Decomposition, allowing for a generalization performance increase of a subsequent statistical model. We demonstrate that the algorithm successfully corrects bias in a data set from a functional PET activation study of the human brain. 1 Ill-posed Data Sets An ill-posed data set has more dimensions in each example than there are examples. Such data sets occur in many fields of research, typically in connection with image measurements. The associated statistical problem is that of extracting structure from the observed high-dimensional vectors in the presence of noise. The statistical analysis can be done either supervised (i.e.
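
The bias in question is easy to exhibit numerically. The sketch below is a toy reconstruction, not the paper's GenSVD derivation: it compares training variances along the SVD basis vectors to the variances of held-out projections onto the same vectors, and the shrinkage rule at the end is a hypothetical stand-in that uses a test set, whereas GenSVD re-estimates singular values from the training data alone.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 100                        # ill-posed: more dimensions than examples
X_train = rng.normal(size=(n, p))
X_test = rng.normal(size=(n, p))

# Conventional SVD of the centered training set.
mu = X_train.mean(axis=0)
U, s, Vt = np.linalg.svd(X_train - mu, full_matrices=False)

# Training variance in each basis vector vs. variance of test-set
# projections onto the same vectors: the former is biased upward.
var_train = s ** 2 / n
var_test = ((X_test - mu) @ Vt.T).var(axis=0)
print(var_train[:3], var_test[:3])    # leading var_train systematically larger

# Hypothetical re-estimation: shrink each singular value so the implied
# variance matches the held-out estimate (illustration only).
s_corrected = np.sqrt(np.maximum(var_test, 0) * n)
```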


From Mixtures of Mixtures to Adaptive Transform Coding

Neural Information Processing Systems

We establish a principled framework for adaptive transform coding. Transform coders are often constructed by concatenating an ad hoc choice of transform with suboptimal bit allocation and quantizer design. Instead, we start from a probabilistic latent variable model in the form of a mixture of constrained Gaussian mixtures. From this model we derive a transform coding algorithm, which is a constrained version of the generalized Lloyd algorithm for vector quantizer design. A byproduct of our derivation is the introduction of a new transform basis, which unlike other transforms (PCA, DCT, etc.) is explicitly optimized for coding. Image compression experiments show adaptive transform coders designed with our algorithm improve compressed image signal-to-noise ratio by up to 3 dB compared to global transform coding and 0.5 to 2 dB compared to other adaptive transform coders. 1 Introduction Compression algorithms for image and video signals often use transform coding as a low-complexity alternative to vector quantization (VQ).
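
For contrast with the mixture-based coder, here is a sketch of the generic transform-coding pipeline the paper improves on: a PCA transform, per-coefficient uniform quantizers, and a fixed bit allocation. Everything here (block size, bit allocation, random data) is an illustrative assumption, not the paper's algorithm.

```python
import numpy as np

def transform_code(blocks, bits_per_coeff):
    """Toy transform coder: PCA transform, uniform quantization of each
    coefficient with a fixed bit budget, then inverse transform."""
    mean = blocks.mean(axis=0)
    U, s, Vt = np.linalg.svd(blocks - mean, full_matrices=False)
    coeffs = (blocks - mean) @ Vt.T
    recon = np.zeros_like(coeffs)
    for j, b in enumerate(bits_per_coeff):
        if b == 0:
            continue                   # coefficient discarded entirely
        step = (coeffs[:, j].max() - coeffs[:, j].min()) / (2 ** b)
        recon[:, j] = np.round(coeffs[:, j] / step) * step
    return recon @ Vt + mean

rng = np.random.default_rng(1)
blocks = rng.normal(size=(500, 16))            # e.g. flattened 4x4 blocks
rec = transform_code(blocks, bits_per_coeff=[8] * 4 + [2] * 12)
snr = 10 * np.log10((blocks ** 2).sum() / ((blocks - rec) ** 2).sum())
print(f"SNR: {snr:.1f} dB")
```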


Temporally Dependent Plasticity: An Information Theoretic Account

Neural Information Processing Systems

These spikes can be temporally weighted in many ways: from a computational point of view it is beneficial to weight spikes uniformly over time, but this may require long "memory" and is biologically improbable.
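
A toy comparison of the two weighting schemes, with hypothetical spike times and an assumed exponential time constant; the exponential trace is one biologically plausible alternative to unbounded uniform memory, not a scheme taken from the paper.

```python
import numpy as np

spike_times = np.array([2.0, 5.0, 9.0, 9.5])   # hypothetical spikes (ms ago)

# Uniform weighting: computationally ideal, but every past spike must be
# remembered with full weight, requiring unbounded "memory".
uniform = np.ones_like(spike_times).sum()

# Exponentially decaying weighting: implementable as a leaky trace with
# time constant tau, so only a single running value need be stored.
tau = 4.0
decayed = np.exp(-spike_times / tau).sum()
print(uniform, decayed)
```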


Improved Output Coding for Classification Using Continuous Relaxation

Neural Information Processing Systems

Output coding is a general method for solving multiclass problems by reducing them to multiple binary classification problems. Previous research on output coding has employed, almost solely, predefined discrete codes. We describe an algorithm that improves the performance of output codes by relaxing them to continuous codes. The relaxation procedure is cast as an optimization problem and is reminiscent of the quadratic program for support vector machines. We describe experiments with the proposed algorithm, comparing it to standard discrete output codes. The experimental results indicate that continuous relaxations of output codes often improve the generalization performance, especially for short codes.
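
A compact sketch of standard discrete output coding, which the paper's relaxation generalizes. The random code matrix, least-squares binary learners, and correlation decoding below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 5))
# Four toy classes defined by the signs of the first two features.
y = (X[:, 0] > 0).astype(int) * 2 + (X[:, 1] > 0).astype(int)

M = rng.choice([-1, 1], size=(4, 6))   # discrete code matrix (classes x bits)

# One linear least-squares "binary classifier" per code column.
W = np.linalg.lstsq(X, M[y], rcond=None)[0]
F = X @ W                              # real-valued bit predictions

# Decode to the nearest (highest-correlation) codeword. A continuous
# relaxation, as in the paper, would instead optimize the real-valued
# entries of M jointly with the decoding.
pred = np.argmax(F @ M.T, axis=1)
print("accuracy:", (pred == y).mean())
```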



The Missing Link - A Probabilistic Model of Document Content and Hypertext Connectivity

Neural Information Processing Systems

We describe a joint probabilistic model for modeling the contents and inter-connectivity of document collections such as sets of web pages or research paper archives. The model is based on a probabilistic factor decomposition and allows identifying principal topics of the collection as well as authoritative documents within those topics. Furthermore, the relationships between topics are mapped out in order to build a predictive model of link content. Among the many applications of this approach are information retrieval and search, topic identification, query disambiguation, focused web crawling, web authoring, and bibliometric analysis.
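
As a rough linear-algebra analogue (not the paper's probabilistic factor decomposition), one can jointly factor a term-count view and a link view of the same documents by stacking them; all sizes, the normalization, and the authority proxy below are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_docs, n_terms, k = 50, 200, 5
W = rng.poisson(0.2, size=(n_docs, n_terms)).astype(float)      # term counts
A = rng.binomial(1, 0.05, size=(n_docs, n_docs)).astype(float)  # citation links

# Factor content and connectivity jointly by stacking the two views,
# crudely balanced by total mass; the paper uses a probabilistic
# decomposition over both views instead.
X = np.hstack([W / max(W.sum(), 1.0), A / max(A.sum(), 1.0)])
U, s, Vt = np.linalg.svd(X, full_matrices=False)
doc_topics = U[:, :k] * s[:k]          # documents embedded in k "topics"
authority = np.abs(doc_topics).max(axis=1)  # crude per-document authority proxy
print(np.argsort(authority)[-5:])      # five most "authoritative" documents
```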



Incremental and Decremental Support Vector Machine Learning

Neural Information Processing Systems

An online recursive algorithm for training support vector machines, one vector at a time, is presented. Adiabatic increments retain the Kuhn-Tucker conditions on all previously seen training data, in a number of steps each computed analytically. The incremental procedure is reversible, and decremental "unlearning" offers an efficient method to exactly evaluate leave-one-out generalization performance.
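
To make the target quantity concrete, the sketch below computes exact leave-one-out error the naive way, by retraining once per example with scikit-learn's SVC; decremental "unlearning" as described in the paper obtains the same exact estimate analytically from a single trained machine, without any retraining. The data and hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy linearly separable labels

# Naive exact leave-one-out: retrain n times, once with each example
# held out. This is the quantity decremental unlearning evaluates
# efficiently by analytically removing one vector from a trained SVM.
errors = 0
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    clf = SVC(kernel="linear", C=1.0).fit(X[mask], y[mask])
    errors += int(clf.predict(X[i:i + 1])[0] != y[i])
print("LOO error rate:", errors / len(X))
```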