Independent Components Analysis through Product Density Estimation

Neural Information Processing Systems

We present a simple, direct approach to solving the ICA problem using density estimation and maximum likelihood. Given a candidate orthogonal frame, we model each of the coordinates using a semi-parametric density estimate based on cubic splines. Since our estimates have two continuous derivatives, we can easily run a second-order search for the frame parameters. Our method performs very favorably when compared to state-of-the-art techniques.

1 Introduction

Independent component analysis (ICA) is a popular enhancement over principal component analysis (PCA) and factor analysis. The observed vector X ∈ R^p is assumed to arise from a linear mixing of a latent random source vector S ∈ R^p,

X = AS; (1)

the components S_j, j = 1, ..., p of S are assumed to be independently distributed.
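As a toy illustration of the product-density likelihood idea, the sketch below fits a 2-D ICA model by scanning rotation angles of the whitened data and scoring each candidate orthogonal frame by the summed log-likelihood of independent per-coordinate density estimates. It substitutes a kernel density estimate for the paper's cubic-spline estimate and a grid search for the second-order search; the function name and parameters are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

def ica_ml_2d(X, n_angles=90):
    """Toy 2-D ICA by maximum likelihood over orthogonal frames.

    The data are whitened, then candidate rotation angles are scanned;
    each frame is scored by the summed log-likelihood of per-coordinate
    density estimates (a KDE here, standing in for the paper's cubic
    splines).
    """
    # Whiten: zero mean, identity covariance.
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    Z = Xc @ evecs / np.sqrt(evals)

    best_ll, best_theta = -np.inf, 0.0
    for theta in np.linspace(0.0, np.pi / 2, n_angles, endpoint=False):
        c, s = np.cos(theta), np.sin(theta)
        S = Z @ np.array([[c, -s], [s, c]]).T
        # Product density => the log-likelihood is a sum over coordinates.
        ll = sum(np.log(gaussian_kde(S[:, j])(S[:, j])).sum()
                 for j in range(2))
        if ll > best_ll:
            best_ll, best_theta = ll, theta
    c, s = np.cos(best_theta), np.sin(best_theta)
    return Z @ np.array([[c, -s], [s, c]]).T, best_theta
```

Only angles in [0, π/2) are scanned, since ICA solutions are identified only up to sign and permutation of the sources; the paper instead exploits the two continuous derivatives of the spline estimate to run a Newton-type search.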

Learning Graphical Models with Mercer Kernels

Neural Information Processing Systems

We present a class of algorithms for learning the structure of graphical models from data. The algorithms are based on a measure known as the kernel generalized variance (KGV), which essentially allows us to treat all variables on an equal footing as Gaussians in a feature space obtained from Mercer kernels. Thus we are able to learn hybrid graphs involving discrete and continuous variables of arbitrary type. We explore the computational properties of our approach, showing how to use the kernel trick to compute the relevant statistics in linear time. We illustrate our framework with experiments involving discrete and continuous data.
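For two one-dimensional variables, a KGV-style dependence score can be sketched from regularized, centered Gram matrices, as below. This is an O(n^3) illustration of the quantity -1/2 log det(I - C C^T); the linear-time computation mentioned in the abstract relies on low-rank kernel approximations not shown here, and the kernel width `sigma` and regularizer `kappa` are illustrative choices, not the paper's settings.

```python
import numpy as np

def centered_gram(x, sigma=1.0):
    """Centered RBF Gram matrix of a one-dimensional sample."""
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2.0 * sigma ** 2))
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kgv_pair(x, y, sigma=1.0, kappa=1e-2):
    """KGV-style dependence score for two 1-D variables:
    -1/2 log det(I - C C^T), where C couples the two regularized,
    centered Gram matrices.  It behaves like a mutual information:
    near zero for independent variables, larger for dependent ones.
    """
    n = len(x)
    K1, K2 = centered_gram(x, sigma), centered_gram(y, sigma)
    c = n * kappa / 2.0
    P, Q = K1 + c * np.eye(n), K2 + c * np.eye(n)
    # C = Q^{-1} K2 K1 P^{-1}; regularization keeps ||C|| < 1.
    C = np.linalg.solve(Q, K2) @ np.linalg.solve(P, K1).T
    _, logdet = np.linalg.slogdet(np.eye(n) - C @ C.T)
    return -0.5 * logdet
```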


Gaussianization

Neural Information Processing Systems

High-dimensional data modeling is difficult mainly because of the so-called "curse of dimensionality". We propose a technique called "Gaussianization" for high-dimensional density estimation, which alleviates the curse of dimensionality by exploiting the independence structures in the data. Gaussianization is motivated by recent developments in the statistics literature: projection pursuit, independent component analysis and Gaussian mixture models with semi-tied covariances. We propose an iterative Gaussianization procedure which converges weakly: at each iteration, the data is first transformed to the least dependent coordinates and then each coordinate is marginally Gaussianized by univariate techniques.
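The iteration described above can be sketched as follows. This simplified version uses a PCA rotation as a stand-in for the least-dependent (ICA) coordinate transform, and a rank-based empirical CDF mapped through the standard normal quantile function for the univariate Gaussianization; both substitutions simplify the paper's procedure.

```python
import numpy as np
from scipy.stats import norm

def marginal_gaussianize(X):
    """Map each coordinate to N(0,1): x -> Phi^{-1}(rank(x)/(n+1))."""
    n = X.shape[0]
    ranks = X.argsort(axis=0).argsort(axis=0) + 1
    return norm.ppf(ranks / (n + 1.0))

def gaussianize(X, n_iter=5):
    """Iterative Gaussianization sketch: at each iteration, rotate to
    decorrelated (PCA) coordinates -- a stand-in for the paper's least
    dependent coordinates -- then Gaussianize each marginal."""
    Z = np.asarray(X, dtype=float)
    for _ in range(n_iter):
        Zc = Z - Z.mean(axis=0)
        _, _, Vt = np.linalg.svd(Zc, full_matrices=False)
        Z = marginal_gaussianize(Zc @ Vt.T)
    return Z
```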

ICA based on a Smooth Estimation of the Differential Entropy

Neural Information Processing Systems

In this paper we introduce the MeanNN approach for estimating key information-theoretic measures such as differential entropy, mutual information and divergence. As opposed to other nonparametric approaches, the MeanNN estimator results in smooth, differentiable functions of the data samples with a clear geometrical interpretation. We then apply the proposed estimators to the ICA problem and obtain a smooth expression for the mutual information that can be analytically optimized by gradient-descent methods. The improved performance of the proposed ICA algorithm is demonstrated on several test examples in comparison with state-of-the-art techniques.
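A minimal sketch of the distance-dependent core of such a smooth entropy estimate: averaging the log pairwise distances gives a quantity that is differentiable in the sample locations, which is what enables gradient-based ICA. Data-independent additive constants are dropped here, so this is an illustration of the idea rather than the paper's exact estimator.

```python
import numpy as np

def mean_nn_entropy(X, eps=1e-12):
    """Distance-dependent core of a MeanNN-style differential-entropy
    estimate for a sample X of shape (n, d):

        d * mean over pairs (i < j) of log ||x_i - x_j||

    Constants independent of the data are omitted, which suffices for
    comparing or optimizing entropies.  The expression is smooth in
    the samples, so it can be differentiated for gradient methods.
    """
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    iu = np.triu_indices(n, k=1)
    return d * (0.5 * np.log(d2[iu] + eps)).mean()
```

Scaling a 1-D sample by a factor c shifts every log distance by log c, so the estimate increases with the spread of the data, as a differential entropy should.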

Informative Discriminant Analysis

AAAI Conferences

The components of linear discriminant analysis (LDA) have traditionally been used for classification: they construct class borders that are Bayes-optimal for that purpose (in the two-class case), assuming the classes are normally distributed and share the same covariance matrix. LDA has also been used for visualizing multivariate data by projecting them to planes spanned by the main discriminant direction pairs. An example will be presented on visualizing relationships in the behavior of genes of different functional classes in a set of knockout mutation experiments. Another example could be to collect financial indicators from a set of companies, and visualize relationships among companies that will go bankrupt after 1 year, 2 years, etc.
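A minimal sketch of the discriminant directions underlying such visualizations, assuming classical Fisher LDA with within-class and between-class scatter matrices (the abstract's method extends this; the sketch shows only the standard baseline). Projecting the data onto the leading pair of directions gives the visualization plane.

```python
import numpy as np

def lda_directions(X, y, n_dirs=2):
    """Fisher discriminant directions: leading eigenvectors of
    Sw^{-1} Sb, where Sw/Sb are the within/between-class scatter
    matrices.  Projecting X onto the top pair gives a plane for
    visualizing class structure."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_dirs]]
```

With k classes there are at most k - 1 informative directions, so for three or more classes the leading pair spans a genuine discriminative plane; a scatter plot of `X @ lda_directions(X, y)` colored by class is the visualization described above.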