
Collaborating Authors

Variational Bayesian Learning


Propagation Algorithms for Variational Bayesian Learning

Neural Information Processing Systems

Variational approximations are becoming a widespread tool for Bayesian learning of graphical models. We provide some theoretical results for the variational updates in a very general family of conjugate-exponential graphical models. We show how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian learning. Applying these results to the Bayesian analysis of linear-Gaussian state-space models, we obtain a learning procedure that exploits the Kalman smoothing propagation while integrating over all model parameters. We demonstrate how this can be used to infer the hidden state dimensionality of the state-space model in a variety of synthetic problems and one real high-dimensional data set.
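The variational updates in conjugate-exponential models follow the usual variational Bayesian pattern: each factor of the approximate posterior is refreshed using expectations under the other factors, iterated to convergence. As a minimal, hedged sketch of that pattern (not the paper's state-space model), here is the textbook VB treatment of a univariate Gaussian with unknown mean and precision under Normal-Gamma priors; all prior values and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=500)  # synthetic data
N, xbar = len(x), x.mean()

# Illustrative Normal-Gamma prior: mu ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0

E_tau = 1.0  # initial guess for E[tau]
for _ in range(50):
    # Update q(mu) = N(mu_N, 1/lam_N) using the current E[tau]
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    # Update q(tau) = Gamma(a_N, b_N) using expectations under q(mu)
    a_N = a0 + (N + 1) / 2.0
    E_sq = np.sum((x - mu_N) ** 2) + N / lam_N  # E_q(mu)[sum_i (x_i - mu)^2]
    b_N = b0 + 0.5 * (E_sq + lam0 * ((mu_N - mu0) ** 2 + 1.0 / lam_N))
    E_tau = a_N / b_N
```

After convergence, `mu_N` sits close to the sample mean and `E_tau` close to the inverse sample variance; the same alternate-and-iterate structure is what belief propagation or the junction tree algorithm implements for richer graphical models.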


Handling Missing Data with Variational Bayesian Learning of ICA

Chan, Kwokleung, Lee, Te-Won, Sejnowski, Terrence J.

Neural Information Processing Systems

Missing data is common in real-world datasets and is a problem for many estimation techniques. We have developed a variational Bayesian method to perform Independent Component Analysis (ICA) on high-dimensional data containing missing entries. Missing data are handled naturally in the Bayesian framework by integrating the generative density model. Modeling the distributions of the independent sources with mixtures of Gaussians allows sources to be estimated with different kurtosis and skewness. The variational Bayesian method automatically determines the dimensionality of the data and yields an accurate density model for the observed data without overfitting problems. This allows direct probability estimation of missing values in the high-dimensional space and avoids dimension-reduction preprocessing, which is not feasible with missing data.
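The claim that missing values can be estimated directly in the high-dimensional space rests on conditioning a fitted generative density on the observed entries. As a hedged, simplified sketch of that idea (a plain multivariate Gaussian with a fixed covariance, not the authors' VB ICA model; all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy generative density: zero-mean Gaussian with (assumed known) covariance S
S = np.array([[2.0, 0.8, 0.3],
              [0.8, 1.5, 0.5],
              [0.3, 0.5, 1.0]])
x_full = rng.multivariate_normal(np.zeros(3), S)

obs, mis = [0, 2], [1]  # indices of observed / missing entries
x_o = x_full[obs]

# Conditional mean of the missing block: E[x_m | x_o] = S_mo S_oo^{-1} x_o
S_oo = S[np.ix_(obs, obs)]
S_mo = S[np.ix_(mis, obs)]
x_m_hat = S_mo @ np.linalg.solve(S_oo, x_o)

# Conditional covariance (remaining uncertainty over the missing entries)
S_mm = S[np.ix_(mis, mis)]
cond_cov = S_mm - S_mo @ np.linalg.solve(S_oo, S_mo.T)
```

Because conditioning happens in the original space, no rows need to be dropped and no dimension-reduction preprocessing is required; in the paper's setting the density over the data is the learned VB ICA model rather than a fixed Gaussian.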


