 Opper, Manfred


Variational inference for Markov jump processes

Neural Information Processing Systems

Markov jump processes play an important role in a large number of application domains. However, realistic systems are analytically intractable and they have traditionally been analysed using simulation based techniques, which do not provide a framework for statistical inference. We propose a mean field approximation to perform posterior inference and parameter estimation. The approximation allows a practical solution to the inference problem, while still retaining a good degree of accuracy. We illustrate our approach on two biologically motivated systems.


Variational Inference for Diffusion Processes

Neural Information Processing Systems

Diffusion processes are a family of continuous-time continuous-state stochastic processes that are in general only partially observed. The joint estimation of the forcing parameters and the system noise (volatility) in these dynamical systems is a crucial, but non-trivial task, especially when the system is nonlinear and multi-modal. We propose a variational treatment of diffusion processes, which allows us to estimate these parameters by simple gradient techniques and which is computationally less demanding than most MCMC approaches. Furthermore, our parameter inference scheme does not break down when the time step gets smaller, unlike most current approaches. Finally, we show how a cheap estimate of the posterior over the parameters can be constructed based on the variational free energy.
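As an illustrative sketch only (not the paper's variational scheme), the kind of nonlinear, multi-modal diffusion the abstract refers to can be simulated by Euler–Maruyama discretisation; the double-well drift and all parameter values below are assumptions chosen for illustration.

```python
import numpy as np

def euler_maruyama(drift, sigma, x0, dt, n_steps, rng):
    """Simulate one path of dX = drift(X) dt + sigma dW by Euler-Maruyama."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        x[t + 1] = x[t] + drift(x[t]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

# Double-well drift 4x(1 - x^2): a nonlinear, bimodal system of the kind
# discussed in the abstract. Parameter values are illustrative only.
rng = np.random.default_rng(0)
path = euler_maruyama(lambda x: 4 * x * (1 - x**2), sigma=0.5, x0=0.0,
                      dt=0.01, n_steps=5000, rng=rng)
```

Note that the abstract's point is precisely that naive discretisation-based parameter inference degrades as `dt` shrinks, which the variational treatment avoids.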


An Approximate Inference Approach for the PCA Reconstruction Error

Neural Information Processing Systems

The problem of computing a resample estimate for the reconstruction error in PCA is reformulated as an inference problem with the help of the replica method. Using the expectation consistent (EC) approximation, the intractable inference problem can be solved efficiently using only two variational parameters. A perturbative correction to the result is computed and an alternative simplified derivation is also presented.
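For orientation, the resample estimate being approximated can be written down directly as a brute-force bootstrap (this is the baseline the analytical reformulation replaces, not the paper's method); the data, helper names, and subspace dimension below are assumptions for illustration.

```python
import numpy as np

def pca_reconstruction_error(train, test, k):
    """Mean squared reconstruction error of test points under the
    top-k principal subspace fitted to the training points."""
    mu = train.mean(axis=0)
    _, _, vt = np.linalg.svd(train - mu, full_matrices=False)
    proj = vt[:k].T @ vt[:k]               # projector onto top-k subspace
    resid = (test - mu) - (test - mu) @ proj
    return np.mean(np.sum(resid**2, axis=1))

def bootstrap_error(data, k, n_boot, rng):
    """Brute-force resample estimate: refit PCA on each bootstrap
    sample, score the out-of-bag points, and average."""
    n = len(data)
    errs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        oob = np.setdiff1d(np.arange(n), idx)
        if oob.size == 0:
            continue
        errs.append(pca_reconstruction_error(data[idx], data[oob], k))
    return float(np.mean(errs))

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5)) @ np.diag([3.0, 2.0, 1.0, 0.3, 0.1])
err = bootstrap_error(X, k=2, n_boot=50, rng=rng)
```

The EC approach sidesteps this resampling loop entirely, reducing the computation to two variational parameters.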


Expectation Consistent Free Energies for Approximate Inference

Neural Information Processing Systems

We propose a novel framework for deriving approximations for intractable probabilistic models. This framework is based on a free energy (negative log marginal likelihood) and can be seen as a generalization of adaptive TAP [1, 2, 3] and expectation propagation (EP) [4, 5]. The free energy is constructed from two approximating distributions which encode different aspects of the intractable model, such as single-node constraints and couplings, and are by construction consistent on a chosen set of moments. We test the framework on a difficult benchmark problem with binary variables on fully connected graphs and 2D grid graphs. We find good performance using sets of moments which either specify factorized nodes or a spanning tree on the nodes (structured approximation). Surprisingly, the Bethe approximation gives very inferior results even on grids.


Approximate Analytical Bootstrap Averages for Support Vector Classifiers

Neural Information Processing Systems

We compute approximate analytical bootstrap averages for support vector classification using a combination of the replica method of statistical physics and the TAP approach for approximate inference. We test our method on a few datasets and compare it with exact averages obtained by extensive Monte-Carlo sampling.


Variational Linear Response

Neural Information Processing Systems

A general linear response method for deriving improved estimates of correlations in the variational Bayes framework is presented. Three applications are given and it is discussed how to use linear response as a general principle for improving mean field approximations.



A Statistical Mechanics Approach to Approximate Analytical Bootstrap Averages

Neural Information Processing Systems

We apply the replica method of Statistical Physics combined with a variational method to the approximate analytical computation of bootstrap averages for estimating the generalization error. We demonstrate our approach on regression with Gaussian processes and compare our results with averages obtained by Monte-Carlo sampling.

