Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation
Reviews: Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation
The most original part of the paper is Proposition 1, which is quite interesting. However, I have some doubts about the assumptions leading to formula (2). As explained in the appendix, the formula holds if q_theta is flexible enough that the KL divergence is zero. In a realistic example with a finite sample size, however, q_theta cannot be made arbitrarily flexible without overfitting, so (2) holds only approximately.
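The reviewer's point can be illustrated with a toy sketch of my own (hypothetical, not from the paper): fitting a Gaussian q by maximum likelihood drives the Monte Carlo estimate of KL(p || q) to roughly zero when the target p is itself Gaussian, but leaves a clearly positive residual when p is bimodal and q is too restricted.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_logpdf(x, mu, sigma):
    """Log-density of a univariate Gaussian."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

def kl_to_gaussian_mle(samples, logp):
    """Fit a Gaussian q by maximum likelihood, then Monte Carlo estimate
    KL(p || q) = E_p[log p(x) - log q(x)] using samples from p."""
    mu, sigma = samples.mean(), samples.std()
    return np.mean(logp(samples) - gauss_logpdf(samples, mu, sigma))

# Case 1: target p is itself Gaussian -> q is flexible enough, KL is near zero.
p1 = rng.normal(2.0, 1.0, size=50000)
kl1 = kl_to_gaussian_mle(p1, lambda x: gauss_logpdf(x, 2.0, 1.0))

# Case 2: target p is bimodal -> a single Gaussian q leaves residual KL.
comp = rng.integers(0, 2, size=50000)
p2 = np.where(comp == 0,
              rng.normal(-3.0, 1.0, 50000),
              rng.normal(3.0, 1.0, 50000))
logp2 = lambda x: np.log(0.5 * np.exp(gauss_logpdf(x, -3.0, 1.0))
                         + 0.5 * np.exp(gauss_logpdf(x, 3.0, 1.0)))
kl2 = kl_to_gaussian_mle(p2, logp2)

print(kl1, kl2)  # kl1 is near zero, kl2 is clearly positive
```

This mirrors the review's concern: when the family of q_theta contains the target, the KL term vanishes and the identity holds; when it does not, a residual divergence remains.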
George Papamakarios, Iain Murray
Many statistical models can be simulated forwards but have intractable likelihoods. Approximate Bayesian Computation (ABC) methods are used to infer properties of these models from data. Traditionally these methods approximate the posterior over parameters by conditioning on data being inside an ε-ball around the observed data, which is only correct in the limit ε → 0. Monte Carlo methods can then draw samples from the approximate posterior to approximate predictions or error bars on parameters. We propose a new approach to likelihood-free inference based on Bayesian conditional density estimation. Preliminary inferences based on limited simulation data are used to guide later simulations.
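The ε-ball conditioning that the abstract describes can be sketched with a minimal rejection-ABC example. The simulator here is a toy Gaussian model of my own choosing (not the paper's), used only to show the accept/reject mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=20):
    """Toy simulator (hypothetical): the summary statistic is the sample
    mean of n Gaussian draws centred on theta."""
    return rng.normal(theta, 1.0, size=n).mean()

def rejection_abc(x_obs, eps, n_draws=20000):
    """Classic rejection ABC: keep prior draws whose simulated summary
    lands inside an eps-ball around the observed summary."""
    thetas = rng.uniform(-5.0, 5.0, size=n_draws)   # draws from a flat prior
    sims = np.array([simulate(t) for t in thetas])  # forward simulations
    return thetas[np.abs(sims - x_obs) < eps]       # eps-ball condition

x_obs = 1.0
post = rejection_abc(x_obs, eps=0.1)
print(len(post), post.mean())  # accepted samples; mean should be near x_obs
```

Shrinking eps makes the accepted set a better approximation of the true posterior, but the acceptance rate collapses, which is exactly the inefficiency the paper's ε-free approach is designed to avoid.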