Collaborating Authors

Sisson, S. A.


Likelihood-free approximate Gibbs sampling

arXiv.org Machine Learning

Likelihood-free methods refer to procedures that perform likelihood-based statistical inference, but without direct evaluation of the likelihood function. This is attractive when the likelihood function is computationally prohibitive to evaluate due to dataset size or model complexity, or when the likelihood function is only known through a data generation process. Some classes of likelihood-free methods include pseudo-marginal methods (Beaumont 2003; Andrieu and Roberts 2009), indirect inference (Gourieroux et al. 1993) and approximate Bayesian computation (Sisson et al. 2018a). In particular, approximate Bayesian computation (ABC) methods form an approximation to the computationally intractable posterior distribution by first sampling parameter vectors from the prior and, conditional on these, generating synthetic datasets under the model. The parameter vectors are then weighted by how closely a vector of summary statistics computed from each synthetic dataset matches the corresponding summary statistics of the observed data. ABC methods have seen extensive application and development over the past 15 years.
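
To make the rejection-ABC scheme described in the abstract concrete, here is a minimal Python sketch. It is not code from the paper: the function names (abc_rejection, simulate, prior_sample, summary), the Euclidean distance on summaries, and the rule of keeping the closest 1% of draws are illustrative assumptions.

```python
import numpy as np

def abc_rejection(observed, simulate, prior_sample, summary,
                  n_draws=10_000, keep_quantile=0.01):
    """Illustrative ABC rejection sampler (sketch, not the paper's method).

    simulate(theta)  -> synthetic dataset generated under the model
    prior_sample()   -> one parameter draw from the prior
    summary(data)    -> vector of summary statistics
    """
    s_obs = summary(observed)
    thetas, distances = [], []
    for _ in range(n_draws):
        theta = prior_sample()                    # sample parameters from the prior
        s_sim = summary(simulate(theta))          # summarise a synthetic dataset
        thetas.append(theta)
        distances.append(np.linalg.norm(s_sim - s_obs))
    # retain the draws whose summaries lie closest to the observed summaries
    eps = np.quantile(distances, keep_quantile)
    keep = np.array(distances) <= eps
    return np.array(thetas)[keep]

# Toy usage: infer the mean of a normal distribution with known variance
rng = np.random.default_rng(0)
observed = rng.normal(3.0, 1.0, size=100)
posterior_draws = abc_rejection(
    observed,
    simulate=lambda th: rng.normal(th, 1.0, size=100),
    prior_sample=lambda: rng.normal(0.0, 10.0),
    summary=lambda x: np.array([x.mean(), x.std()]),
)
print(posterior_draws.mean())
```

The accepted draws approximate the posterior conditioned on the summaries matching to within the tolerance eps; shrinking eps tightens the approximation at the cost of more rejections.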


High-dimensional ABC

arXiv.org Machine Learning

This Chapter, "High-dimensional ABC", is to appear in the forthcoming Handbook of Approximate Bayesian Computation (2018). It details the main ideas and concepts behind extending ABC methods to higher dimensions, with supporting examples and illustrations.


Overview of Approximate Bayesian Computation

arXiv.org Machine Learning

This Chapter, "Overview of Approximate Bayesian Computation", is to appear as the first chapter in the forthcoming Handbook of Approximate Bayesian Computation (2018). It details the main ideas and concepts behind ABC methods with many examples and illustrations.


ABC Samplers

arXiv.org Machine Learning

This Chapter, "ABC Samplers", is to appear in the forthcoming Handbook of Approximate Bayesian Computation (2018). It details the main ideas and algorithms used to sample from the ABC approximation to the posterior distribution, including methods based on rejection/importance sampling, MCMC and sequential Monte Carlo.