Collaborating Authors

 Iain Murray


Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation

Neural Information Processing Systems

Many statistical models can be simulated forwards but have intractable likelihoods. Approximate Bayesian Computation (ABC) methods are used to infer properties of these models from data. Traditionally these methods approximate the posterior over parameters by conditioning on data being inside an ε-ball around the observed data, which is only correct in the limit ε → 0. Monte Carlo methods can then draw samples from the approximate posterior to approximate predictions or error bars on parameters.
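A minimal sketch of the classic rejection-ABC baseline the abstract describes: draw parameters from the prior, run the forward simulator, and keep only draws whose simulated data fall inside an ε-ball around the observed data. The function names and the toy Gaussian example are illustrative assumptions, not code from the paper.

```python
import numpy as np

def rejection_abc(simulate, prior_sample, observed, distance, eps, n_samples):
    """Rejection ABC: keep parameter draws whose simulated data land
    within an eps-ball of the observed data."""
    accepted = []
    while len(accepted) < n_samples:
        theta = prior_sample()            # draw parameters from the prior
        x = simulate(theta)               # run the forward simulator
        if distance(x, observed) < eps:   # accept if inside the eps-ball
            accepted.append(theta)
    return np.array(accepted)

# Toy example (hypothetical): infer the mean of a Gaussian with known variance.
rng = np.random.default_rng(0)
observed = 1.3
samples = rejection_abc(
    simulate=lambda th: rng.normal(th, 1.0),
    prior_sample=lambda: rng.normal(0.0, 5.0),
    observed=observed,
    distance=lambda x, y: abs(x - y),
    eps=0.1,
    n_samples=500,
)
print(samples.mean(), samples.std())
```

Shrinking eps makes the approximation more accurate but raises the number of rejected simulations, which is the cost the paper's ε-free approach aims to avoid.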


Masked Autoregressive Flow for Density Estimation

Neural Information Processing Systems

Autoregressive models are among the best performing neural density estimators. We describe an approach for increasing the flexibility of an autoregressive model, based on modelling the random numbers that the model uses internally when generating data. By constructing a stack of autoregressive models, each modelling the random numbers of the next model in the stack, we obtain a type of normalizing flow suitable for density estimation, which we call Masked Autoregressive Flow. This type of flow is closely related to Inverse Autoregressive Flow and is a generalization of Real NVP. Masked Autoregressive Flow achieves state-of-the-art performance in a range of general-purpose density estimation tasks.
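A toy sketch of the core MAF computation under stated assumptions: each layer applies an affine autoregressive transform x_i = u_i·exp(α_i) + μ_i, where (μ_i, α_i) depend only on x_{1:i-1}, and density evaluation inverts the stack in one pass, with each layer treating its input as the "random numbers" of the layer before. The real model computes the conditioners with masked MADE networks; the fixed lower-triangular linear conditioners below are a stand-in for illustration only.

```python
import numpy as np

class ToyMAFLayer:
    """One affine autoregressive layer, MAF-style.

    x_i = u_i * exp(alpha_i) + mu_i, with (mu_i, alpha_i) functions of
    x_{1:i-1}. A strictly lower-triangular linear conditioner stands in
    for the MADE network used in the actual model (assumption).
    """

    def __init__(self, dim, rng):
        # Strictly lower-triangular weights enforce the autoregressive structure.
        self.W_mu = np.tril(rng.normal(0, 0.1, (dim, dim)), k=-1)
        self.W_alpha = np.tril(rng.normal(0, 0.1, (dim, dim)), k=-1)

    def inverse_and_logdet(self, x):
        # Density evaluation: recover the base noise u from data x in one pass.
        mu = x @ self.W_mu.T
        alpha = x @ self.W_alpha.T
        u = (x - mu) * np.exp(-alpha)
        log_det = -alpha.sum(axis=1)   # log |det du/dx| of this layer
        return u, log_det

def log_density(layers, x):
    """Stack of layers: each models the random numbers of the next."""
    log_det_total = np.zeros(x.shape[0])
    u = x
    for layer in layers:
        u, log_det = layer.inverse_and_logdet(u)
        log_det_total += log_det
    # Standard normal base density on the final noise variables.
    log_base = -0.5 * (u ** 2 + np.log(2 * np.pi)).sum(axis=1)
    return log_base + log_det_total

rng = np.random.default_rng(0)
layers = [ToyMAFLayer(dim=3, rng=rng) for _ in range(2)]
x = rng.normal(size=(5, 3))
print(log_density(layers, x))
```

Because the inverse Jacobian of each layer is triangular, its log-determinant is just the negative sum of the α_i terms, which is what makes exact density evaluation cheap for this family of flows.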