Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods
Marylou Gabrié, Grant M. Rotskoff, Eric Vanden-Eijnden
Since no data set from the target posterior distribution is available beforehand, the flow is typically trained using the reverse Kullback-Leibler (KL) divergence that only requires samples from a base distribution. This strategy may perform poorly when the posterior is complicated and hard to sample with an untrained normalizing flow. Here we explore a distinct training strategy, using the direct KL divergence as loss, in which samples from the posterior are generated by (i) assisting

Markov Chain Monte Carlo (MCMC) algorithms (Liu, 2008) are nowadays the methods of choice to sample complex posterior distributions. MCMC methods generate a sequence of configurations over which the time average of any suitable observable converges towards its ensemble average over some target distribution, here the posterior. This is achieved by proposing new samples from a proposal density that is easy to sample, then accepting or rejecting them using a criterion that guarantees that the transition kernel of the chain is in detailed balance with respect to the posterior density: a popular choice is the Metropolis-Hastings criterion.
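To make the acceptance rule concrete, the following is a minimal sketch of a random-walk Metropolis-Hastings update in NumPy. The Gaussian proposal, the step size, and the toy Gaussian log-posterior in the usage snippet are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def metropolis_hastings_step(x, log_post, proposal_std, rng):
    """One random-walk Metropolis-Hastings update.

    x            -- current state (1D numpy array)
    log_post     -- function returning the log posterior density at a point
    proposal_std -- standard deviation of the Gaussian random-walk proposal
    rng          -- numpy random Generator
    """
    # Propose from a symmetric Gaussian random walk (easy to sample).
    x_prop = x + proposal_std * rng.standard_normal(x.shape)
    # Accept with probability min(1, pi(x') / pi(x)); the symmetry of the
    # proposal cancels the Hastings correction, so this acceptance rule
    # puts the transition kernel in detailed balance with the posterior.
    log_alpha = log_post(x_prop) - log_post(x)
    if np.log(rng.uniform()) < log_alpha:
        return x_prop
    return x

# Usage: sample a 2D standard Gaussian "posterior" (toy example).
rng = np.random.default_rng(0)
log_post = lambda x: -0.5 * np.sum(x ** 2)
chain = [np.zeros(2)]
for _ in range(5000):
    chain.append(metropolis_hastings_step(chain[-1], log_post, 0.5, rng))
```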
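The two training objectives contrasted in the abstract can also be written down compactly. The sketch below, in PyTorch with a toy affine map standing in for a real normalizing flow, shows the reverse KL loss, which needs only samples drawn from the flow itself, and the direct (forward) KL loss, which needs posterior samples such as those produced by an MCMC chain as above. All class and function names here are illustrative assumptions, not the authors' implementation.

```python
import math
import torch

class AffineFlow(torch.nn.Module):
    """Toy stand-in for a normalizing flow: x = mu + exp(log_sigma) * z, z ~ N(0, I)."""
    def __init__(self, dim):
        super().__init__()
        self.mu = torch.nn.Parameter(torch.zeros(dim))
        self.log_sigma = torch.nn.Parameter(torch.zeros(dim))

    def sample(self, n):
        # Reparameterized sampling so gradients flow through the samples.
        z = torch.randn(n, self.mu.numel())
        return self.mu + torch.exp(self.log_sigma) * z

    def log_prob(self, x):
        # Log density of the pushed-forward base distribution (change of variables).
        z = (x - self.mu) * torch.exp(-self.log_sigma)
        log_base = -0.5 * (z ** 2).sum(-1) - 0.5 * self.mu.numel() * math.log(2 * math.pi)
        return log_base - self.log_sigma.sum()

def reverse_kl_loss(flow, log_post, n=256):
    """KL(q || pi) up to a constant: needs only samples drawn from the flow.
    log_post must return per-sample log posterior densities for a batch of points."""
    x = flow.sample(n)
    return (flow.log_prob(x) - log_post(x)).mean()

def forward_kl_loss(flow, posterior_samples):
    """Direct KL(pi || q) up to an additive constant: needs posterior samples,
    e.g. generated by an MCMC chain on the posterior."""
    return -flow.log_prob(posterior_samples).mean()
```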
July 16, 2021