Amortized Inference of Multi-Modal Posteriors using Likelihood-Weighted Normalizing Flows
arXiv.org Artificial Intelligence
Across diverse domains, from complex systems and finance to high-energy physics and astrophysics, scientific inquiry often relies on deriving theoretical parameters from observational data [1]. At the core of this challenge lies the inverse problem: inferring the posterior distribution of theoretical parameters given a set of observables [2].

Traditional approaches to posterior estimation rely on sampling algorithms such as Markov Chain Monte Carlo (MCMC) [3, 4] and Nested Sampling (NS) [5]. In astrophysics and cosmology, implementations like emcee [6] and dynesty [7] have become standard tools. While these frameworks are statistically robust, they suffer significantly from the curse of dimensionality. In real-world scenarios, where the parameter space is high-dimensional and the likelihood function relies on computationally expensive simulators (e.g., in particle physics phenomenology [8]), convergence can take weeks or even months.

Recent advances in machine learning have introduced Normalizing Flows (NFs) as a powerful alternative for probabilistic modelling [9, 10]. By learning a bijective mapping between a simple base distribution (e.g., a Gaussian) and the complex target distribution, NFs allow for exact density estimation and efficient sampling from the target distribution [11]. Modern architectures, such as RealNVP [12] and Neural Spline Flows [13], offer enough expressivity to model highly complex distributions.
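The mechanics behind this can be sketched in a few lines. The snippet below is a minimal, illustrative RealNVP-style affine coupling layer, not the paper's implementation: simple linear maps stand in for the learned scale/shift networks, and all names (`AffineCoupling`, `log_density`) are placeholders. It shows the two properties the text highlights, an exact inverse and a cheap exact log-determinant, which together give exact density evaluation via the change-of-variables formula.

```python
import numpy as np

class AffineCoupling:
    """Toy RealNVP-style affine coupling layer (illustrative sketch only:
    fixed random linear maps stand in for the trained conditioner networks)."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.d = dim // 2
        # Toy "conditioner networks": linear maps from x1 to log-scale and shift.
        self.Ws = 0.1 * rng.standard_normal((dim - self.d, self.d))
        self.Wt = 0.1 * rng.standard_normal((dim - self.d, self.d))

    def forward(self, x):
        # Split the input; transform the second half conditioned on the first.
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s = np.tanh(x1 @ self.Ws.T)          # log-scale (bounded for stability)
        t = x1 @ self.Wt.T                   # shift
        y = np.concatenate([x1, x2 * np.exp(s) + t], axis=1)
        log_det = s.sum(axis=1)              # log|det J| is exact and cheap
        return y, log_det

    def inverse(self, y):
        # Exact inverse: x1 passes through unchanged, so s and t can be recomputed.
        y1, y2 = y[:, :self.d], y[:, self.d:]
        s = np.tanh(y1 @ self.Ws.T)
        t = y1 @ self.Wt.T
        return np.concatenate([y1, (y2 - t) * np.exp(-s)], axis=1)

def log_density(layer, x):
    # Change of variables: log p_x(x) = log N(z; 0, I) + log|det df/dx|,
    # where z = layer.forward(x) maps x toward the Gaussian base distribution.
    z, log_det = layer.forward(x)
    log_base = -0.5 * (z ** 2).sum(axis=1) - 0.5 * z.shape[1] * np.log(2 * np.pi)
    return log_base + log_det
```

In a full flow, several such layers are stacked (permuting which half passes through) and the conditioner weights are trained by maximizing `log_density` on samples; the per-layer log-determinants simply add.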
Dec-5-2025