Providing a detailed response to each reviewer, we would like to address two common issues raised by the reviewers

Neural Information Processing Systems

We are very grateful to the reviewers for reading the manuscript in detail and providing helpful comments. We will add more details about the background to the appendix and clarify the notation accordingly. The ELBO (K=1) values for all methods are reported in the following table; the gain is more significant now. Yes, there are strong correlations among branch lengths.
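For reference, K here presumably indexes the multi-sample (importance-weighted) bound; under that assumption, a standard form is

$$\mathcal{L}_K = \mathbb{E}_{z_1,\ldots,z_K \sim q_\phi}\!\left[\log \frac{1}{K}\sum_{k=1}^{K} \frac{p(x, z_k)}{q_\phi(z_k)}\right],$$

which at K = 1 reduces to the standard ELBO $\mathbb{E}_{q_\phi}[\log p(x, z) - \log q_\phi(z)]$.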




Adaptive Heterogeneous Mixtures of Normalising Flows for Robust Variational Inference

Wiriyapong, Benjamin; Karakuş, Oktay; Sidorov, Kirill

arXiv.org Machine Learning

Normalising-flow variational inference (VI) can approximate complex posteriors, yet single-flow models often behave inconsistently across qualitatively different distributions. We propose Adaptive Mixture Flow Variational Inference (AMF-VI), a heterogeneous mixture of complementary flows (MAF, RealNVP, RBIG) trained in two stages: (i) sequential expert training of individual flows, and (ii) adaptive global weight estimation via likelihood-driven updates, without per-sample gating or architectural changes. Evaluated on six canonical posterior families (banana, X-shape, two-moons, rings, a bimodal mixture, and a five-mode mixture), AMF-VI achieves consistently lower negative log-likelihood than each single-flow baseline and delivers stable gains in transport metrics (Wasserstein-2) and maximum mean discrepancy (MMD), indicating improved robustness across shapes and modalities. The procedure is efficient and architecture-agnostic, incurring minimal overhead relative to standard flow training, and demonstrates that adaptive mixtures of diverse flows provide a reliable route to robust VI across diverse posterior families whilst preserving each expert's inductive bias.
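A minimal sketch of the two-stage procedure described in the abstract, assuming an EM/gradient-style likelihood update for the global weights; `GaussianExpert`, `train_amf_vi`, and the exact update rule are illustrative stand-ins rather than the authors' implementation:

```python
import numpy as np

class GaussianExpert:
    """Stand-in for one flow expert (MAF / RealNVP / RBIG in the paper):
    anything exposing fit() and log_prob() fits this interface."""
    def fit(self, x):
        self.mu = x.mean(axis=0)
        self.var = x.var(axis=0) + 1e-6
    def log_prob(self, x):
        # Diagonal-Gaussian log-density per sample, shape (n,)
        return -0.5 * (np.log(2 * np.pi * self.var)
                       + (x - self.mu) ** 2 / self.var).sum(axis=1)

def train_amf_vi(experts, x, n_rounds=200, lr=0.5):
    # Stage (i): sequential expert training -- each flow is fit on its own,
    # so every expert keeps its own inductive bias.
    for f in experts:
        f.fit(x)

    # Stage (ii): adaptive global weight estimation. One weight per expert,
    # shared across all samples (no per-sample gating), updated by how much
    # of the mixture likelihood each expert explains.
    logps = np.stack([f.log_prob(x) for f in experts])   # (K, n)
    logits = np.zeros(len(experts))
    for _ in range(n_rounds):
        w = np.exp(logits - logits.max())
        w /= w.sum()                                     # softmax weights
        joint = np.log(w)[:, None] + logps               # log w_k + log p_k(x)
        joint -= joint.max(axis=0, keepdims=True)        # numerical stability
        resp = np.exp(joint)
        resp /= resp.sum(axis=0, keepdims=True)          # responsibilities
        # Gradient of the average mixture log-likelihood w.r.t. the logits
        logits += lr * (resp.mean(axis=1) - w)
    return w

# Usage: two identical experts on a bimodal target converge to ~equal weights.
rng = np.random.default_rng(0)
x = np.vstack([rng.normal(-2, 0.5, (500, 2)), rng.normal(2, 0.5, (500, 2))])
print(train_amf_vi([GaussianExpert(), GaussianExpert()], x))
```

Because the weights are global rather than per-sample, stage (ii) adds only a vector of K scalars on top of standard flow training, which is consistent with the abstract's claim of minimal overhead.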