ESS-Flow: Training-free guidance of flow-based models as inference in source space

Kalaivanan, Adhithyan, Zhao, Zheng, Sjölund, Jens, Lindsten, Fredrik

arXiv.org Machine Learning

Guiding pretrained flow-based generative models for conditional generation, or to produce samples with desired target properties, enables solving diverse tasks without retraining on paired data. We present ESS-Flow, a gradient-free method that leverages the typically Gaussian prior of the source distribution in flow-based models to perform Bayesian inference directly in the source space using Elliptical Slice Sampling. ESS-Flow requires only forward passes through the generative model and observation process, with no gradient or Jacobian computations, and is applicable even when gradients are unreliable or unavailable, such as with simulation-based observations or quantization in the generation or observation process. We demonstrate its effectiveness on designing materials with desired target properties and on predicting protein structures from sparse inter-residue distance measurements.

In generative modeling, we are given data samples and aim to construct a sampler that approximates the data distribution. Diffusion models (Ho et al., 2020; Song et al., 2021) and continuous normalizing flows (Lipman et al., 2023; Liu et al., 2023; Albergo et al., 2023) achieve this by transporting samples from a simple source distribution to the data distribution.
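The core primitive the abstract describes is Elliptical Slice Sampling under a standard Gaussian prior, which needs only likelihood evaluations. A minimal sketch of one ESS update (following Murray et al., 2010) is below; in the ESS-Flow setting, `log_lik` would be an assumed user-supplied function that pushes the source-space point through the pretrained flow and scores the observation, which is not shown here.

```python
import numpy as np

def elliptical_slice_step(x, log_lik, rng):
    """One elliptical slice sampling update under a standard-normal prior.

    x: current source-space sample (NumPy array).
    log_lik: callable returning the observation log-likelihood of a sample;
             in ESS-Flow this would wrap a forward pass through the flow
             and the observation process (hypothetical signature).
    """
    nu = rng.standard_normal(x.shape)           # auxiliary prior draw
    log_y = log_lik(x) + np.log(rng.uniform())  # slice threshold
    theta = rng.uniform(0.0, 2.0 * np.pi)       # initial angle on the ellipse
    theta_min, theta_max = theta - 2.0 * np.pi, theta
    while True:
        # Point on the ellipse through x and nu; stays exactly prior-distributed.
        x_prop = x * np.cos(theta) + nu * np.sin(theta)
        if log_lik(x_prop) > log_y:
            return x_prop                       # accept: above the slice
        # Shrink the angle bracket toward theta = 0 (the current point) and retry.
        if theta < 0.0:
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)
```

The loop always terminates because the bracket shrinks toward the current point, which lies strictly above the threshold; no gradients or Jacobians are evaluated, matching the method's training-free, gradient-free claim.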