SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows
Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions. However, they both impose constraints on the models: Normalizing flows use bijective transformations to model densities whereas VAEs learn stochastic transformations that are non-invertible and thus typically do not provide tractable estimates of the marginal likelihood. In this paper, we introduce SurVAE Flows: A modular framework of composable transformations that encompasses VAEs and normalizing flows. SurVAE Flows bridge the gap between normalizing flows and VAEs with surjective transformations, wherein the transformations are deterministic in one direction (thereby allowing exact likelihood computation) and stochastic in the reverse direction (hence providing a lower bound on the corresponding likelihood). We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows. Finally, we introduce common operations such as the max value, the absolute value, sorting and stochastic permutation as composable layers in SurVAE Flows.
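To make the idea concrete, here is a minimal sketch (not the authors' implementation) of one such surjective layer: an absolute-value surjection paired with a half-normal base density. The inference direction z = |x| is deterministic, while the generative direction x = s·z samples a sign s uniformly; all function names below are hypothetical.

```python
import math
import random

# Hypothetical absolute-value surjection layer (illustrative sketch only).
# Inference direction is deterministic; generation is stochastic.

def abs_inference(x):
    """Deterministic inference direction: z = |x|."""
    return abs(x)

def abs_generate(z, rng=random):
    """Stochastic generative direction: x = s * z with s ~ Uniform{-1, +1}."""
    s = rng.choice([-1.0, 1.0])
    return s * z

def half_normal_logpdf(z):
    """Log-density of the base distribution on z >= 0 (folded standard normal)."""
    return math.log(2.0) - 0.5 * math.log(2.0 * math.pi) - 0.5 * z * z

def log_prob(x):
    """Log-likelihood through the surjection:
    log p(x) = log p(z = |x|) + log p(s = sign(x) | z), with p(s|z) = 1/2 here."""
    return half_normal_logpdf(abs(x)) + math.log(0.5)
```

Because the inference direction is deterministic, the likelihood is computed exactly here; composing the uniform-sign choice with the half-normal base recovers a standard normal density on x, i.e. `log_prob(x)` equals the standard normal log-pdf at x.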
Review for NeurIPS paper: SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows
The authors introduce a novel conceptual framework that unifies normalizing flows and VAEs and includes many other existing models and modules such as augmented flows and variational dequantization. The framework involves thinking about generative models in terms of the type of mapping they use to go from the observation to the latents and vice versa. This turns out to be fruitful because it immediately makes apparent the gap between flows, which use deterministic mappings in both directions, and VAEs, which use stochastic mappings. The authors fill this gap by introducing surjective models/components which are deterministic in one of the directions and stochastic in the other, and proceed to derive several instances of these, e.g. surjections based on the absolute value, the maximum, and sorting. The reviewers found the paper insightful and praised the quality of its exposition.