surjection



SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

Neural Information Processing Systems

Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions. However, they both impose constraints on the models: Normalizing flows use bijective transformations to model densities whereas VAEs learn stochastic transformations that are non-invertible and thus typically do not provide tractable estimates of the marginal likelihood. In this paper, we introduce SurVAE Flows: A modular framework of composable transformations that encompasses VAEs and normalizing flows. SurVAE Flows bridge the gap between normalizing flows and VAEs with surjective transformations, wherein the transformations are deterministic in one direction -- thereby allowing exact likelihood computation, and stochastic in the reverse direction -- hence providing a lower bound on the corresponding likelihood. We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows. Finally, we introduce common operations such as the max value, the absolute value, sorting and stochastic permutation as composable layers in SurVAE Flows.
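To make the idea concrete, a surjective layer can be sketched as an object with a deterministic forward pass, a stochastic inverse, and a log-likelihood contribution. The following minimal NumPy sketch of the absolute-value surjection is illustrative only (the class name, interface, and the fixed fair-coin sign distribution are assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

class AbsSurjection:
    """Sketch of an absolute-value surjection: z = |x| is deterministic,
    while the inverse must resample the sign that |.| discards."""

    def forward(self, x):
        # Deterministic direction: exact, but many-to-one.
        z = np.abs(x)
        # Likelihood contribution: log p(sign | z), here a fixed fair
        # coin (an assumption; in general it could be learned).
        log_contrib = np.full_like(x, np.log(0.5))
        return z, log_contrib

    def inverse(self, z):
        # Stochastic direction: sample the lost sign uniformly.
        sign = rng.choice([-1.0, 1.0], size=z.shape)
        return sign * z

layer = AbsSurjection()
x = np.array([-1.5, 2.0, -0.3])
z, log_contrib = layer.forward(x)   # z = [1.5, 2.0, 0.3]
x_recon = layer.inverse(z)          # random signs, but |x_recon| == z
```

Because such layers expose the same (transformed variable, log-contribution) interface as a bijective flow layer, they compose freely with ordinary flow transformations.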





Review for NeurIPS paper: SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

Neural Information Processing Systems

The authors introduce a novel conceptual framework that unifies normalizing flows and VAEs and includes many other existing models and modules, such as augmented flows and variational dequantization. The framework involves thinking about generative models in terms of the type of mapping they use to go from the observation to the latents and vice versa. This turns out to be fruitful because it immediately makes apparent the gap between flows, which use deterministic mappings in both directions, and VAEs, which use stochastic mappings. The authors fill this gap by introducing surjective models/components which are deterministic in one of the directions and stochastic in the other, and proceed to derive several instances of these, such as the absolute value, the maximum value, sorting, and stochastic permutation. The reviewers found the paper insightful and praised the quality of the exposition.
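One of the derived instances, the sorting surjection, can be sketched in a few lines: sorting is deterministic and many-to-one in the forward direction, and the inverse restores an ordering at random. This NumPy sketch is an assumption-laden illustration (class name, interface, and the uniform distribution over orderings are not taken from the paper):

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(0)

class SortSurjection:
    """Sorting as a surjective layer (sketch): the forward pass sorts
    deterministically; the stochastic inverse shuffles the order back."""

    def forward(self, x):
        z = np.sort(x, axis=-1)
        n = x.shape[-1]
        # log q(ordering | z): assumed uniform over the n! permutations,
        # which is natural when the base distribution is exchangeable.
        return z, -lgamma(n + 1)   # lgamma(n + 1) == log(n!)

    def inverse(self, z):
        # Stochastic direction: sample an arbitrary ordering.
        return rng.permutation(z, axis=-1)

layer = SortSurjection()
z, log_contrib = layer.forward(np.array([3.0, 1.0, 2.0]))
# z == [1.0, 2.0, 3.0]; log_contrib == -log(3!) == -log(6)
```

The log(n!) term is exactly the price of forgetting which of the n! input orderings produced the sorted output.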



Transforming Geospatial Ontologies by Homomorphisms

Guo, Xiuzhan, Huang, Wei, Luo, Min, Rangarajan, Priya

arXiv.org Artificial Intelligence

An ontology has been defined as an explicit specification of a conceptualization that provides the ways of thinking about a domain [14]. Ontologies have been called the silver bullet for many applications, such as database integration, peer-to-peer systems, and e-commerce [13]. A geospatial ontology is an ontology that implements a set of geospatial entities in a hierarchical structure [7, 10, 27, 28]. In the age of artificial intelligence, geospatial data, coming from multiple platforms and in many different types, is not only big, heterogeneous, and connected, but also changes continuously, which creates tremendous potential for dynamic relationships. Geospatial data, ontologies, and models must therefore be robust to these dynamic changes. Once mathematical operations, e.g., +, -, ×, and ÷, are introduced, natural numbers can be used not only to count but also to solve real-life problems. The set of natural numbers, together with these operations, forms an algebraic system that can be studied through its properties without any internal details of the numbers and operations. The operations establish relations among the natural numbers, which carry more meaning than isolated numbers do. Likewise, geospatial ontologies are not isolated but are connected by their relations.
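The algebraic analogy above can be made concrete: studying a system through its operations makes structure-preserving maps (homomorphisms) the natural way to relate two systems. The following small Python sketch (the map h and the two systems are illustrative choices, not taken from the paper) checks the homomorphism property h(a + b) = h(a) · h(b) for h(n) = 2**n, a map from the naturals under addition to the naturals under multiplication:

```python
def h(n: int) -> int:
    """Candidate homomorphism from (N, +) to (N, *): h(n) = 2**n."""
    return 2 ** n

def is_homomorphism(pairs) -> bool:
    """Check h(a + b) == h(a) * h(b) on sample pairs: the map must
    translate the source operation (+) into the target operation (*)."""
    return all(h(a + b) == h(a) * h(b) for a, b in pairs)

pairs = [(0, 0), (1, 2), (3, 5), (10, 7)]
print(is_homomorphism(pairs))   # True, since 2**(a+b) == 2**a * 2**b
```

The same pattern, applied to geospatial ontologies instead of numbers, is what lets one transform an ontology while preserving its relational structure.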


Funnels: Exact maximum likelihood with dimensionality reduction

Klein, Samuel, Raine, John A., Pina-Otey, Sebastian, Voloshynovskiy, Slava, Golling, Tobias

arXiv.org Machine Learning

Normalizing flows are diffeomorphic, and typically dimension-preserving, models trained by maximizing the likelihood of the data under the model. We use the SurVAE framework to construct dimension-reducing surjective flows via a new layer, known as the funnel. We demonstrate its efficacy on a variety of datasets and show that it improves upon or matches the performance of existing flows while having a reduced latent-space size. The funnel layer can be constructed from a wide range of transformations, including restricted convolution and feed-forward layers.
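The simplest dimension-reducing surjection in this spirit keeps a slice of the coordinates and models the discarded ones with a conditional density, whose log-density becomes the layer's lower-bound contribution. The NumPy sketch below is an assumption-heavy toy, not the paper's funnel layer: the class name is invented, and the conditional over dropped coordinates is a fixed standard normal rather than a learned, z-conditioned model.

```python
import numpy as np

rng = np.random.default_rng(0)

class SliceFunnel:
    """Toy dimension-reducing surjection: keep the first k of d
    coordinates; model the discarded ones with a standard normal."""

    def __init__(self, d: int, k: int):
        self.d, self.k = d, k

    def forward(self, x):
        z, dropped = x[..., :self.k], x[..., self.k:]
        # Lower-bound contribution: log q(dropped | z). A fixed,
        # z-independent standard normal is an assumption; in practice
        # this conditional is learned.
        ll = -0.5 * np.sum(dropped**2 + np.log(2 * np.pi), axis=-1)
        return z, ll

    def inverse(self, z):
        # Stochastic direction: sample the discarded coordinates.
        extra = rng.standard_normal(z.shape[:-1] + (self.d - self.k,))
        return np.concatenate([z, extra], axis=-1)

layer = SliceFunnel(d=4, k=2)
x = np.ones((3, 4))
z, ll = layer.forward(x)     # z has shape (3, 2)
x_rec = layer.inverse(z)     # shape restored to (3, 4)
```

Replacing the fixed normal with a flexible conditional, or the slice with a restricted convolution, recovers the kind of trainable funnel layer the abstract describes.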


SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

Nielsen, Didrik, Jaini, Priyank, Hoogeboom, Emiel, Winther, Ole, Welling, Max

arXiv.org Machine Learning

Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions. However, they both impose constraints on the models: Normalizing flows use bijective transformations to model densities whereas VAEs learn stochastic transformations that are non-invertible and thus typically do not provide tractable estimates of the marginal likelihood. In this paper, we introduce SurVAE Flows: A modular framework of composable transformations that encompasses VAEs and normalizing flows. SurVAE Flows bridge the gap between normalizing flows and VAEs with surjective transformations, wherein the transformations are deterministic in one direction -- thereby allowing exact likelihood computation, and stochastic in the reverse direction -- hence providing a lower bound on the corresponding likelihood. We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows. Finally, we introduce common operations such as the max value, the absolute value, sorting and stochastic permutation as composable layers in SurVAE Flows.