
A Prior of a Googol Gaussians: a Tensor Ring Induced Prior for Generative Models

Neural Information Processing Systems

Generative models produce realistic objects in many domains, including text, image, video, and audio synthesis. The most popular models, Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs), usually employ a standard Gaussian distribution as a prior. Previous works show that a richer family of prior distributions may help avoid the mode collapse problem in GANs and improve the evidence lower bound in VAEs. We propose a new family of prior distributions, the Tensor Ring Induced Prior (TRIP), that packs an exponential number of Gaussians into a high-dimensional lattice with a relatively small number of parameters. We show that these priors improve the Fréchet Inception Distance for GANs and the Evidence Lower Bound for VAEs. We also study generative models with TRIP in the conditional generation setup with missing conditions. Altogether, we propose a novel plug-and-play framework for generative models that can be utilized in any GAN- and VAE-like architecture.
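To make the "exponential number of Gaussians, small number of parameters" claim concrete, here is a minimal sketch of the tensor-ring idea, not the authors' implementation: the mixture weight of a lattice mode (s_1, ..., s_d) is parameterized as the trace of a product of learnable "core" slices, so a lattice with N**d components needs only d * N * r * r parameters. All names (`cores`, `mode_weight`) are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of a tensor-ring weight lattice for a TRIP-style prior.
# Weight of mode s = (s_1, ..., s_d) is w(s) ∝ Tr(Q_1[s_1] @ ... @ Q_d[s_d]),
# where Q_k is the k-th core with N slices of shape (r, r).

d, N, r = 10, 10, 4          # latent dims, modes per dim, tensor-ring rank
rng = np.random.default_rng(0)
cores = [rng.random((N, r, r)) for _ in range(d)]  # nonnegative core entries

def mode_weight(s, cores):
    """Unnormalized mixture weight of lattice mode s = (s_1, ..., s_d)."""
    M = np.eye(cores[0].shape[1])
    for k, core in zip(s, cores):
        M = M @ core[k]
    return np.trace(M)

# The lattice indexes N**d = 10**10 Gaussian components ("a googol" scale
# lattices follow by raising d), yet the ring stores only d*N*r*r parameters.
num_modes = N ** d           # 10_000_000_000 mixture components
num_params = d * N * r * r   # 1_600 learnable numbers
w = mode_weight((0,) * d, cores)
```

With nonnegative core entries the trace is nonnegative, so the weights can be normalized into a valid categorical distribution over modes.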



Reviews: A Prior of a Googol Gaussians: a Tensor Ring Induced Prior for Generative Models

Neural Information Processing Systems

I have read the author response and other reviews and decided to keep my original score of 7. Summary: The paper proposes a family of priors for GANs and VAEs. These priors are mixtures of Gaussians with a very large number of components that can nevertheless be represented with a small number of learnable parameters via a tensor ring decomposition. This family of priors enables efficient marginalization and conditioning. The method applies to both discrete and continuous latent variables. It also extends to conditional generative modeling; in particular, missing values in the conditioning variable can be marginalized out.
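The "efficient marginalization" the review highlights follows from linearity of the trace: summing the weight w(s) = Tr(Q_1[s_1] ... Q_d[s_d]) over all N**d modes factorizes into a product of per-core sums. The sketch below, a hedged illustration rather than the paper's code, checks this factorization against brute-force enumeration on a small lattice.

```python
import numpy as np
from itertools import product

# Hedged sketch: the normalizer Z = sum_s Tr(Q_1[s_1] @ ... @ Q_d[s_d])
# factorizes as Z = Tr((sum_s Q_1[s]) @ ... @ (sum_s Q_d[s])), costing
# O(d*N*r**2 + d*r**3) instead of O(N**d). Names here are illustrative.

d, N, r = 4, 3, 2
rng = np.random.default_rng(1)
cores = [rng.random((N, r, r)) for _ in range(d)]

# Efficient path: collapse each core over its mode index, then multiply.
M = np.eye(r)
for core in cores:
    M = M @ core.sum(axis=0)
Z_fast = np.trace(M)

# Brute force over all N**d = 81 modes, feasible only at this toy size.
Z_slow = 0.0
for s in product(range(N), repeat=d):
    P = np.eye(r)
    for k, core in zip(s, cores):
        P = P @ core[k]
    Z_slow += np.trace(P)

assert np.isclose(Z_fast, Z_slow)
```

Conditioning works the same way: fixing some indices s_k and summing out the rest again reduces to a product of small matrices.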


Reviews: A Prior of a Googol Gaussians: a Tensor Ring Induced Prior for Generative Models

Neural Information Processing Systems

The paper introduces a novel way of parameterizing a mixture of Gaussians with exponentially many modes by using the Tensor Train decomposition to capture the dependence between the mixing variables of the per-dimension 1D Gaussian mixtures. The resulting distribution, which supports efficient marginalization and conditioning, is then used as a prior in VAEs and GANs. The reviewers agreed that the idea is novel and interesting and that the paper is well written. In the rebuttal, the authors addressed some of the reviewers' concerns about the mismatch in the number of parameters between the proposed prior and the baseline priors. The main remaining weakness of the paper is the lack of baselines with strong priors (e.g.



A Prior of a Googol Gaussians: a Tensor Ring Induced Prior for Generative Models

Kuznetsov, Maxim, Polykovskiy, Daniil, Vetrov, Dmitry P., Zhebrak, Alex

Neural Information Processing Systems
