GILBO: One Metric to Measure Them All

Alexander A. Alemi, Ian Fischer

Neural Information Processing Systems

We propose a simple, tractable lower bound on the mutual information contained in the joint generative density of any latent variable generative model: the GILBO (Generative Information Lower BOund). It offers a data-independent measure of the complexity of the learned latent variable description, giving the log of the effective description length. It is well-defined for both VAEs and GANs. We compute the GILBO for 800 GANs and VAEs each trained on four datasets (MNIST, FashionMNIST, CIFAR-10 and CelebA) and discuss the results.
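For reference, the quantity the abstract describes is a Barber–Agakov-style variational lower bound on the mutual information of the generative joint p(x, z) = p(z) p(x|z), where e(z|x) is an auxiliary encoder trained to maximize the bound. A sketch of the definition, following the paper's description (notation is ours):

```latex
\mathrm{GILBO} \;=\; \max_{e}\; \mathbb{E}_{p(z)\,p(x \mid z)}\!\left[ \log \frac{e(z \mid x)}{p(z)} \right] \;\le\; I(X; Z)
```

Because the expectation is taken only over samples from the generative model itself, the bound requires no data, which is what makes the measure data-independent.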


Reviews: GILBO: One Metric to Measure Them All

Neural Information Processing Systems

Overall I think this is a very good paper, and it is one of the better papers I've seen addressing the evaluation of GANs. I myself am fairly skeptical of FID and have seen other works criticizing that approach, and this work sheds some light on the situation. I think anyone who follows this work will be better informed about how to evaluate GANs than by the works that introduced the Inception score or FID. That said, there is some missing discussion of, or comparison to, related work (notably mutual information neural estimation (MINE) by Belghazi et al., 2018), as well as some missing discussion of the inductive bias and boundedness of their estimator. I'd like to see a discussion of these things.




GILBO: One Metric to Measure Them All

Alemi, Alexander A., Fischer, Ian

arXiv.org Machine Learning

We propose a simple, tractable lower bound on the mutual information contained in the joint generative density of any latent variable generative model: the GILBO (Generative Information Lower BOund). It offers a data-independent measure of the complexity of the learned latent variable description, giving the log of the effective description length. It is well-defined for both VAEs and GANs. We compute the GILBO for 800 GANs and VAEs trained on MNIST and discuss the results.