f-GANs in an Information Geometric Nutshell
Nock, Richard, Cranko, Zac, Menon, Aditya K., Qu, Lizhen, Williamson, Robert C.
Neural Information Processing Systems
The $f$-GAN approach is elegant but falls short of a full description of the supervised game, and it says little about the key player, the generator: for example, what does the generator actually converge to if solving the GAN game means convergence in some space of parameters? How does that inform the generator's design, and how does it compare to the flourishing but almost exclusively experimental literature on the subject? In this paper, we unveil a broad class of distributions for which such convergence happens: deformed exponential families, a wide superset of exponential families. We show that current deep architectures are able to factorize a very large number of such densities using an especially compact design, displaying the power of deep architectures and their concinnity in the $f$-GAN game. This result holds under a sufficient condition on \textit{activation functions}, which turns out to be satisfied by popular choices.
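For context on the central object named above, a standard (Naudts-style) definition of a deformed exponential family is sketched below; the paper's own parameterization may differ in its details. Given a deformation function $\chi$, one defines the deformed logarithm, its inverse the deformed exponential, and the corresponding family:
\[
\log_\chi(z) \;\doteq\; \int_1^z \frac{\mathrm{d}t}{\chi(t)}, \qquad
\exp_\chi \;\doteq\; \log_\chi^{-1}, \qquad
p_\theta(x) \;=\; \exp_\chi\!\big(\theta^\top \phi(x) - \psi(\theta)\big),
\]
with natural parameter $\theta$, sufficient statistic $\phi$ and normalizer $\psi$. Taking $\chi(t) = t$ recovers the ordinary exponential families, which is why deformed exponential families form a wide superset of them.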