Reviews: Flexible and accurate inference and learning for deep generative models
Neural Information Processing Systems
This paper presents an alternative to variational autoencoders and other latent-variable generative models trained with the wake-sleep algorithm. The main problem with wake-sleep is its bias: the recognition model has different conditional independencies than the generative model, and it is trained to optimize a different objective. The proposed DDC-HM (Distributed Distributional Code Helmholtz Machine) addresses this by instead working with expectations of sufficient statistics, using them to implicitly define the maximum-entropy distribution consistent with those statistics; the statistics themselves are chosen as random nonlinear functions. The method is evaluated on synthetic data and two small vision datasets (image patches and MNIST), comparing against two baselines under the MMD metric. I don't know the related work well enough to evaluate the novelty with confidence.
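For context on the evaluation metric mentioned above: maximum mean discrepancy (MMD) compares two sample sets via a kernel, with value near zero when the samples come from the same distribution. The review does not specify the kernel or estimator the authors used; the following is a minimal sketch of the standard unbiased MMD² estimator with an RBF kernel, with all names and the `gamma` bandwidth chosen here for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd2_unbiased(X, Y, gamma=1.0):
    """Unbiased estimate of squared MMD between sample sets X and Y."""
    m, n = len(X), len(Y)
    Kxx = rbf_kernel(X, X, gamma)
    Kyy = rbf_kernel(Y, Y, gamma)
    Kxy = rbf_kernel(X, Y, gamma)
    # Drop diagonal (self-similarity) terms for the unbiased estimator
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * Kxy.mean()

rng = np.random.default_rng(0)
# Same distribution: MMD^2 near zero; shifted distribution: clearly positive
same = mmd2_unbiased(rng.normal(size=(500, 2)), rng.normal(size=(500, 2)))
diff = mmd2_unbiased(rng.normal(size=(500, 2)),
                     rng.normal(2.0, 1.0, size=(500, 2)))
```

A lower MMD between model samples and held-out data indicates a better generative fit, which is how the baselines are compared here.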