Review for NeurIPS paper: Stochastic Normalizing Flows
Additional Feedback:

- The abstract is a bit long and would likely benefit from being condensed.
- It might also be worthwhile to separate the title from the paragraph text rather than joining them.
- Why not make the base distribution p_Z? That is, p_Z → p_X under F (see the first sketch below).
- Lines 50-52: although the slash is used to distinguish between two different cases, it is ambiguous, since the expression could also be read as the ratio of two KL divergences, or as the ratio of two densities.
- On relating statistical physics to more classic ML: as you've promised, it would be nice to include a latent-variable/variational-bound interpretation (as Sohl-Dickstein et al. 2015, "Deep Unsupervised Learning using Nonequilibrium Thermodynamics", do; see the second sketch below), and perhaps also link to Deep Latent Gaussian Models (Rezende et al. 2014, "Stochastic Backpropagation and Approximate Inference in Deep Generative Models"; Kingma & Welling 2014, "Auto-Encoding Variational Bayes").
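On the base-distribution point: a minimal sketch of the change-of-variables convention we have in mind (notation ours, not necessarily the paper's; F denotes the deterministic flow and p_Z the base density):

```latex
% Sketch (reviewer notation): a flow F transports the base density p_Z
% to the model density p_X via the change-of-variables formula.
x = F(z), \quad z \sim p_Z, \qquad
p_X(x) \;=\; p_Z\!\bigl(F^{-1}(x)\bigr)\,
\left| \det \frac{\partial F^{-1}(x)}{\partial x} \right|
```

Writing p_Z → p_X this way makes the direction of the flow explicit in the notation.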
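On the variational-bound suggestion: a sketch of the standard bound we mean (ELBO notation ours; in the stochastic-flow reading, the intermediate states of the sampling path would play the role of the latent variable z, as in Sohl-Dickstein et al. 2015):

```latex
% Standard evidence lower bound (ELBO), as in Rezende et al. 2014 and
% Kingma & Welling 2014; q_phi denotes the approximate posterior.
\log p_\theta(x)
\;\ge\;
\mathbb{E}_{q_\phi(z \mid x)}\bigl[\log p_\theta(x \mid z)\bigr]
\;-\;
\mathrm{KL}\bigl(q_\phi(z \mid x) \,\big\|\, p(z)\bigr)
```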