Supplementary Material for: Recursive Inference for Variational Autoencoders

Neural Information Processing Systems 

IAF: The autoregressive flow encoder model for q(z|x) [5], which is more expressive than the VAE's Gaussian posterior encoder. The number of flows is chosen from {1, 2, 4, 8}.

HF: The Householder flow encoder model, which represents a full-covariance posterior via Householder transformations [18]. The number of flows is chosen from {1, 2, 4, 8}.

ME: As a baseline comparison, we also consider the same mixture encoder model, but unlike our recursive mixture learning, it is trained conventionally, end-to-end: all mixture components' parameters are updated simultaneously. The number of mixture components is chosen from {2, 3, 4, 5}.

RME: Our proposed recursive mixture encoder model. We vary the number of components to be added, M, over {1, 2, 3, 4}, yielding mixture orders 2 to 5. In addition, we test a variant of RME that employs the entropy regularization schemes of previous Boosted VI methods.
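To make the IAF baseline concrete, the following is a minimal NumPy sketch of a single inverse autoregressive flow step. The masked affine parameterization, the dimensionality D, and the small random weights are illustrative assumptions, not the architecture or hyperparameters used in the experiments. Because (mu_i, sigma_i) depend only on z_{<i}, the Jacobian is triangular and its log-determinant is just the sum of log-scales.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # latent dimensionality (illustrative)

def iaf_step(z, W_m, W_s, b_m, b_s):
    """One IAF step z' = sigma * z + mu, where (mu_i, sigma_i) depend
    only on z_{<i} via strictly lower-triangular masked weights, so the
    Jacobian of z' w.r.t. z is triangular with diagonal sigma."""
    mask = np.tril(np.ones((D, D)), k=-1)   # strict lower triangle
    mu = (W_m * mask) @ z + b_m
    sigma = np.exp((W_s * mask) @ z + b_s)  # positive scales
    z_new = sigma * z + mu
    log_det = np.sum(np.log(sigma))         # log |det dz'/dz|
    return z_new, log_det

# Hypothetical flow parameters (small random weights for illustration).
params = ([0.1 * rng.standard_normal((D, D)) for _ in range(2)]
          + [0.1 * rng.standard_normal(D) for _ in range(2)])
z = rng.standard_normal(D)
z1, ld = iaf_step(z, *params)
```

Stacking several such steps (the {1, 2, 4, 8} grid above) composes the transformation, with the per-step log-determinants adding up in the ELBO's density correction.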
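The HF baseline's mechanism can be sketched in a few lines of NumPy: a reparameterized diagonal-Gaussian sample is passed through a chain of Householder reflections, which rotates the diagonal covariance into a full one while preserving volume. The dimensionality, the two-flow depth, and the random reflection vectors below are illustrative assumptions, not the experimental configuration.

```python
import numpy as np

def householder(v):
    """Householder reflection H = I - 2 v v^T / ||v||^2; H is orthogonal."""
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

rng = np.random.default_rng(0)
D = 4  # latent dimensionality (illustrative)
mu = rng.standard_normal(D)
log_sigma = 0.1 * rng.standard_normal(D)
vs = [rng.standard_normal(D) for _ in range(2)]  # a 2-flow chain

# Reparameterized diagonal-Gaussian sample, then Householder reflections.
z = mu + np.exp(log_sigma) * rng.standard_normal(D)
for v in vs:
    z = householder(v) @ z

# The composed orthogonal map H turns the diagonal covariance into a
# full one at zero cost to the log-density (|det H| = 1).
H = householder(vs[1]) @ householder(vs[0])
cov = H @ np.diag(np.exp(2.0 * log_sigma)) @ H.T
```

Since each reflection is volume-preserving, the flow adds no log-determinant term to the ELBO, which is what makes this a cheap route to a full-covariance posterior.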
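The training-protocol difference between ME and RME can be sketched schematically. In the toy Python below, `toy_grad_step` is a hypothetical stand-in for an ELBO gradient step (not the paper's actual objective), and the component parameterization is an illustrative diagonal Gaussian; the point is only the control flow: ME updates all components jointly, while the recursive scheme adds M components one at a time and trains only the newest.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 2  # latent dimensionality (illustrative)

def new_component():
    # A mixture component's variational parameters (diagonal Gaussian).
    return {"mu": rng.standard_normal(D), "log_sigma": np.zeros(D)}

def toy_grad_step(comp, lr=0.1):
    # Hypothetical stand-in for one gradient step on this component's
    # parameters; the real update would descend the recursive ELBO.
    comp["mu"] = comp["mu"] - lr * comp["mu"]

# ME (baseline): all components exist from the start, updated jointly.
me_mixture = [new_component() for _ in range(4)]
for comp in me_mixture:
    toy_grad_step(comp)

# RME (schematic): fit the base encoder, then add M components one at
# a time, training only the newest while earlier ones stay fixed.
rme_mixture = [new_component()]
toy_grad_step(rme_mixture[0])
base_mu = rme_mixture[0]["mu"].copy()
M = 3  # components to add -> mixture order M + 1
for _ in range(M):
    comp = new_component()
    toy_grad_step(comp)  # only the new component is updated
    rme_mixture.append(comp)
```

With M = 3 this yields a mixture of order 4, matching the {1, 2, 3, 4} grid (orders 2 to 5) described above.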