How to Approximate Inference with Subtractive Mixture Models

Zellinger, Lena, Branchini, Nicola, De Smet, Lennert, Elvira, Víctor, Malkin, Nikolay, Vergari, Antonio

arXiv.org Machine Learning

Classical mixture models (MMs) are widely used tractable proposals for approximate inference settings such as variational inference (VI) and importance sampling (IS). Recently, mixture models with negative coefficients, called subtractive mixture models (SMMs), have been proposed as a potentially more expressive alternative. However, how to effectively use SMMs for VI and IS is still an open question as they do not provide latent variable semantics and therefore cannot use sampling schemes for classical MMs. In this work, we study how to circumvent this issue by designing several expectation estimators for IS and learning schemes for VI with SMMs, and we empirically evaluate them for distribution approximation. Finally, we discuss the additional challenges in estimation stability and learning efficiency that they carry and propose ways to overcome them. Code is available at: https://github.com/april-tools/delta-vi.
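To make the core difficulty concrete: a subtractive mixture assigns a negative weight to some component, so the usual "sample a component index, then sample from it" scheme breaks down, but the density can still be evaluated pointwise and used as an importance-sampling target. The sketch below is not the paper's estimator; it is a minimal self-normalized IS example with a hypothetical two-component SMM density (weights 1.2 and -0.2, chosen so the density stays nonnegative) and an ordinary Gaussian proposal.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_pdf(x, mu, sig):
    """Univariate normal density N(x; mu, sig^2)."""
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

def smm_density(x):
    """Hypothetical subtractive mixture: 1.2*N(0,1) - 0.2*N(0,0.5^2).
    The signed weights sum to 1 and the difference is nonnegative everywhere,
    so this is a valid density -- but it has no latent-variable sampling scheme."""
    return 1.2 * gauss_pdf(x, 0.0, 1.0) - 0.2 * gauss_pdf(x, 0.0, 0.5)

def is_expectation(f, n=200_000):
    """Estimate E_p[f(X)] under the SMM p via self-normalized importance
    sampling: draw from a tractable proposal q that covers the support,
    weight each draw by p(x)/q(x), and return the weighted average of f."""
    x = rng.normal(loc=0.0, scale=2.0, size=n)       # proposal q = N(0, 2^2)
    w = smm_density(x) / gauss_pdf(x, 0.0, 2.0)      # importance weights
    return np.sum(w * f(x)) / np.sum(w)

# By symmetry E_p[X] = 0, and E_p[X^2] = 1.2*1 - 0.2*0.25 = 1.15.
print(is_expectation(lambda x: x))
print(is_expectation(lambda x: x * x))
```

Note that nothing here requires the signed weights themselves to be sampled from; only pointwise evaluation of the SMM density is needed, which is exactly why IS is a natural fit for these models.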


Mining GOLD Samples for Conditional GANs

Sangwoo Mo, Chiheon Kim, Sungwoong Kim, Minsu Cho, Jinwoo Shin

Neural Information Processing Systems

Training GANs (including cGANs), however, is known to be hard and highly unstable [46]. Numerous techniques have thus been proposed to tackle the issue from different angles, e.g., improving architectures [32, 56, 7], losses and regularizers [16, 38, 20], and other training heuristics [46, 51, 8].