A General Method for Amortizing Variational Filtering

Neural Information Processing Systems

We introduce the variational filtering EM algorithm, a simple, general-purpose method for performing variational inference in dynamical latent variable models using information from only past and present variables, i.e. filtering. The algorithm is derived from the variational objective in the filtering setting and consists of an optimization procedure at each time step. By performing each inference optimization procedure with an iterative amortized inference model, we obtain a computationally efficient implementation of the algorithm, which we call amortized variational filtering. We present experiments demonstrating that this general-purpose method improves inference performance across several recent deep dynamical latent variable models.
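The per-time-step inference optimization described above can be sketched in a few lines. Everything below is an illustrative assumption, not the paper's implementation: a toy linear-Gaussian dynamical model, made-up parameters, and a plain gradient-descent inner loop standing in for the learned iterative amortized inference model.

```python
import numpy as np

# Toy linear-Gaussian dynamical model (illustrative parameters):
#   z_t ~ N(a * z_{t-1}, q)   latent dynamics
#   x_t ~ N(c * z_t, r)       observation model
a, c, q, r = 0.9, 1.0, 0.1, 0.5

def free_energy_grad(mu, mu_prev, x):
    """Gradient of the per-step negative ELBO w.r.t. the variational mean mu
    (a fixed variational variance is omitted for brevity)."""
    prior_term = (mu - a * mu_prev) / q        # from KL to the prior p(z_t | z_{t-1})
    likelihood_term = -c * (x - c * mu) / r    # from the reconstruction term
    return prior_term + likelihood_term

def variational_filter(xs, n_steps=50, lr=0.05):
    """At each time step, run an inner optimization of the variational
    posterior using only past and present observations (filtering)."""
    mus, mu_prev = [], 0.0
    for x in xs:
        mu = a * mu_prev                 # initialize from the prior prediction
        for _ in range(n_steps):         # inner loop: an amortized inference
            mu -= lr * free_energy_grad(mu, mu_prev, x)  # model would learn this update
        mus.append(mu)
        mu_prev = mu
    return np.array(mus)

print(variational_filter(np.array([0.5, 0.7, 0.6, 0.9])))
```

The outer loop is the filtering EM structure from the abstract; replacing the inner gradient steps with a learned update network is what makes the procedure amortized.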




f-Divergence Variational Inference

Neural Information Processing Systems

For decades, the dominant paradigm for approximate Bayesian inference p(z|x) = p(z,x)/p(x) has been Markov chain Monte Carlo (MCMC) algorithms, which estimate the evidence p(x) = ∫ p(z,x) dz via sampling. However, since sampling tends to be a slow and computationally intensive process, these sampling-based approximate inference methods fall short on modern probabilistic machine learning problems, which usually involve very complex models, high-dimensional feature spaces, and large datasets.
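A minimal sketch of why sampling-based evidence estimation is costly: a plain Monte Carlo estimate of p(x) = ∫ p(z,x) dz for a toy conjugate Gaussian model, where the exact evidence is available for comparison. The model, observation, and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model (illustrative): z ~ N(0, 1), x | z ~ N(z, 1).
# The evidence p(x) = ∫ p(z, x) dz is N(x; 0, 2) in closed form,
# which lets us check the sampling-based estimate.
x = 1.0

# Plain Monte Carlo: draw z from the prior and average the likelihood
# p(x | z). The error shrinks only as O(1 / sqrt(N)), which is why
# sampling-based evidence estimation scales poorly to complex,
# high-dimensional models.
n = 200_000
z = rng.standard_normal(n)
likelihood = np.exp(-0.5 * (x - z) ** 2) / np.sqrt(2 * np.pi)
estimate = likelihood.mean()

exact = np.exp(-0.25 * x ** 2) / np.sqrt(4 * np.pi)  # density of N(0, 2) at x
print(estimate, exact)
```

Even this one-dimensional toy needs hundreds of thousands of samples for two to three digits of accuracy; variational methods instead turn the same problem into an optimization.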



2 Background: Diffusion models [53] are latent variable models of the form pθ(x0) := ∫ pθ(x0:T) dx1:T

Neural Information Processing Systems

We show that diffusion models actually are capable of generating high quality samples, sometimes better than the published results on other types of generative models (Section 4). In addition, we show that a certain parameterization of diffusion models reveals an equivalence with denoising score matching over multiple noise levels during training and with annealed Langevin dynamics during sampling (Section 3.2) [55, 61].
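The latent-variable structure and the noise-level schedule mentioned above can be illustrated with the forward (noising) process. The linear schedule and all constants below are illustrative assumptions, and the denoising network itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward (noising) process of a diffusion model. Given a variance schedule
# beta_1..beta_T (the linear schedule below is illustrative), q(x_t | x_0)
# has the closed form N(sqrt(abar_t) * x_0, (1 - abar_t) * I),
# where abar_t is the cumulative product of (1 - beta).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas_bar = np.cumprod(1.0 - betas)

def q_sample(x0, t, eps):
    """Draw x_t ~ q(x_t | x_0) by reparameterization with noise eps."""
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * eps

x0 = np.array([1.0, -0.5])
eps = rng.standard_normal(2)       # the regression target for a noise predictor
xt = q_sample(x0, 500, eps)

# By the final step almost no signal remains, so x_T is close to pure noise:
print(xt, alphas_bar[-1])
```

Training a network to predict eps from x_t at each noise level is what connects this parameterization to denoising score matching over multiple noise levels.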



A Detail of Architectures and Experimental Settings

Neural Information Processing Systems

A.1 Experimental Setting for Skin Lesion Classification Task: the results are shown in Table 1 (a). A.2 Experimental Setting for Spinal Cord Gray Matter Segmentation Task: the results are shown in Table 1 (b).



A Rigorous Link between Deep Ensembles and (Variational) Bayesian Methods

Neural Information Processing Systems

We establish the first mathematically rigorous link between Bayesian, variational Bayesian, and ensemble methods. A key step towards this is to reformulate the non-convex optimisation problem typically encountered in deep learning as a convex optimisation problem in the space of probability measures.
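A sketch of the standard lifting argument behind such reformulations (the notation, the regulariser λ, and the reference measure π are illustrative, not necessarily the paper's construction): minimising a non-convex loss L over parameters θ is equivalent to minimising a linear, hence convex, functional over probability measures μ on the parameter space.

```latex
% Lifting a non-convex loss to the space of probability measures:
\min_{\theta \in \Theta} L(\theta)
  \;=\; \min_{\mu \in \mathcal{P}(\Theta)} \int_{\Theta} L(\theta)\, \mathrm{d}\mu(\theta),
% since the right-hand objective is linear in \mu and its optimum
% concentrates on minimisers of L.

% Adding an entropic (KL) regulariser against a reference measure \pi
% yields a strictly convex problem with a Gibbs-measure minimiser:
\min_{\mu \in \mathcal{P}(\Theta)}
  \int_{\Theta} L(\theta)\, \mathrm{d}\mu(\theta)
  + \lambda\, \mathrm{KL}(\mu \,\|\, \pi)
\quad\Longrightarrow\quad
\mu^{*}(\mathrm{d}\theta) \;\propto\; e^{-L(\theta)/\lambda}\, \pi(\mathrm{d}\theta).
```

The regularised form is what connects the lifted problem to variational Bayesian objectives, while mixtures of point masses in the same space correspond to ensembles.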