Gradient Boosted Normalizing Flows

Neural Information Processing Systems

By chaining a sequence of differentiable invertible transformations, normalizing flows (NF) provide an expressive method of posterior approximation, exact density evaluation, and sampling. The trend in the normalizing flow literature has been to devise deeper, more complex transformations to achieve greater flexibility. We propose an alternative: Gradient Boosted Normalizing Flows (GBNF) model a density by successively adding new NF components with gradient boosting. Under the boosting framework, each new NF component optimizes a weighted likelihood objective, resulting in new components that are fit to the residuals of the previously trained components. The GBNF formulation results in a mixture model structure, whose flexibility increases as more components are added. Moreover, GBNFs offer a wider, as opposed to strictly deeper, approach that improves existing NFs at the cost of additional training rather than more complex transformations. We demonstrate the effectiveness of this technique for density estimation and, by coupling GBNF with a variational autoencoder, for generative modeling of images. Our results show that GBNFs outperform their non-boosted analog and, in some cases, produce better results with smaller, simpler flows.
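To make the boosting loop concrete, below is a minimal sketch in PyTorch. This is not the authors' code: names such as AffineFlow, fit_component, and mixture_log_prob are illustrative; the residual weights follow the functional-gradient view of boosted density estimation (each point is reweighted by the inverse of the current mixture density); and the uniform mixture weights stand in for GBNF's component-weight updates. With a single affine layer each component reduces to a Gaussian, which is just enough structure to show the loop.

```python
# Hypothetical sketch of gradient-boosted flows for 1-D density estimation.
import math
import torch

class AffineFlow(torch.nn.Module):
    """One invertible affine transform z = (x - mu) / sigma, standard-normal base."""
    def __init__(self):
        super().__init__()
        self.mu = torch.nn.Parameter(torch.zeros(1))
        self.log_sigma = torch.nn.Parameter(torch.zeros(1))

    def log_prob(self, x):
        # Change of variables: log p(x) = log N(z; 0, 1) + log |dz/dx|.
        z = (x - self.mu) * torch.exp(-self.log_sigma)
        log_base = -0.5 * z**2 - 0.5 * math.log(2 * math.pi)
        return log_base - self.log_sigma

def fit_component(x, weights, steps=500, lr=5e-2):
    """Fit one new flow component to the weighted likelihood objective."""
    flow = AffineFlow()
    opt = torch.optim.Adam(flow.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = -(weights * flow.log_prob(x)).mean()
        loss.backward()
        opt.step()
    return flow

def mixture_log_prob(flows, rhos, x):
    """log of the mixture density sum_k rho_k * g_k(x), computed stably."""
    logs = torch.stack([f.log_prob(x) for f in flows])
    return torch.logsumexp(torch.log(rhos)[:, None] + logs, dim=0)

torch.manual_seed(0)
# Toy bimodal data that a single affine flow cannot capture alone.
x = torch.cat([torch.randn(500) - 3.0, torch.randn(500) + 3.0])

flows, rhos = [], None
for k in range(3):
    if flows:
        with torch.no_grad():
            # Residual weights 1 / g_{k-1}(x_i): points the current mixture
            # explains poorly get more weight.
            w = torch.exp(-mixture_log_prob(flows, rhos, x))
            w = w / w.mean()  # normalize for optimizer stability
    else:
        w = torch.ones_like(x)
    flows.append(fit_component(x, w))
    # Uniform mixture weights for simplicity; GBNF also updates these.
    rhos = torch.full((len(flows),), 1.0 / len(flows))

with torch.no_grad():
    print("avg log-likelihood:", mixture_log_prob(flows, rhos, x).mean().item())
```

Swapping AffineFlow for a deeper flow (e.g., stacked coupling layers) recovers the "wider, not deeper" trade-off the abstract describes: each boosting round adds a parallel component rather than extending a single chain.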




Author Response

Neural Information Processing Systems

We want to thank the reviewers greatly for the time and effort put into these reviews and their detailed critique, and we appreciate your help in presenting our work as best as possible. We are glad the reviewers found (R4) that the "conceptual difference to previous research is a big strength," and that our "wider" approach to building normalizing flow-based models is more than just a way to improve performance, noting that our work uncovers challenges unique to boosting on normalizing flows: only analytically invertible flows can be boosted for variational inference (Section 5.1 and Figure 2). In regards to R1 and R3's critique on further differentiating our work from boosted density estimation [Rosset-Segal, '02] and boosted generative models [Grover-Ermon, '18]: we show that the change-of-variables formula can be applied recursively. We felt that proofs of boosting's expressiveness were outside the scope of this work. R2 writes that "there are two bottlenecks in NF expressivity--the base distribution..." We appreciate the reviewers taking the time to check for correctness, and we stand by Eq. (10).
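For context on the recursion referenced above, the following is the standard change-of-variables identity for an invertible flow, together with the mixture update that adding a K-th boosted component induces. The notation is generic (rho_K is the new component's weight) and is not taken from the paper's Eq. (10).

```latex
% Change of variables for an invertible flow z = f(x) with base density p_Z,
% and the mixture formed when a K-th boosted component is added.
\[
  p_X(x) \;=\; p_Z\!\big(f(x)\big)\,
  \left|\det \frac{\partial f(x)}{\partial x}\right|,
  \qquad
  g^{(K)}(x) \;=\; (1-\rho_K)\,g^{(K-1)}(x) \;+\; \rho_K\,p_X^{(K)}(x).
\]
```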

