Further analysis of multilevel Stein variational gradient descent with an application to the Bayesian inference of glacier ice models
Alsup, Terrence, Hartland, Tucker, Peherstorfer, Benjamin, Petra, Noemi
–arXiv.org Artificial Intelligence
Bayesian inference is a ubiquitous and flexible tool for updating a belief (i.e., learning) about a quantity of interest when data are observed, which ultimately can be used to inform downstream decision-making. In particular, Bayesian inverse problems allow one to derive knowledge from data through the lens of physics-based models. These problems can be formulated as follows: given observational data, a physics-based model, and prior information about the model inputs, find a posterior probability distribution for the inputs that reflects the knowledge about the inputs in terms of the observed data and prior. Typically, the physics-based models are given in the form of an input-to-observation map that is based on a system of partial differential equations (PDEs). The computational task underlying Bayesian inference is approximating posterior probability distributions to compute expectations and to quantify uncertainties. There are multiple ways of computationally exploring posterior distributions to gain insights, ranging from Markov chain Monte Carlo to variational methods [24, 42, 28]. In this work, we make use of Stein variational gradient descent (SVGD) [32], a particle-based variational inference method, to approximate posterior distributions. It builds on Stein's identity to formulate an update step for the particles that can be realized numerically in an efficient manner via evaluations of a kernel function and of the gradient of the log-posterior at the particles.
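For concreteness, the following is a minimal sketch of the generic SVGD particle update of Liu & Wang [32] in plain NumPy, with an RBF kernel and a toy Gaussian target. It is not the multilevel variant or the glacier ice model studied in the paper; the fixed kernel bandwidth, step size, and target are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, h):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / h) and its gradients w.r.t. x_i."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / h)
    # grad_K[i, j, :] = d/dx_i k(x_i, x_j)
    grad_K = -2.0 / h * (X[:, None, :] - X[None, :, :]) * K[:, :, None]
    return K, grad_K

def svgd_step(X, grad_log_p, step_size=1e-1, h=1.0):
    """One SVGD update:
    x_i <- x_i + eps * (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # First term drives particles toward high posterior density (kernel-weighted score);
    # second term acts as a repulsive force that keeps particles spread out.
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step_size * phi

# Toy example: approximate a standard 2-D Gaussian "posterior".
def grad_log_gaussian(X):
    return -X  # score of N(0, I)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) + 5.0  # particles initialized away from the target
for _ in range(500):
    X = svgd_step(X, grad_log_gaussian)
print("particle mean:", X.mean(axis=0))  # should approach [0, 0]
```

In practice the bandwidth h is often chosen adaptively (e.g., by the median heuristic) and the score is supplied by the PDE-based forward model and its adjoint; both are fixed here only to keep the sketch self-contained.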
Apr-29-2023