Random Feature Stein Discrepancies
Jonathan Huggins, Lester Mackey
Neural Information Processing Systems
Computable Stein discrepancies have been deployed for a variety of applications, ranging from sampler selection in posterior inference to approximate Bayesian inference to goodness-of-fit testing. Existing convergence-determining Stein discrepancies admit strong theoretical guarantees but suffer from a computational cost that grows quadratically in the sample size. While linear-time Stein discrepancies have been proposed for goodness-of-fit testing, they exhibit avoidable degradations in testing power, even when power is explicitly optimized. To address these shortcomings, we introduce feature Stein discrepancies (ΦSDs), a new family of quality measures that can be cheaply approximated using importance sampling. We show how to construct ΦSDs that provably determine the convergence of a sample to its target and develop high-accuracy approximations, random ΦSDs (RΦSDs), which are computable in near-linear time. In our experiments with sampler selection for approximate posterior inference and goodness-of-fit testing, RΦSDs perform as well or better than quadratic-time KSDs while being orders of magnitude faster to compute.
- Country:
  - North America > United States
    - New York (0.14)
    - Rhode Island (0.14)
- Genre:
  - Research Report > New Finding (0.48)