Stein Points
Wilson Ye Chen, Lester Mackey, Jackson Gorham, François-Xavier Briol, Chris J. Oates
An important task in computational statistics and machine learning is to approximate a posterior distribution $p(x)$ with an empirical measure supported on a set of representative points $\{x_i\}_{i=1}^n$. This paper focuses on methods where the selection of points is essentially deterministic, with an emphasis on achieving accurate approximation when $n$ is small. To this end, we present "Stein Points". The idea is to exploit either a greedy or a conditional gradient method to iteratively minimise a kernel Stein discrepancy between the empirical measure and $p(x)$. Our empirical results demonstrate that Stein Points enable accurate approximation of the posterior at modest computational cost. In addition, theoretical results are provided to establish convergence of the method.
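To illustrate the greedy variant described in the abstract, the following is a minimal sketch of sequentially minimising a kernel Stein discrepancy: each new point is chosen to minimise the KSD of the augmented point set. The IMQ base kernel, the standard Gaussian target, the grid-search optimiser and all parameter values (`c`, `beta`, grid bounds) are illustrative assumptions, not the paper's exact settings.

```python
# Hedged sketch of greedy Stein Point selection for a toy 2-d Gaussian target.
# Assumptions: IMQ base kernel, standard Gaussian score, grid-search argmin.
import numpy as np

beta, c, d = -0.5, 1.0, 2            # IMQ kernel k(x, y) = (c^2 + ||x - y||^2)^beta

def score(x):                        # grad log p(x) for a standard Gaussian target
    return -x

def stein_kernel(x, y):
    """Langevin Stein kernel k_0(x, y) built from the IMQ base kernel."""
    r = x - y
    s = c**2 + r @ r
    k = s**beta
    gx = 2.0 * beta * r * s**(beta - 1)              # grad_x k(x, y)
    gy = -gx                                         # grad_y k(x, y)
    div_xy = (-2.0 * beta * d * s**(beta - 1)
              - 4.0 * beta * (beta - 1) * (r @ r) * s**(beta - 2))
    sx, sy = score(x), score(y)
    return div_xy + gx @ sy + gy @ sx + k * (sx @ sy)

def greedy_stein_points(n, grid):
    """Add one point at a time, each minimising the KSD of the augmented set."""
    points = []
    for _ in range(n):
        # terms of the squared KSD that depend on the candidate x:
        # k_0(x, x)/2 + sum_i k_0(x, x_i), minimised over a fixed candidate grid
        obj = [0.5 * stein_kernel(x, x) + sum(stein_kernel(x, xi) for xi in points)
               for x in grid]
        points.append(grid[int(np.argmin(obj))])
    return np.array(points)

# Candidate grid over [-3, 3]^2 and 20 greedy Stein Points
axis = np.linspace(-3, 3, 41)
grid = np.array([[a, b] for a in axis for b in axis])
print(greedy_stein_points(20, grid))
```

In practice the grid search would be replaced by a numerical optimiser, and the conditional gradient variant replaces the greedy argmin step with a Frank-Wolfe-style update.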
Mar-27-2018
- Country:
- Europe > United Kingdom
- England (0.14)
- North America > United States
- California (0.14)
- Genre:
- Research Report > New Finding (0.48)