Efficient Learning of Stationary Diffusions with Stein-type Discrepancies
Bleile, Fabian, Lumpp, Sarah, Drton, Mathias
Learning a stationary diffusion amounts to estimating the parameters of a stochastic differential equation whose stationary distribution matches a target distribution. We build on the recently introduced kernel deviation from stationarity (KDS), which enforces stationarity by evaluating expectations of the diffusion's generator in a reproducing kernel Hilbert space. Leveraging the connection between KDS and Stein discrepancies, we introduce the Stein-type KDS (SKDS) as an alternative formulation. We prove that a vanishing SKDS guarantees alignment of the learned diffusion's stationary distribution with the target. Furthermore, under broad parametrizations, SKDS is convex with an empirical version that is $\epsilon$-quasiconvex with high probability. Empirically, learning with SKDS attains accuracy comparable to KDS while substantially reducing computational cost, and yields improvements over the majority of competing baselines.
Jan-30-2026
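The abstract's link between stationarity criteria and Stein discrepancies rests on the standard kernelized Stein discrepancy, which measures how far a sample is from a target distribution using only the target's score function and a reproducing kernel. The sketch below is not the paper's SKDS; it is a minimal, illustrative V-statistic estimator of the squared kernel Stein discrepancy for a standard normal target with an RBF kernel, where the function name and bandwidth parameter are assumptions for illustration.

```python
import numpy as np

def ksd_rbf_gaussian(x, h=1.0):
    """Illustrative V-statistic estimate of the squared kernel Stein
    discrepancy between samples x of shape (n, d) and a standard normal
    target, using the RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)).

    For score s(x) = -x (standard normal), the Stein kernel is
    u(x, y) = k(x, y) * ( s(x).s(y) + (s(x) - s(y)).(x - y) / h^2
                          + d / h^2 - ||x - y||^2 / h^4 ).
    """
    x = np.atleast_2d(x)
    n, d = x.shape
    diff = x[:, None, :] - x[None, :, :]      # pairwise differences, (n, n, d)
    sq = np.sum(diff**2, axis=-1)             # squared distances ||x_i - x_j||^2
    k = np.exp(-sq / (2.0 * h**2))            # RBF kernel matrix
    s = -x                                    # score of N(0, I) at each sample
    ss = s @ s.T                              # inner products s(x_i).s(x_j)
    # (s(x_i) - s(x_j)) . (x_i - x_j) for every pair (i, j)
    sdiff = (np.einsum('ik,ijk->ij', s, diff)
             - np.einsum('jk,ijk->ij', s, diff))
    u = k * (ss + sdiff / h**2 + d / h**2 - sq / h**4)
    return u.mean()                           # V-statistic (biased, nonnegative)
```

Because the Stein kernel `u` is positive semidefinite, the V-statistic is nonnegative, and it shrinks toward zero when the samples actually come from the target; samples from a shifted distribution yield a larger value.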