DeepWKB: Learning WKB Expansions of Invariant Distributions for Stochastic Systems

Yao Li, Yicheng Liu, Shirou Wang

arXiv.org Artificial Intelligence 

This paper introduces a novel deep learning method, called DeepWKB, for estimating the invariant distribution of randomly perturbed systems via its Wentzel-Kramers-Brillouin (WKB) approximation $u_ε(x) = Q(ε)^{-1} Z_ε(x) \exp\{-V(x)/ε\}$, where $V$ is known as the quasi-potential, $ε$ denotes the noise strength, and $Q(ε)$ is the normalization factor. By utilizing both Monte Carlo data and the partial differential equations satisfied by $V$ and $Z_ε$, the DeepWKB method computes $V$ and $Z_ε$ separately. This enables an approximation of the invariant distribution in the singular regime where $ε$ is sufficiently small, which remains a significant challenge for most existing methods. Moreover, the DeepWKB method is applicable to higher-dimensional stochastic systems whose deterministic counterparts admit non-trivial attractors. In particular, it provides a scalable and flexible alternative for computing the quasi-potential, which plays a key role in the analysis of rare events, metastability, and the stochastic stability of complex systems.
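To make the WKB form concrete, the following sketch evaluates the ansatz $u_ε(x) = Q(ε)^{-1} Z_ε(x) \exp\{-V(x)/ε\}$ for a toy one-dimensional gradient system $dX = -V'(X)\,dt + \sqrt{2ε}\,dW$, where the quasi-potential coincides with the potential $V$ and $Z_ε$ is constant. This is only an illustration of the WKB structure, not the DeepWKB method itself; the double-well potential `V` and the grid are hypothetical choices.

```python
import numpy as np

def V(x):
    # Hypothetical double-well potential with minima at x = +/-1 and V(+/-1) = -1/4.
    return x**4 / 4 - x**2 / 2

def wkb_density(x, eps):
    # Unnormalized WKB ansatz exp(-V(x)/eps); here Z_eps is constant, so only
    # the normalization factor Q(eps) is needed (approximated by a Riemann sum).
    w = np.exp(-V(x) / eps)
    Q = np.sum(w) * (x[1] - x[0])  # normalization factor Q(eps)
    return w / Q

x = np.linspace(-2.0, 2.0, 4001)
for eps in (0.5, 0.1, 0.02):
    u = wkb_density(x, eps)
    # As eps -> 0, mass concentrates at the attractors x = +/-1 of the
    # deterministic dynamics, and the density at the saddle x = 0 vanishes
    # at the exponential rate exp(-(V(0) - V(1))/eps).
    print(eps, u[np.argmin(np.abs(x))] / u.max())
```

The singular regime is visible directly: the ratio of the density at the unstable point $x=0$ to its maximum decays like $\exp(-1/(4ε))$, which is why direct sampling or grid-based solvers struggle for small $ε$ and why DeepWKB computes $V$ and $Z_ε$ separately instead.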