Sample complexity of Schrödinger potential estimation

Nikita Puchkin, Iurii Pustovalov, Yuri Sapronov, Denis Suchkov, Alexey Naumov, Denis Belomestny

arXiv.org Machine Learning 

We address the problem of Schrödinger potential estimation, which plays a crucial role in modern generative modelling approaches based on Schrödinger bridges and stochastic optimal control for SDEs. Given a simple prior diffusion process, these methods seek a path between two given distributions $\rho_0$ and $\rho_T^*$ that requires minimal effort. The optimal drift in this case can be expressed through a Schrödinger potential. In the present paper, we study the generalization ability of an empirical Kullback-Leibler (KL) risk minimizer over a class of admissible log-potentials aimed at fitting the marginal distribution at time $T$. Under reasonable assumptions on the target distribution $\rho_T^*$ and the prior process, we derive a non-asymptotic high-probability upper bound on the KL-divergence between $\rho_T^*$ and the terminal density corresponding to the estimated log-potential. In particular, we show that the excess KL-risk may decrease as fast as $O(\log^2 n / n)$ as the sample size $n$ tends to infinity, even if both $\rho_0$ and $\rho_T^*$ have unbounded supports.
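The estimator studied in the abstract minimizes an empirical KL risk over a class of log-potentials $g$, where the candidate terminal density is the prior marginal tilted by $e^{g}$. The following toy sketch (not the paper's algorithm; the Gaussian prior, the quadratic parametric class $g_\theta(x) = \theta_1 x + \theta_2 x^2$, and all numerical choices are illustrative assumptions) shows the idea in one dimension: minimizing the empirical negative log-likelihood $-\frac{1}{n}\sum_i g_\theta(X_i) + \log Z(\theta)$, with the normalizer $Z(\theta)$ estimated by Monte Carlo over prior samples.

```python
# Hypothetical 1-D sketch of empirical KL risk minimization over a toy class
# of log-potentials g_theta(x) = theta1*x + theta2*x^2. The candidate terminal
# density is rho_T(x) ∝ p_T(x) * exp(g_theta(x)), where p_T is the prior
# marginal at time T (taken to be N(0, 1) here purely for illustration).
import numpy as np

rng = np.random.default_rng(0)

# Samples from the (unknown) target rho_T^*: a stand-in N(1, 0.5^2).
X = rng.normal(1.0, 0.5, size=2000)
# Samples from the prior marginal p_T, used to estimate the normalizer Z(theta).
prior_samples = rng.normal(0.0, 1.0, size=20000)

def neg_loglik(theta):
    """Empirical KL risk up to a theta-independent constant:
    -(1/n) sum_i g_theta(X_i) + log Z(theta),
    where Z(theta) = E_{p_T}[exp(g_theta)] is a Monte Carlo estimate."""
    g_data = theta[0] * X + theta[1] * X**2
    g_prior = theta[0] * prior_samples + theta[1] * prior_samples**2
    m = g_prior.max()  # log-mean-exp for numerical stability
    log_Z = m + np.log(np.mean(np.exp(g_prior - m)))
    return -g_data.mean() + log_Z

def grad(theta, eps=1e-5):
    """Central finite-difference gradient; sufficient for two parameters."""
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (neg_loglik(theta + e) - neg_loglik(theta - e)) / (2 * eps)
    return g

# Plain gradient descent on the (convex) empirical risk.
theta = np.zeros(2)
for _ in range(3000):
    theta -= 0.2 * grad(theta)
```

At the minimizer, the first-order condition says the tilted prior matches the empirical feature moments of the data, which is a convenient correctness check; for this target the population-optimal potential is $g(x) = 4x - 1.5x^2$, since $\log(\rho_T^*/p_T)$ is quadratic when both densities are Gaussian.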
