Solving high-dimensional parabolic PDEs using the tensor train format
Lorenz Richter, Leon Sallandt, Nikolas Nüsken
High-dimensional partial differential equations (PDEs) are ubiquitous in economics, science and engineering. However, their numerical treatment poses formidable challenges since traditional grid-based methods tend to be frustrated by the curse of dimensionality. In this paper, we argue that tensor trains provide an appealing approximation framework for parabolic PDEs: the combination of reformulations in terms of backward stochastic differential equations and regression-type methods in the tensor format holds the promise of leveraging latent low-rank structures.

Many of the suggested algorithms perform remarkably well in practice, and some theoretical results proving beneficial approximation properties of neural networks in the PDE setting are now available (Jentzen et al., 2018). Still, a complete picture remains elusive, and the optimization aspect in particular continues to pose challenging and mostly open problems, both in terms of efficient implementations and theoretical understanding. Most importantly for practical applications, neural network training using gradient descent type schemes may often take a very long time to converge for complicated PDE problems.
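To illustrate the tensor train format referred to above, the following minimal sketch (not code from the paper; all names and parameters are hypothetical) evaluates a single entry of a d-dimensional tensor stored as a chain of small three-way cores, showing why storage and evaluation cost scale linearly in the dimension rather than exponentially.

```python
import numpy as np

# Hypothetical illustration: a d-dimensional tensor in tensor train (TT) format.
# Core G_k has shape (r_{k-1}, n_k, r_k) with boundary ranks r_0 = r_d = 1, so the
# number of parameters grows linearly in d instead of as n**d.

def tt_random_cores(d, n, r, seed=0):
    """Create random TT cores for a d-dimensional tensor with mode size n and rank r."""
    rng = np.random.default_rng(seed)
    ranks = [1] + [r] * (d - 1) + [1]
    return [rng.standard_normal((ranks[k], n, ranks[k + 1])) for k in range(d)]

def tt_entry(cores, idx):
    """Evaluate the TT tensor at a multi-index idx = (i_1, ..., i_d)."""
    v = np.ones((1, 1))
    for core, i in zip(cores, idx):
        v = v @ core[:, i, :]  # chain of small matrix products, cost O(d * r**2)
    return v.item()

if __name__ == "__main__":
    d, n, r = 100, 10, 3  # the full tensor would have 10**100 entries
    cores = tt_random_cores(d, n, r)
    idx = tuple(np.random.default_rng(1).integers(0, n, size=d))
    print("entry value:", tt_entry(cores, idx))
    print("TT parameters:", sum(c.size for c in cores))  # linear in d
```

In the regression-type schemes alluded to in the abstract, it is such low-parameter TT representations (rather than neural networks) that are fitted to samples of the backward stochastic differential equation at each time step.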
Feb-23-2021