Stochastic Modified Flows for Riemannian Stochastic Gradient Descent
Gess, Benjamin, Kassing, Sebastian, Rana, Nimit
arXiv.org Artificial Intelligence
We give quantitative estimates for the rate of convergence of Riemannian stochastic gradient descent (RSGD) to Riemannian gradient flow and to a diffusion process, the so-called Riemannian stochastic modified flow (RSMF). Using tools from stochastic differential geometry, we show that, in the small learning rate regime, RSGD can be approximated by the solution to the RSMF driven by an infinite-dimensional Wiener process. The RSMF accounts for the random fluctuations of RSGD and thereby increases the order of approximation compared to the deterministic Riemannian gradient flow. The RSGD is built using the concept of a retraction map, that is, a cost-efficient approximation of the exponential map, and we prove quantitative bounds for the weak error of the diffusion approximation under assumptions on the retraction map, the geometry of the manifold, and the random estimators of the gradient.
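To make the scheme concrete, here is a minimal sketch of retraction-based RSGD on the unit sphere, one of the simplest Riemannian manifolds. The metric projection retraction used below (normalize `x + v` back onto the sphere) and the toy eigenvector objective are illustrative assumptions, not the paper's setting; the paper's analysis covers general retractions and noisy gradient estimators of which this is one concrete instance.

```python
import numpy as np

def sphere_retraction(x, v):
    # Metric projection retraction on S^{d-1}: map x + v back to the sphere.
    # A cost-efficient first-order approximation of the exponential map.
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_sgd(x0, grad_estimator, lr, steps, rng):
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = grad_estimator(x, rng)        # noisy Euclidean gradient estimate
        # Riemannian gradient: project onto the tangent space T_x S^{d-1}
        # by removing the radial component.
        g_tan = g - np.dot(g, x) * x
        x = sphere_retraction(x, -lr * g_tan)
    return x

# Toy example (assumed for illustration): minimize f(x) = -x^T A x on the
# sphere, whose minimizer is the leading eigenvector of the symmetric matrix A.
rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))
A = (A + A.T) / 2

def grad_estimator(x, rng):
    # Exact gradient of f plus injected noise, mimicking a stochastic estimator.
    return -2 * A @ x + 0.1 * rng.standard_normal(d)

x = riemannian_sgd(rng.standard_normal(d), grad_estimator,
                   lr=0.05, steps=2000, rng=rng)
```

In the small learning rate regime analyzed in the paper, the iterates of such a scheme track the Riemannian gradient flow to first order, while the RSMF diffusion additionally captures the fluctuations induced by the noisy gradient estimates.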
Feb-2-2024