Convergence rates for momentum stochastic gradient descent with noise of machine learning type
Gess, Benjamin; Kassing, Sebastian
arXiv.org Artificial Intelligence
We consider the momentum stochastic gradient descent scheme (MSGD) and its continuous-in-time counterpart in the context of non-convex optimization. We show almost sure exponential convergence of the objective function value for target functions that are Lipschitz continuous and satisfy the Polyak–Łojasiewicz inequality on the relevant domain, and under assumptions on the stochastic noise that are motivated by overparameterized supervised learning applications.
Feb-7-2023
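
To make the setting concrete: a function f with minimum value f* satisfies a Polyak–Łojasiewicz (PL) inequality with constant mu > 0 if (1/2)*||grad f(x)||^2 >= mu*(f(x) - f*) on the relevant domain. The sketch below illustrates the MSGD recursion on a toy overparameterized least-squares problem, where interpolation is possible and the stochastic gradient noise vanishes at the minimizers ("noise of machine learning type"). The hyperparameter values and the toy objective are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Toy overparameterized least-squares problem: more parameters (d) than data
# points (n), so interpolation is possible, the minimal loss is 0, and the
# loss satisfies a Polyak-Lojasiewicz inequality on bounded sets.
n, d = 20, 50
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def loss(theta):
    return 0.5 * np.sum((A @ theta - b) ** 2)

def grad_estimate(theta):
    # Unbiased single-sample stochastic gradient: with i drawn uniformly,
    # E[ n * a_i * (a_i . theta - b_i) ] = A^T (A theta - b).
    i = rng.integers(n)
    return n * (A[i] @ theta - b[i]) * A[i]

theta = np.zeros(d)
v = np.zeros(d)          # momentum (velocity) variable
eta, beta = 2e-4, 0.9    # step size and momentum parameter (illustrative values)

# MSGD (heavy-ball) recursion: v_{k+1} = beta * v_k - eta * g_k,
#                              theta_{k+1} = theta_k + v_{k+1}.
for step in range(50000):
    v = beta * v - eta * grad_estimate(theta)
    theta = theta + v

print("final loss:", loss(theta))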