Derivatives of Stochastic Gradient Descent in parametric optimization
Neural Information Processing Systems
We consider stochastic optimization problems in which the objective depends on a parameter, as arises, for instance, in hyperparameter optimization. We investigate the behavior of the derivatives of the iterates of Stochastic Gradient Descent (SGD) with respect to that parameter and show that they are driven by an inexact SGD recursion on a different objective function, perturbed by the convergence of the original SGD iterates. This enables us to establish that, whenever the objective is strongly convex, the derivatives of SGD converge in mean squared error to the derivative of the solution mapping.
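As a rough illustration of the setting (not the paper's analysis), the sketch below uses JAX to differentiate unrolled SGD iterates with respect to the parameter on a toy strongly convex quadratic. The objective, noise model, step size, and all names are assumptions made only for this example; for this toy problem the solution mapping is x*(θ) = θ, so the Jacobian of the iterates should approach the identity as SGD converges.

```python
import jax
import jax.numpy as jnp


def sgd_last_iterate(theta, x0, key, n_steps=200, step_size=0.1):
    """Run SGD on a toy parametric objective and return the final iterate.

    Toy objective (illustrative assumption): f(x, theta) = 0.5 * ||x - theta||^2,
    observed through a noisy gradient (x - theta) + noise.
    """
    def step(carry, _):
        x, key = carry
        key, sub = jax.random.split(key)
        # Stochastic gradient noise, independent of theta.
        noise = 0.1 * jax.random.normal(sub, x.shape)
        grad = (x - theta) + noise          # noisy gradient of the quadratic
        return (x - step_size * grad, key), None

    (x_final, _), _ = jax.lax.scan(step, (x0, key), None, length=n_steps)
    return x_final


theta = jnp.array([1.0, -2.0])
x0 = jnp.zeros(2)
key = jax.random.PRNGKey(0)

# Jacobian of the SGD iterate with respect to theta, obtained by forward-mode
# differentiation through the unrolled recursion. For this quadratic the
# solution mapping is x*(theta) = theta, so the Jacobian approaches the identity.
jacobian = jax.jacfwd(sgd_last_iterate)(theta, x0, key)
print(jacobian)
```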