Sharp Analysis of Stochastic Optimization under Global Kurdyka-Łojasiewicz Inequality
Neural Information Processing Systems
We study the complexity of finding the global solution to stochastic nonconvex optimization when the objective function satisfies the global Kurdyka-Łojasiewicz (KŁ) inequality and the queries from stochastic gradient oracles satisfy a mild expected smoothness assumption. We first introduce a general framework to analyze Stochastic Gradient Descent (SGD) and its associated nonlinear dynamics under this setting.
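As a concrete illustration of the setting, the sketch below runs plain SGD on a toy one-dimensional objective that satisfies a KŁ-type inequality, with a stochastic gradient oracle that returns the true gradient plus bounded zero-mean noise. This is a minimal sketch for intuition only; the objective, noise model, and step size are illustrative assumptions, not the paper's analysis.

```python
import random

def sgd(grad_oracle, x0, steps, lr):
    # Plain SGD iteration: x_{t+1} = x_t - lr * g_t,
    # where g_t is a stochastic gradient returned by the oracle.
    x = x0
    for _ in range(steps):
        x = x - lr * grad_oracle(x)
    return x

# Toy objective f(x) = x^2, which satisfies a (global) KL inequality;
# the oracle adds bounded uniform noise to the exact gradient 2x.
def grad_oracle(x, noise=0.1):
    return 2.0 * x + random.uniform(-noise, noise)

random.seed(0)  # for reproducibility of this sketch
x_final = sgd(grad_oracle, x0=5.0, steps=2000, lr=0.01)
```

With this step size the deterministic part contracts geometrically, so the iterate ends up in a small noise-dominated neighborhood of the global minimizer at 0.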