On the Convergence to a Global Solution of Shuffling-Type Gradient Algorithms
Lam M. Nguyen
– Neural Information Processing Systems
The stochastic gradient descent (SGD) algorithm is the method of choice in many machine learning tasks thanks to its scalability and efficiency on large-scale problems. In this paper, we focus on the shuffling version of SGD, which matches mainstream practical heuristics. We show convergence to a global solution of shuffling SGD for a class of non-convex functions under over-parameterized settings.
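To make the object of study concrete, here is a minimal sketch of a shuffling-type SGD loop (random reshuffling: each epoch visits every sample exactly once in a freshly shuffled order). The function names, the toy least-squares problem, and all hyperparameters are illustrative assumptions, not taken from the paper; the consistent linear system mimics the over-parameterized (interpolation) regime in which a global minimizer fits every sample exactly.

```python
import numpy as np

def shuffling_sgd(grad_i, w0, n, lr=0.1, epochs=200, seed=0):
    """Shuffling-type SGD (random reshuffling): each epoch draws a fresh
    permutation of the n samples and takes one step per component gradient."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float)
    for _ in range(epochs):
        perm = rng.permutation(n)          # new shuffle every epoch
        for i in perm:
            w = w - lr * grad_i(w, i)      # step on the i-th component loss
    return w

# Toy interpolation-regime problem: f(w) = (1/n) * sum_i (a_i . w - b_i)^2,
# where the linear system a_i . w = b_i is consistent, so the global minimum is 0.
a = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])              # solved exactly by w* = (1, 2)
grad = lambda w, i: 2.0 * (a[i] @ w - b[i]) * a[i]

w_star = shuffling_sgd(grad, np.zeros(2), n=len(b))
```

Because every component loss is minimized at the same point, the per-sample gradients vanish at the interpolating solution, and the iterates converge to it without decaying the step size; this is the kind of behavior the over-parameterized analysis formalizes for a class of non-convex objectives.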