Implicit Regularization or Implicit Conditioning? Exact Risk Trajectories of SGD in High Dimensions
Neural Information Processing Systems
Stochastic gradient descent (SGD) is a pillar of modern machine learning, serving as the go-to optimization algorithm for a diverse array of problems. While the empirical success of SGD is often attributed to its computational efficiency and favorable generalization behavior, neither effect is well understood and disentangling them remains an open problem.
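To make the object of study concrete, here is a minimal sketch of SGD on a high-dimensional least-squares problem, the kind of setting in which risk trajectories are typically analyzed. All specifics (dimensions, step size, the least-squares loss) are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 200                          # illustrative sample size and dimension
X = rng.standard_normal((n, d)) / np.sqrt(d)   # rows have norm ~ 1
w_star = rng.standard_normal(d)          # ground-truth weights (assumed)
y = X @ w_star                           # noiseless targets for simplicity

def risk(w):
    """Empirical least-squares risk 0.5 * E[(x·w - y)^2]."""
    return 0.5 * np.mean((X @ w - y) ** 2)

w = np.zeros(d)
lr = 0.5                                 # illustrative constant step size
for _ in range(2000):
    i = rng.integers(n)                  # sample one data point uniformly
    grad = (X[i] @ w - y[i]) * X[i]      # stochastic gradient of the loss at (x_i, y_i)
    w -= lr * grad                       # SGD update
```

Tracking `risk(w)` over iterations traces out the (random) risk trajectory whose exact high-dimensional limit the paper characterizes.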