Never Go Full Batch (in Stochastic Convex Optimization)
Neural Information Processing Systems
We study the generalization performance of full-batch optimization algorithms for stochastic convex optimization: these are first-order methods that access only the exact gradient of the empirical risk (rather than gradients with respect to individual data points). This class includes a wide range of algorithms such as gradient descent, mirror descent, and their regularized and/or accelerated variants.
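To make the distinction concrete, here is a minimal sketch (not from the paper) of the access model being studied: a full-batch method queries only the exact gradient of the empirical risk, averaged over all data points, never a per-sample gradient. The toy least-squares problem and all names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic convex problem: least-squares empirical risk over n samples.
n, d = 100, 5
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

def empirical_risk_grad(w):
    # Full-batch oracle: the exact gradient of the empirical risk,
    # averaging over ALL n data points (no individual-sample access).
    return X.T @ (X @ w - y) / n

# Plain full-batch gradient descent, one member of the class studied.
w = np.zeros(d)
lr = 0.1
for _ in range(500):
    w -= lr * empirical_risk_grad(w)
```

A stochastic method such as SGD would instead query the gradient of a single term `(X[i] @ w - y[i])` per step; the paper's question is how restricting to the full-batch oracle above affects generalization.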