Never Go Full Batch (in Stochastic Convex Optimization)

Neural Information Processing Systems 

We study the generalization performance of full-batch optimization algorithms for stochastic convex optimization: these are first-order methods that access only the exact gradient of the empirical risk (rather than gradients with respect to individual data points), and they include a wide range of algorithms such as gradient descent, mirror descent, and their regularized and/or accelerated variants.
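To make the "full-batch" access model concrete, here is a minimal sketch of full-batch gradient descent on a least-squares empirical risk. The loss, function name, and parameters are illustrative assumptions, not the paper's construction; the point is only that every step queries the exact gradient over the whole sample, never a per-example gradient.

```python
import numpy as np

def full_batch_gd(X, y, steps=100, lr=0.1):
    """Full-batch gradient descent on the least-squares empirical risk.

    Each step uses the exact gradient of the empirical risk over the
    whole sample (X, y) -- never a gradient at an individual data point.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        # Exact empirical-risk gradient: (1/n) * X^T (X w - y)
        grad = X.T @ (X @ w - y) / n
        w -= lr * grad
    return w

# Usage: recover a noisy linear model from 200 samples in 5 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=200)
w_hat = full_batch_gd(X, y, steps=500, lr=0.1)
print(np.linalg.norm(w_hat - w_true))
```

A stochastic (per-example) method would instead compute the gradient at a single sampled data point each step; the abstract's distinction is exactly this difference in what the oracle returns.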
