Reviews: Stochastic Optimization with Variance Reduction for Infinite Datasets with Finite Sum Structure
–Neural Information Processing Systems
The paper proposes a method for optimization problems commonly encountered in machine learning. The loss function to minimize is a finite sum of smooth convex functions plus a convex regularization term. The method is designed for the case where random perturbations are introduced in the data. Since data sampling introduces a stochastic component, Stochastic Gradient Descent (SGD) requires modifications to reduce the gradient variance [14, 28]. In the case of perturbed data, this variance is magnified.
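To make the setting concrete, here is a minimal sketch (not the paper's algorithm) of a finite-sum objective with a variance-reduced stochastic method, using an SVRG-style snapshot gradient on a hypothetical ridge-regularized least-squares problem; all names and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: f(x) = (1/n) sum_i 0.5*(a_i^T x - b_i)^2 + 0.5*lam*||x||^2
n, d, lam = 200, 5, 0.1
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d) + 0.01 * rng.normal(size=n)

def full_grad(x):
    # Exact gradient of the full finite-sum objective
    return A.T @ (A @ x - b) / n + lam * x

def stoch_grad(x, i, noise=0.0):
    # Per-example gradient; noise > 0 would model the data
    # perturbations discussed in the paper (illustrative only)
    a = A[i] + noise * rng.normal(size=d)
    return a * (a @ x - b[i]) + lam * x

def svrg(x0, step=0.02, epochs=30):
    # SVRG-style variance reduction: correct each stochastic
    # gradient with a full gradient computed at a snapshot point
    x = x0.copy()
    for _ in range(epochs):
        x_snap, g_snap = x.copy(), full_grad(x)
        for _ in range(n):
            i = rng.integers(n)
            g = stoch_grad(x, i) - stoch_grad(x_snap, i) + g_snap
            x -= step * g
    return x

# Closed-form minimizer for comparison
x_star = np.linalg.solve(A.T @ A / n + lam * np.eye(d), A.T @ b / n)
x_hat = svrg(np.zeros(d))
err = np.linalg.norm(x_hat - x_star)
```

Unlike plain SGD, the snapshot correction drives the gradient variance to zero at the optimum, which is the property that data perturbations break and that the paper's method aims to recover.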