A Full Adagrad algorithm with O(Nd) operations
Antoine Godichon-Baggioni, Wei Lu, Bruno Portier
A novel approach is proposed to overcome the computational challenges of the full-matrix Adaptive Gradient algorithm (Full AdaGrad) in stochastic optimization. By developing a recursive method that estimates the inverse of the square root of the covariance of the gradient, together with a streaming variant for the parameter updates, the study provides efficient and practical algorithms for large-scale applications. This strategy reduces the complexity and resource demands typically associated with full-matrix methods, requiring only O(Nd) operations. Moreover, the convergence rates of the proposed estimators and their asymptotic efficiency are established, and their effectiveness is demonstrated through numerical studies.
May 3, 2024
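The abstract describes two coupled recursions: one estimating the inverse square root of the gradient covariance, and one using that estimate to precondition the stochastic gradient step. The sketch below illustrates this idea with a Robbins-Monro-style update targeting the fixed point A Σ A = I (so that A = Σ^{-1/2}); the function name, step-size schedules, and the specific recursion are assumptions for illustration, not the paper's exact estimator. Note this naive version costs O(d²) per step; the paper's streaming variant is what brings the overall cost down to O(Nd).

```python
import numpy as np

def full_adagrad_sketch(grad_fn, theta0, n_steps, lr=0.1, gamma=0.01):
    """Hypothetical sketch of a Full AdaGrad-style method.

    A_n tracks E[g g^T]^{-1/2} via a stochastic recursion toward the
    fixed point A Sigma A = I (an illustrative choice, not necessarily
    the paper's estimator), and preconditions each gradient step.
    """
    d = theta0.size
    theta = theta0.astype(float).copy()
    A = np.eye(d)  # running estimate of the inverse square root
    for n in range(1, n_steps + 1):
        g = grad_fn(theta)
        # Rank-one stochastic update: drives A Sigma A toward I.
        # This step is O(d^2); the paper's streaming variant avoids
        # this cost to reach O(Nd) overall.
        Ag = A @ g
        A += (gamma / n ** 0.75) * (np.eye(d) - np.outer(Ag, Ag))
        # Preconditioned stochastic gradient step.
        theta -= (lr / n ** 0.66) * (A @ g)
    return theta
```

A quick way to exercise the sketch is a noisy quadratic objective, where the preconditioner compensates for the differing curvatures along each coordinate.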