Accurate Adaptive Optimization with Low Space Overhead and Provable Convergence
Neural Information Processing Systems
We achieve this by compressing the gradient information before it is fed into the optimizer state, thereby significantly reducing its memory footprint. We control the resulting compression error via a novel instance of the classical error feedback mechanism from distributed optimization [Seide et al., 2014, Alistarh et al., 2018, Karimireddy et al., 2019], in which the error correction information is itself compressed to allow for practical memory gains. We prove that the resulting approach maintains theoretical convergence guarantees competitive with those of AMSGrad, while providing good practical performance.
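The mechanism described above — compressing the gradient before it enters the optimizer state, and keeping a compressed error-feedback buffer that re-injects what was lost — can be illustrated with a minimal NumPy sketch. This is a hypothetical toy, not the paper's actual algorithm: the compressors (Top-K sparsification for the gradient, low-bit uniform quantization for the error buffer), the names `topk_compress` and `quantize`, and all parameter values are illustrative assumptions.

```python
import numpy as np

def topk_compress(x, k):
    # Keep the k largest-magnitude entries of x; zero the rest.
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out = np.zeros_like(x)
    out[idx] = x[idx]
    return out

def quantize(x, bits=4):
    # Uniform symmetric quantization: a stand-in for storing the
    # error-correction buffer itself in compressed (low-bit) form.
    scale = np.max(np.abs(x)) / (2 ** (bits - 1) - 1)
    if scale == 0:
        return x.copy()
    return np.round(x / scale) * scale

rng = np.random.default_rng(0)
g_err = np.zeros(8)  # error-feedback accumulator (kept compressed)
for step in range(3):
    g = rng.standard_normal(8)             # raw gradient
    corrected = g + g_err                  # re-inject previous compression error
    g_hat = topk_compress(corrected, k=2)  # compressed gradient -> optimizer state
    g_err = quantize(corrected - g_hat)    # residual error, itself compressed
```

Only the sparse `g_hat` would feed the (Adam-style) optimizer moments, and only the low-bit `g_err` persists between steps — which is where the memory saving over storing dense full-precision state would come from under these assumptions.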
May-28-2025, 06:02:31 GMT