Gradient Descent with Compressed Iterates
Ahmed Khaled, Peter Richtárik
We propose and analyze a new type of stochastic first-order method: gradient descent with compressed iterates (GDCI). In each iteration, GDCI first compresses the current iterate using a lossy randomized compression technique and then takes a gradient step. This method distills a key ingredient of current federated learning practice, where a model must be compressed by a mobile device before being sent back to a server for aggregation. Our analysis is a step towards closing the gap between the theory and practice of federated learning and opens the door to many extensions.
Sep-10-2019
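The iteration described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact method: the compression operator (unbiased stochastic rounding to a grid) and the choice to evaluate the gradient at the compressed point are assumptions made for the sketch, since the abstract does not specify them.

```python
import numpy as np

def compress(x, delta=0.01, rng=None):
    # Illustrative lossy randomized compressor: unbiased stochastic
    # rounding to a grid of spacing delta, so E[C(x)] = x and the
    # per-coordinate variance is at most delta**2 / 4.
    rng = rng or np.random.default_rng()
    return delta * np.floor(x / delta + rng.random(x.shape))

def gdci(grad, x0, step=0.2, iters=200, delta=0.01, seed=0):
    # GDCI sketch: compress the current iterate, then take a gradient
    # step (here, evaluated at the compressed point -- an assumption).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        cx = compress(x, delta, rng)
        x = cx - step * grad(cx)
    return x

# Toy problem: minimize f(x) = 0.5 * ||x - b||^2, with gradient x - b.
b = np.array([1.0, -2.0, 3.0])
x_star = gdci(lambda x: x - b, np.zeros(3))
```

Because the compressor is unbiased with bounded variance, the iterates contract in expectation toward the minimizer and settle in a small neighborhood of it, which mirrors the kind of guarantee the abstract's analysis is after.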