Efficient Sign-Based Optimization: Accelerating Convergence via Variance Reduction

Wei Jiang

Neural Information Processing Systems 

Sign stochastic gradient descent (signSGD) is a communication-efficient method that transmits only the sign of each stochastic gradient coordinate for parameter updates.
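For concreteness, the following is a minimal sketch of the vanilla signSGD update described above, written in NumPy; the function name, step size, and toy objective are illustrative assumptions and do not depict the paper's variance-reduced method.

```python
import numpy as np

def sign_sgd_step(params, stoch_grad, lr=0.01):
    """Update parameters using only the sign of the stochastic gradient
    (vanilla signSGD; names and defaults here are illustrative)."""
    return params - lr * np.sign(stoch_grad)

# Toy usage: one update on f(x) = ||x||^2 / 2 with a noisy gradient.
rng = np.random.default_rng(0)
x = rng.normal(size=5)
g = x + 0.1 * rng.normal(size=5)   # stochastic gradient of f at x
x = sign_sgd_step(x, g, lr=0.01)
```

In a distributed setting, only the sign vector (one bit per coordinate) would be communicated, which is the source of the method's communication efficiency.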