Byzantine Stochastic Gradient Descent

Dan Alistarh

Neural Information Processing Systems 

This paper studies the problem of distributed stochastic optimization in an adversarial setting where, out of m machines which allegedly compute stochastic gradients every iteration, an α-fraction are Byzantine, and may behave adversarially. Our main result is a variant of stochastic gradient descent (SGD) which finds ε-approximate minimizers of convex functions in T = Õ(1/(ε²·m) + α²/ε²) iterations.
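To make the setting concrete, below is a minimal Python sketch of Byzantine-robust distributed SGD: each of m workers reports a stochastic gradient, an α-fraction of which may be arbitrary, and the server aggregates with a coordinate-wise median before taking a gradient step. This is an illustrative stand-in, not the paper's algorithm (whose filtering of reported gradients and resulting guarantees are more refined); the function and parameter names here are hypothetical.

```python
import numpy as np

def byzantine_robust_sgd(grad_oracles, x0, lr=0.1, steps=100):
    """SGD where each oracle in grad_oracles returns one (possibly
    adversarial) stochastic gradient per step. Aggregation uses the
    coordinate-wise median, a simple robust estimator that tolerates
    fewer than half the machines behaving arbitrarily; the paper's
    own aggregation rule is more sophisticated."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        # Collect one reported gradient per machine.
        grads = np.stack([oracle(x) for oracle in grad_oracles])
        # Coordinate-wise median discards extreme adversarial values.
        robust_grad = np.median(grads, axis=0)
        x = x - lr * robust_grad
    return x

# Usage sketch: minimize f(x) = ||x||^2 / 2 with m = 10 workers,
# 2 of which (alpha = 0.2) report adversarial noise instead of gradients.
rng = np.random.default_rng(0)
honest = lambda x: x + rng.normal(scale=0.1, size=x.shape)
byzantine = lambda x: rng.normal(scale=100.0, size=x.shape)
oracles = [honest] * 8 + [byzantine] * 2
x_hat = byzantine_robust_sgd(oracles, x0=np.ones(5))
```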
