Machine Learning with Adversaries: Byzantine Tolerant Gradient Descent

Peva Blanchard, El Mahdi El Mhamdi, Rachid Guerraoui, Julien Stainer

Neural Information Processing Systems

We study the resilience to Byzantine failures of distributed implementations of Stochastic Gradient Descent (SGD). So far, distributed machine learning frameworks have largely ignored the possibility of failures, especially arbitrary (i.e., Byzantine) ones.
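The paper's proposed defense is the Krum aggregation rule: instead of averaging all workers' gradients (where a single Byzantine worker can pull the average arbitrarily far), the server scores each submitted gradient by its summed squared distance to its n − f − 2 nearest neighbours and keeps only the best-scoring one. A minimal sketch of this rule, assuming n gradients of which at most f are Byzantine:

```python
import numpy as np

def krum(gradients, f):
    """Krum rule sketch: score each candidate gradient by the sum of
    squared distances to its n - f - 2 closest peers, then return the
    candidate with the smallest score. Honest gradients cluster
    together, so outliers sent by Byzantine workers score poorly."""
    n = len(gradients)
    k = n - f - 2  # number of closest neighbours counted per candidate
    assert k >= 1, "Krum needs n > f + 2"
    scores = []
    for i, g in enumerate(gradients):
        dists = sorted(
            float(np.sum((g - h) ** 2))
            for j, h in enumerate(gradients) if j != i
        )
        scores.append(sum(dists[:k]))
    return gradients[int(np.argmin(scores))]

# Three honest workers agree; one Byzantine worker sends garbage.
grads = [np.array([1.0, 1.0]), np.array([1.1, 0.9]),
         np.array([0.9, 1.1]), np.array([100.0, -100.0])]
chosen = krum(grads, f=1)  # an honest gradient, not the outlier
```

A plain average of the four gradients above would be dominated by the Byzantine value; Krum discards it at the cost of using a single worker's gradient per step.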