Byzantine Stochastic Gradient Descent
Neural Information Processing Systems
This paper studies the problem of distributed stochastic optimization in an adversarial setting where, out of $m$ machines which allegedly compute stochastic gradients every iteration, an $\alpha$-fraction are Byzantine, and may behave adversarially.
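To make the setting concrete, here is a minimal NumPy sketch of Byzantine-corrupted distributed gradient aggregation. Note that the robust aggregator shown (coordinate-wise median) is an illustrative baseline, not the paper's algorithm; the machine counts, noise model, and adversarial behavior are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

m = 10              # total machines (assumed for illustration)
alpha = 0.3         # fraction of Byzantine machines
f = int(alpha * m)  # number of Byzantine machines
true_grad = np.array([1.0, -2.0, 0.5])

# Honest machines report noisy stochastic gradients;
# Byzantine machines may report arbitrary vectors.
honest = true_grad + 0.1 * rng.standard_normal((m - f, 3))
byzantine = 100.0 * rng.standard_normal((f, 3))  # adversarial reports
reports = np.vstack([honest, byzantine])

naive = reports.mean(axis=0)         # averaging is wrecked by outliers
robust = np.median(reports, axis=0)  # coordinate-wise median resists them

print("naive mean error :", np.linalg.norm(naive - true_grad))
print("median error     :", np.linalg.norm(robust - true_grad))
```

With fewer than half the machines Byzantine, the coordinate-wise median stays close to the true gradient while the plain average can be driven arbitrarily far off, which is the failure mode that motivates Byzantine-robust SGD.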