Byzantine Stochastic Gradient Descent
Neural Information Processing Systems
This paper studies the problem of distributed stochastic optimization in an adversarial setting where, out of $m$ machines that are supposed to compute stochastic gradients in every iteration, an $\alpha$-fraction are Byzantine and may behave adversarially.
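The setting can be illustrated with a minimal sketch: honest workers return noisy stochastic gradients, while an $\alpha$-fraction return arbitrary vectors. The aggregation rule below (coordinate-wise median) is a generic Byzantine-robust choice used purely for illustration; it is an assumption of this sketch, not the algorithm proposed in the paper.

```python
import numpy as np

def robust_aggregate(gradients):
    """Coordinate-wise median of worker gradients.

    A generic Byzantine-robust aggregation rule, shown only to
    illustrate the setting; the paper's own algorithm is not
    reproduced here.
    """
    return np.median(np.stack(gradients), axis=0)

rng = np.random.default_rng(0)
m, alpha, d = 10, 0.2, 4          # workers, Byzantine fraction, dimension
true_grad = np.ones(d)

# Honest workers return noisy copies of the true gradient.
grads = [true_grad + 0.01 * rng.standard_normal(d) for _ in range(m)]

# An alpha-fraction of workers behave adversarially.
for i in range(int(alpha * m)):
    grads[i] = -100.0 * np.ones(d)

agg = robust_aggregate(grads)
```

With $\alpha m = 2$ Byzantine workers out of $m = 10$, the coordinate-wise median stays close to the true gradient, whereas a plain average would be pulled far off by the adversarial vectors.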