Byzantine-Resilient Zero-Order Optimization for Communication-Efficient Heterogeneous Federated Learning

Egger, Maximilian, Bakshi, Mayank, Bitar, Rawad

arXiv.org Machine Learning

We introduce CyBeR-0, a Byzantine-resilient federated zero-order optimization method that is robust to Byzantine attacks and substantially reduces uplink and downlink communication costs. To obtain convergence guarantees for general non-convex objectives under client data heterogeneity, we introduce transformed robust aggregation. Empirical evaluations on standard learning tasks and on fine-tuning large language models show that CyBeR-0 achieves stable performance with a per-round communication cost of only a few scalars and reduced memory requirements.
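The abstract does not spell out the mechanics of transformed robust aggregation, but the "few scalars per round" claim follows a known pattern in federated zeroth-order methods: if the random perturbation direction is derived from shared randomness, each client only uplinks a scalar finite difference, and the server can aggregate those scalars robustly. The following is a minimal illustrative sketch of that pattern, using a plain median as a generic robust aggregator; the function names, the quadratic client objectives, and the median aggregator are all assumptions for illustration, not CyBeR-0 itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def robust_zo_round(x, client_losses, byzantine_scalars, mu=1e-3, lr=0.1):
    """One federated zeroth-order round with a median aggregator (sketch).

    With the direction u derived from shared randomness, each honest client
    uplinks a single scalar finite difference -- the source of the
    few-scalars-per-round communication cost mentioned in the abstract.
    """
    u = rng.standard_normal(x.shape)
    # Honest clients: two-point zeroth-order estimate of the directional
    # derivative of their local loss along u.
    scalars = [(f(x + mu * u) - f(x - mu * u)) / (2 * mu) for f in client_losses]
    # Byzantine clients may report arbitrary values.
    scalars += list(byzantine_scalars)
    # Generic robust aggregation of the reported scalars (a stand-in for the
    # paper's transformed robust aggregation, whose details are not given here).
    g = np.median(scalars)
    return x - lr * g * u

# Heterogeneous clients: each minimizes ||x - c_i||^2 around its own optimum c_i.
optima = rng.standard_normal((10, 5))
client_losses = [lambda x, c=c: np.sum((x - c) ** 2) for c in optima]
c_mean = optima.mean(axis=0)

x = np.full(5, 5.0)
start_err = np.linalg.norm(x - c_mean)
for _ in range(300):
    # Three Byzantine clients send extreme scalars each round.
    x = robust_zo_round(x, client_losses, byzantine_scalars=[1e6, -1e6, 1e6])
end_err = np.linalg.norm(x - c_mean)
```

Because the median of thirteen reports ignores the three extreme values as long as the honest majority holds, the iterate still drifts toward the region around the clients' optima despite the attack, while each honest client communicated only one scalar per round.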