Distributed Stochastic Zeroth-Order Optimization with Compressed Communication
Youqing Hua, Shuai Liu, Yiguang Hong, Wei Ren
The dual challenges of prohibitive communication overhead and the impracticality of gradient computation, due to data privacy or black-box constraints in distributed systems, motivate this work on communication-constrained gradient-free optimization. We propose a stochastic distributed zeroth-order algorithm (Com-DSZO) that requires only two function evaluations per iteration and integrates with general compression operators. Rigorous analysis establishes its sublinear convergence rate for both smooth and nonsmooth objectives, while explicitly characterizing the compression-convergence trade-off. Furthermore, we develop a variance-reduced variant (VR-Com-DSZO) under stochastic mini-batch feedback. The empirical performance of both algorithms is illustrated with numerical examples.
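To make the two ingredients named in the abstract concrete, here is a minimal sketch of a two-point zeroth-order gradient estimate combined with a compression operator. The abstract does not specify the paper's exact estimator or compressor, so the Gaussian-direction estimator, the top-k sparsifier, the step size, and all names (two_point_grad_estimate, top_k, mu) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def two_point_grad_estimate(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order estimate along a random Gaussian
    direction: costs exactly two function evaluations per call."""
    rng = rng or np.random.default_rng()
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u

def top_k(v, k):
    """Top-k sparsification: one standard example of a contractive
    compression operator (keeps the k largest-magnitude entries)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

# Toy single-agent usage on a quadratic; in the distributed setting
# only the compressed vector would be communicated between agents.
f = lambda x: np.sum(x ** 2)
x = np.ones(10)
for _ in range(200):
    g = two_point_grad_estimate(f, x)
    x -= 0.05 * top_k(g, k=3)
```

The trade-off the abstract refers to is visible even in this sketch: a more aggressive compressor (smaller k) reduces communication per iteration but injects more error into each update, slowing convergence.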
arXiv.org Artificial Intelligence
Mar-21-2025