Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis
Neural Information Processing Systems
We study finite-sum distributed optimization problems involving a master node and n-1 local nodes under the popular \delta-similarity and \mu-strong convexity conditions. We propose two new algorithms, SVRS and AccSVRS, motivated by previous works. The non-accelerated SVRS method combines the techniques of gradient sliding and variance reduction and achieves a better communication complexity of \tilde{\mathcal{O}}(n + \sqrt{n}\delta/\mu) compared to existing non-accelerated algorithms. In contrast to existing results, our complexity bounds are entirely smoothness-free and exhibit superiority in ill-conditioned cases. Furthermore, we establish a nearly matched lower bound to verify the tightness of our AccSVRS method.
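The variance-reduction component named in the abstract can be illustrated with a generic SVRG-style estimator on a toy finite-sum least-squares problem. This is a minimal sketch of the general technique only, not the paper's SVRS method; all names (A, b, step, the snapshot schedule) are illustrative assumptions.

```python
import numpy as np

# Toy finite-sum problem: min_x (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star

def grad_i(x, i):
    # Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2.
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
step = 0.05  # illustrative step size, not tuned from the paper
for epoch in range(30):
    snap = x.copy()            # snapshot point for this epoch
    g_snap = full_grad(snap)   # one full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced gradient estimator: unbiased, and its variance
        # shrinks as x approaches the snapshot point.
        v = grad_i(x, i) - grad_i(snap, i) + g_snap
        x -= step * v

print(float(np.linalg.norm(x - x_star)))  # distance to the true solution
```

In a distributed setting, the appeal of such estimators is that most inner steps need only a single component gradient, which is what allows communication-efficient schemes to trade full-gradient rounds for cheap stochastic ones.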