Reviews: A Communication Efficient Stochastic Multi-Block Alternating Direction Method of Multipliers

Neural Information Processing Systems 

This paper proposes a communication-efficient distributed optimization algorithm based on ADMM for stochastic optimization. The main idea is to perform multiple (possibly time-varying) stochastic gradient steps before the agents communicate, thereby improving communication efficiency. The proposed algorithm is shown to converge (in objective value and constraint violation) in the general non-smooth, non-strongly convex setting with O(1/eps) communication rounds and O(1/eps^2) calls to an unbiased gradient oracle. Other settings, such as smooth strongly convex and non-smooth strongly convex, are also analyzed and presented. The use of multiple local steps is, however, not novel.
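To make the reviewed idea concrete, here is a minimal illustrative sketch (not the paper's exact algorithm) of consensus ADMM in which each agent takes several stochastic gradient steps on its local subproblem between communication rounds. The problem instance, step sizes, and noise model are all assumptions chosen for illustration.

```python
import random

def multi_step_stochastic_admm(targets, rounds=50, local_steps=5,
                               lr=0.1, rho=1.0, noise=0.01, seed=0):
    """Illustrative sketch only. Each agent i minimizes
    f_i(x) = 0.5 * (x - a_i)^2 subject to consensus x_i = z.
    Between communication rounds, each agent takes `local_steps`
    stochastic gradient steps on its augmented-Lagrangian subproblem,
    so communication happens once per `local_steps` gradient calls."""
    rng = random.Random(seed)
    n = len(targets)
    x = [0.0] * n      # local primal variables
    y = [0.0] * n      # dual variables
    z = 0.0            # global consensus variable
    for _ in range(rounds):
        for i in range(n):
            for _ in range(local_steps):
                # gradient of the local augmented Lagrangian,
                # perturbed by unbiased (zero-mean) noise
                g = (x[i] - targets[i]) + y[i] + rho * (x[i] - z)
                g += rng.gauss(0.0, noise)
                x[i] -= lr * g
        # one communication round: averaging, then dual updates
        z = sum(x[i] + y[i] / rho for i in range(n)) / n
        for i in range(n):
            y[i] += rho * (x[i] - z)
    return z

# z should approach the average of the targets (the global minimizer)
print(multi_step_stochastic_admm([1.0, 2.0, 3.0]))
```

In this toy problem the global minimizer is the mean of the targets, so the returned z should be close to 2.0; the communication saving comes from performing `local_steps` gradient updates per round of averaging.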