Review for NeurIPS paper: Online Convex Optimization Over Erdos-Renyi Random Networks


The paper considers distributed online convex optimization over Erdos-Renyi random graphs and analyzes the online gradient descent algorithm in the full-information setting as well as under one-point and two-point bandit query models. Both convex Lipschitz and strongly convex losses are considered. In the full-information case, the regret bound is shown to match the rates of classical online gradient descent in terms of the horizon T. For two-point bandit queries, the rates likewise match the optimal rates of the classical setting. For one-point queries, possibly suboptimal rates of O(T^{3/4}) and O(T^{2/3}) are shown for the Lipschitz and strongly convex settings, respectively. Overall the reviewers found the work interesting.
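For concreteness, here is a minimal sketch of the full-information setting being reviewed: distributed online gradient descent where each node mixes with neighbors over a freshly sampled Erdos-Renyi graph each round, then takes a projected gradient step with an O(1/sqrt(t)) step size. The quadratic losses, lazy Metropolis mixing rule, and step-size constants are my own illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def erdos_renyi_weights(n, p, rng):
    """Sample G(n, p) and build a symmetric, doubly stochastic mixing
    matrix via the lazy Metropolis rule (an assumed choice)."""
    A = rng.random((n, n)) < p
    A = np.triu(A, 1)
    A = A | A.T                            # symmetric adjacency, no self-loops
    deg = A.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                W[i, j] = 1.0 / (2.0 * max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()         # each row sums to one
    return W

def distributed_ogd(targets, n=8, p=0.5, radius=1.0, seed=0):
    """Full-information distributed OGD for losses f_t(x) = ||x - z_t||^2:
    one consensus-plus-gradient step per round over a fresh random graph."""
    rng = np.random.default_rng(seed)
    T, d = targets.shape
    x = rng.normal(size=(n, d))            # one local iterate per node
    cum_loss = 0.0
    for t in range(T):
        W = erdos_renyi_weights(n, p, rng)  # graph redrawn every round
        z = targets[t]
        cum_loss += np.sum((x - z) ** 2)    # loss incurred by all nodes
        eta = 1.0 / (2.0 * np.sqrt(t + 1))  # O(1/sqrt(t)) step, convex case
        x = W @ x - eta * 2.0 * (x - z)     # average neighbors, then descend
        norms = np.linalg.norm(x, axis=1, keepdims=True)
        x = np.where(norms > radius, radius * x / norms, x)  # project to ball
    best = targets.mean(axis=0)             # best fixed comparator in the ball
    if np.linalg.norm(best) > radius:
        best = radius * best / np.linalg.norm(best)
    regret = cum_loss - n * np.sum((targets - best) ** 2)
    return x, regret
```

As a sanity check, running this with a constant target inside the ball drives every node's iterate to that target, and the cumulative regret against the best fixed point stays bounded as T grows, consistent with the sublinear-regret claims summarized above.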