Reviews: Dimension-Free Iteration Complexity of Finite Sum Optimization Problems

Neural Information Processing Systems 

Technical quality: The proofs derived in the paper are sound and well presented. One of the most interesting contributions is the lower bound for stochastic methods (including Stochastic Gradient Descent), which uses Yao's minimax principle, a neat and simple trick. The paper also provides some new insights.

Novelty/originality: Although the lower bounds derived in this paper are of significant interest, I nevertheless have some concerns with the way the paper is currently written, especially regarding the differences from [5], which are not clearly stated. Although the authors seem to imply that they are the first to derive dimension-free bounds, the work of [5] already established lower bounds that hold independently of the dimension.