
Neural Information Processing Systems 

This paper gives minimax lower bounds on the number of bits, B, that must be communicated among m estimators, each of which has access to n i.i.d. samples, so that the estimation error is on par with the centralized case (with mn samples). This is a very interesting problem of theoretical and practical significance. Given that the minimax rate without communication constraints does not admit a general solution, I do not expect a general solution for the minimal B. As the authors show, the conclusion appears to be problem-specific and can be quite pessimistic, meaning that a large amount of communication overhead is necessary. While the proof of Thm 2 is quite trivial (a vanilla application of Fano's inequality), the proof of Thm 3 is quite interesting: it relies on a type of strong data-processing inequality for mutual information under a bounded-density condition, which is reminiscent of differential privacy and of Thm 1 in J. C. Duchi, M. I. Jordan and M. J. Wainwright (2013). Regarding whether one needs an even bigger B to reach the rate at sample size mn, I wonder whether there is a problem in multi-user information theory that is similar in spirit, where a joint decoder has access to rate-constrained coded messages and wants to recover the original source with some fidelity.
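For concreteness, here is a sketch of the setup and the Fano-type step as I understand them; the notation (the risk functional, the packing index V, the radius delta) is my own and not taken from the paper.

```latex
% Sketch, in my own notation, of the communication-constrained minimax
% problem and the Fano-type argument underlying Thm 2 (not verbatim from
% the paper).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Machine $i \in \{1,\dots,m\}$ holds $n$ i.i.d.\ samples from $P \in \mathcal{P}$
and sends a message $Y_i$ of at most $B_i$ bits; a fusion center outputs
$\hat{\theta}(Y_1,\dots,Y_m)$. The communication-constrained minimax risk is
\[
  \mathfrak{M}(B) \;=\;
  \inf_{\substack{\Pi,\,\hat{\theta}\,:\; \sum_i B_i \le B}}
  \;\sup_{P \in \mathcal{P}}\;
  \mathbb{E}_P \bigl\| \hat{\theta}(Y_1,\dots,Y_m) - \theta(P) \bigr\|_2^2 .
\]
Taking a $2\delta$-separated packing $\{\theta_v : v \in \mathcal{V}\}$ and
applying Fano's inequality to the uniformly drawn index $V$ gives, roughly,
\[
  \mathfrak{M}(B) \;\gtrsim\;
  \delta^2 \Bigl( 1 - \frac{I(V; Y_1,\dots,Y_m) + \log 2}{\log|\mathcal{V}|} \Bigr),
  \qquad
  I(V; Y_1,\dots,Y_m) \;\le\; \sum_i H(Y_i) \;\le\; B \log 2,
\]
so the risk cannot match the centralized rate at sample size $mn$ unless $B$
is large enough to carry the required information about the packing index $V$.

\end{document}
```

The strong data-processing inequality in Thm 3 tightens the bound on I(V; Y_1, ..., Y_m) beyond the raw bit count, which is where the problem-specific (and more pessimistic) conclusions come from.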