Pointwise Bounds for Distribution Estimation under Communication Constraints

Neural Information Processing Systems 

We consider the problem of estimating a $d$-dimensional discrete distribution from its samples observed under a $b$-bit communication constraint. In contrast to most previous results that largely focus on the global minimax error, we study the local behavior of the estimation error and provide \emph{pointwise} bounds that depend on the target distribution $p$. In particular, we show that the $\ell_2$ error decays with $O\left(\frac{\lVert p\rVert_{1/2}}{n2^b}\vee \frac{1}{n}\right)$ when $n$ is sufficiently large, hence it is governed by the \emph{half-norm} of $p$ instead of the ambient dimension $d$. For the achievability result, we propose a two-round sequentially interactive estimation scheme that achieves this error rate uniformly over all $p$. Our scheme is based on a novel local refinement idea, where we first use a standard global minimax scheme to localize $p$ and then use the remaining samples to locally refine our estimate. We also develop a new local minimax lower bound with (almost) matching $\ell_2$ error, showing that any interactive scheme must admit an $\Omega\left(\frac{\lVert p \rVert_{(1+\delta)/2}}{n2^b}\right)$ $\ell_2$ error for any $\delta > 0$ when $n$ is sufficiently large.
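The quantity driving the pointwise rate is the half-norm $\lVert p\rVert_{1/2} = \left(\sum_i \sqrt{p_i}\right)^2$. A minimal sketch of this quantity (the function name `half_norm` is ours, not from the paper) shows why the pointwise bound $\lVert p\rVert_{1/2}/(n2^b)$ can be much smaller than the global minimax rate $d/(n2^b)$: the half-norm equals $d$ for the uniform distribution but is far smaller for concentrated distributions.

```python
def half_norm(p, q=0.5):
    """Quasi-norm ||p||_q = (sum_i p_i^q)^(1/q); q = 1/2 gives the half-norm."""
    return sum(pi ** q for pi in p) ** (1.0 / q)

d = 8

# Uniform distribution over d symbols: ||p||_{1/2} = d (up to floating point),
# so the pointwise bound recovers the worst-case rate d / (n 2^b).
uniform = [1.0 / d] * d
print(half_norm(uniform))  # ≈ 8.0

# A concentrated distribution has a much smaller half-norm, so the pointwise
# bound ||p||_{1/2} / (n 2^b) ∨ 1/n improves substantially on d / (n 2^b).
spiky = [0.9] + [0.1 / (d - 1)] * (d - 1)
print(half_norm(spiky))  # strictly less than d
```

Since $\lVert p\rVert_{1/2} \le d$ always, with equality only at the uniform distribution, the pointwise bound is never worse than the global minimax rate.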