Reviews: Tight Bounds for Collaborative PAC Learning via Multiplicative Weights

Neural Information Processing Systems 

This paper nearly settles (effectively settles, for a large range of parameters) a natural question: what is the overhead, as a function of k (the number of underlying distributions), in the sample complexity of collaborative PAC learning compared to vanilla PAC learning? A trivial upper bound on the overhead is a factor of k. Blum et al. established an upper bound of O(log^2 k) on the overhead factor (for k = O(d), where d is the VC dimension of the concept class to be learned), and a lower bound of Omega(log k) (for the specific case k = d). The main contribution of this paper is an O(log k) upper bound on that ratio (again for k = O(d); the general upper bound is slightly more complicated for arbitrary k); the authors also generalize the lower bound to hold for any k = d^{O(1)}. The lower bound is obtained by generalizing and "bootstrapping" the lower bound construction of Blum et al., with some changes to handle the base case.
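
For the reader's convenience, the bounds discussed above can be summarized as follows (a sketch in my own notation, not the authors'; m(epsilon, delta) denotes the vanilla PAC sample complexity for a class of VC dimension d):

```latex
% Overhead factor over the vanilla PAC sample complexity m(\epsilon,\delta),
% as a function of the number of distributions k (regime k = O(d)):
\text{trivial upper bound: } k, \qquad
\text{Blum et al.: } O(\log^2 k), \qquad
\text{this paper: } O(\log k).
% Lower bound: \Omega(\log k), shown by Blum et al. for k = d
% and extended in this paper to any k = d^{O(1)}.
```

Since the upper and lower bounds match up to constants in this regime, the overhead question is settled there.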