Review for NeurIPS paper: A Computational Separation between Private Learning and Online Learning
Neural Information Processing Systems
Summary and Contributions: This work exhibits a concept class that is privately PAC learnable in polynomial time but not efficiently learnable in the online setting (assuming the existence of one-way functions), i.e., no polynomial-time online learner achieves a polynomial mistake bound on it. A recent line of work, focused on sample complexity, has established various equivalences between private PAC learning and online learning; this result addresses the question of computational efficiency, proving that efficient private learnability does not imply efficient online learnability if one-way functions exist. The cryptographic assumption is standard for such results and is needed when ruling out general polynomial-time learners. The class of functions that cannot be efficiently learned in the online model comes from [Blum 1994], which separated distribution-free PAC learning from online learning. Much of the work toward separating the two models, namely the cryptographic construction and the efficient (non-private) PAC algorithm, was done there; the technical contribution here is a private version of that algorithm.
Feb-7-2025, 22:37:32 GMT