User-Level Differential Privacy With Few Examples Per User
STOC 2023] obtained generic algorithms that work for various learning tasks. However, their focus was on the *example-rich* regime, where each user has so many examples that they could solve the problem on their own. In this work we consider the *example-scarce* regime, where each user has only a few examples, and obtain the following results:

* For approximate-DP, we give a generic transformation of any item-level DP algorithm into a user-level DP algorithm. Roughly speaking, the latter gives a (multiplicative) savings of $O_{\varepsilon,\delta}(\sqrt{m})$ in the number of users required to achieve the same utility, where $m$ is the number of examples per user. This algorithm, while recovering most known bounds for specific problems, also gives new bounds, e.g., for PAC learning.
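To make the claimed savings concrete, here is a minimal back-of-the-envelope sketch. It assumes only the abstract's statement: a user-level algorithm derived from an item-level one needs roughly a $\sqrt{m}$-factor fewer users. The function name and the constant `c` (standing in for the unspecified $O_{\varepsilon,\delta}$ factor) are hypothetical, not from the paper.

```python
import math

def user_level_sample_complexity(item_level_n: int, m: int, c: float = 1.0) -> int:
    """Illustrative only: if an item-level DP algorithm needs item_level_n
    examples, the transformed user-level algorithm needs on the order of
    item_level_n / sqrt(m) users, each holding m examples. The constant c
    stands in for the hidden O_{eps,delta} factor, which the abstract
    leaves unspecified."""
    return math.ceil(c * item_level_n / math.sqrt(m))

# An item-level algorithm needing 10_000 examples, with 100 examples per user:
print(user_level_sample_complexity(10_000, 100))  # -> 1000 users
```

So with $m = 100$ examples per user, the number of users required drops by a factor of $\sqrt{100} = 10$ relative to the item-level example count, up to the hidden $O_{\varepsilon,\delta}$ constant.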