Differentially private inference via noisy optimization
Marco Avella-Medina, Casey Bradshaw, Po-Ling Loh
Over the last decade, differential privacy has evolved from a rigorous paradigm developed by theoretical computer scientists for releasing sensitive data into a technology deployed at scale in numerous applications [Ding et al., 2017, Erlingsson et al., 2014, Garfinkel et al., 2019, Tang et al., 2017]. The setting assumes the existence of a trusted curator who holds the data of individuals in a database, and the goal of privacy is to protect individual data while still allowing statistical analysis of the aggregate database. Such protection is guaranteed by differential privacy in the context of a remote access query system, where a statistician can only indirectly access the data, e.g., by obtaining noisy summary statistics or outputs of a model. Injecting noise before releasing information to the statistician is essential for preserving privacy, and the noise should be as small as possible in order to optimize the statistical performance of the released statistics. In this paper, we consider the problem of estimation and inference for M-estimators. Inspired by the work of Bassily et al. [2014], Lee and Kifer [2018], Song et al. [2013], and Feldman et al. [2020], among others, we propose noisy optimization procedures that output differentially private counterparts of standard M-estimators. The central idea of these methods is to add noise to every iterate of a gradient-based optimization routine in a way that causes each iterate to satisfy a targeted differential privacy guarantee.
Mar-19-2021
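The noisy-iterate idea described in the abstract can be illustrated with a minimal sketch of noisy gradient descent: the gradient is clipped to bound its sensitivity, and Gaussian noise is added at every step before the iterate is updated. This is an illustrative sketch only; the function name, parameters, and noise calibration below are assumptions for exposition, not the authors' exact procedure or privacy accounting.

```python
import numpy as np

def noisy_gradient_descent(grad_fn, theta0, n_iters, step_size,
                           clip_norm, noise_scale, rng):
    """Illustrative noisy gradient descent (hypothetical sketch).

    Each iterate is perturbed with Gaussian noise so that every
    released iterate can carry a differential privacy guarantee;
    calibrating noise_scale to (epsilon, delta) is omitted here.
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iters):
        g = grad_fn(theta)
        # Clip the gradient to bound its sensitivity.
        norm = np.linalg.norm(g)
        if norm > clip_norm:
            g = g * (clip_norm / norm)
        # Add Gaussian noise scaled to the clipped sensitivity.
        noise = rng.normal(0.0, noise_scale, size=theta.shape)
        theta = theta - step_size * (g + noise)
    return theta
```

For example, with the gradient of a simple quadratic loss, `grad_fn = lambda t: t - 3.0`, the noisy iterates converge to a perturbed version of the minimizer at 3, with the perturbation governed by `noise_scale`.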