Jordan Awan
KNG: The K-Norm Gradient Mechanism
Matthew Reimherr, Jordan Awan
This paper presents a new mechanism for producing sanitized statistical summaries that achieve differential privacy, called the K-Norm Gradient Mechanism, or KNG. This new approach maintains the strong flexibility of the exponential mechanism, while achieving the powerful utility performance of objective perturbation. KNG starts with an inherent objective function (often an empirical risk), and promotes summaries that are close to minimizing the objective by weighting according to how far the gradient of the objective function is from zero. Working with the gradient instead of the original objective function allows for additional flexibility as one can penalize using different norms. We show that, unlike the exponential mechanism, the noise added by KNG is asymptotically negligible compared to the statistical error for many problems. In addition to theoretical guarantees on privacy and utility, we confirm the utility of KNG empirically in the settings of linear and quantile regression through simulations.
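To make the mechanism concrete, below is a minimal sketch of KNG in its simplest instance, a bounded-data mean with squared-error loss; the function name, the [0, 1] data range, and the direct-sampling shortcut are assumptions of this sketch, not details taken from the paper. With gradient n(θ − x̄) and sensitivity Δ = 1, the KNG density exp(−ε‖∇f(θ)‖/(2Δ)) collapses to a Laplace distribution centered at the sample mean, so no generic sampler is needed.

```python
import numpy as np

def kng_mean_release(data, epsilon, rng=None):
    # Hypothetical illustration: KNG for a one-dimensional mean with
    # squared-error loss f(theta) = 0.5 * sum((theta - x_i)^2), assuming
    # each record lies in [0, 1].
    #
    # grad f(theta) = n * (theta - xbar), and swapping one record moves the
    # gradient by at most 1, so the sensitivity Delta is 1. The KNG density
    # exp(-epsilon * |grad f(theta)| / (2 * Delta)) is then Laplace with
    # location xbar and scale 2 / (epsilon * n), which we can sample directly.
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(data, dtype=float), 0.0, 1.0)  # enforce bounded records
    n, xbar = x.size, x.mean()
    return rng.laplace(loc=xbar, scale=2.0 / (epsilon * n))

# Example: privately release the mean of 1,000 bounded observations.
sample = np.random.default_rng(0).uniform(size=1000)
print(kng_mean_release(sample, epsilon=1.0))
```

Note the noise scale 2/(εn), which shrinks faster than the 1/√n statistical error, matching the abstract's claim that KNG's noise is asymptotically negligible.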
Elliptical Perturbations for Differential Privacy
Matthew Reimherr, Jordan Awan
We study elliptical distributions in locally convex vector spaces, and determine conditions when they can or cannot be used to satisfy differential privacy (DP). A requisite condition for a sanitized statistical summary to satisfy DP is that the corresponding privacy mechanism must induce equivalent probability measures for all possible input databases. We show that elliptical distributions with the same dispersion operator, C, are equivalent if the difference of their means lies in the Cameron-Martin space of C. In the case of releasing finite-dimensional summaries using elliptical perturbations, we show that the privacy parameter ε can be computed in terms of a one-dimensional maximization problem. We apply this result to consider multivariate Laplace, t, Gaussian, and K-norm noise. Surprisingly, we show that the multivariate Laplace noise does not achieve ε-DP in any dimension greater than one. Finally, we show that when the dimension of the space is infinite, no elliptical distribution can be used to give ε-DP; only (ε, δ)-DP is possible.
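In contrast with the multivariate Laplace, the K-norm noise mentioned in the abstract does give ε-DP in any finite dimension. Below is a minimal sketch of sampling its ℓ2-norm variant; the helper name and interface are assumptions of this sketch, while the direction-times-Gamma-radius decomposition is the standard way to draw from a density proportional to exp(−ε‖x‖₂/Δ).

```python
import numpy as np

def l2_knorm_noise(d, epsilon, sensitivity, rng=None):
    # Sketch of K-norm noise with the l2 norm: additive noise drawn from the
    # density proportional to exp(-epsilon * ||x||_2 / sensitivity).
    # Writing x = r * u with u uniform on the unit sphere, the radius r has
    # density proportional to r^(d-1) * exp(-epsilon * r / sensitivity),
    # i.e. a Gamma(d, sensitivity / epsilon) distribution, so sampling is exact.
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)                       # uniform direction on the sphere
    r = rng.gamma(shape=d, scale=sensitivity / epsilon)
    return r * u

# Example: epsilon-DP release of a 3-dimensional summary with l2 sensitivity 0.5.
summary = np.array([0.2, -0.1, 0.4])
print(summary + l2_knorm_noise(3, epsilon=1.0, sensitivity=0.5))
```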