Instance-Specific Asymmetric Sensitivity in Differential Privacy

Neural Information Processing Systems

While the inverse sensitivity mechanism was shown to be instance optimal, it was only with respect to a class of unbiased mechanisms such that the most likely outcome matches the underlying data.




Instance-Specific Asymmetric Sensitivity in Differential Privacy

Durfee, David

arXiv.org Machine Learning

We consider the general problem of estimating aggregate functions or statistics of a dataset with differential privacy. The massive increase in data collection to improve analytics and modelling across industries has made such data computations invaluable, but these computations can also leak sensitive individual information. Such leakage can be rigorously measured through differential privacy, which quantifies the extent to which one individual's data can affect the output. Much of the focus within the field of differential privacy is upon constructing algorithms that give both accurate output and privacy guarantees by injecting specific types of randomness. One of the most canonical mechanisms considers the maximum effect one individual's data could have upon the output of a given function, referred to as the sensitivity of the function, and adds noise proportional to that sensitivity to the function output. In general, the notion of sensitivity plays a central role in many differentially private algorithms, directly affecting the accuracy of the output. While using the worst-case sensitivity across all potential datasets will ensure privacy guarantees, the utility can be improved by using variants of sensitivity that are specific to the underlying dataset. This notion was initially considered in Nissim et al. (2007), which introduced smooth sensitivity, an interpolation between the worst-case sensitivity and the local sensitivity of the underlying data, by which noise could be added
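The canonical mechanism described above, adding noise proportional to the worst-case (global) sensitivity, can be sketched as follows. This is a minimal illustration, not code from the paper; the function name, the choice of releasing a clipped mean, and the bounds `lo`/`hi` are assumptions made for the example. For a mean over n records clipped to [lo, hi], changing one record moves the sum by at most (hi - lo), so the global sensitivity of the mean is (hi - lo)/n, and Laplace noise with scale sensitivity/epsilon yields epsilon-differential privacy.

```python
import numpy as np

def laplace_mean(data, lo, hi, epsilon, rng=None):
    """Release the mean of `data` (values clipped to [lo, hi]) with
    epsilon-differential privacy via the Laplace mechanism.

    Hypothetical helper for illustration; not from the paper.
    """
    rng = rng or np.random.default_rng()
    x = np.clip(np.asarray(data, dtype=float), lo, hi)
    n = len(x)
    # Global (worst-case) sensitivity of the clipped mean: replacing one
    # record can shift the sum by at most (hi - lo), hence the mean by
    # (hi - lo) / n, regardless of the underlying dataset.
    sensitivity = (hi - lo) / n
    # Laplace noise with scale sensitivity / epsilon gives epsilon-DP.
    return x.mean() + rng.laplace(loc=0.0, scale=sensitivity / epsilon)
```

Instance-specific variants such as smooth sensitivity aim to replace the worst-case `sensitivity` term above with a data-dependent quantity that is often much smaller, reducing the noise while preserving the privacy guarantee.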