- North America > Canada > Ontario > Toronto (0.14)
- North America > United States > Oregon > Multnomah County > Portland (0.04)
- Europe > Germany (0.04)
- Asia (0.04)
Have it your way: Individualized Privacy Assignment for DP-SGD
Boenisch, Franziska, Mühl, Christopher, Dziedzic, Adam, Rinberg, Roy, Papernot, Nicolas
When training a machine learning model with differential privacy, one sets a single privacy budget. This budget represents a maximal privacy violation that any user is willing to face by contributing their data to the training set. We argue that this approach is limited because different users may have different privacy expectations. Thus, setting a uniform privacy budget across all points may be overly conservative for some users or, conversely, not sufficiently protective for others. In this paper, we capture these preferences through individualized privacy budgets. To demonstrate their practicality, we introduce a variant of Differentially Private Stochastic Gradient Descent (DP-SGD) which supports such individualized budgets. DP-SGD is the canonical approach to training models with differential privacy. We modify its data sampling and gradient noising mechanisms to arrive at our approach, which we call Individualized DP-SGD (IDP-SGD). Because IDP-SGD provides privacy guarantees tailored to the preferences of individual users and their data points, we find that it empirically improves privacy-utility trade-offs.
- Information Technology > Security & Privacy (1.00)
- Government (0.67)
- Information Technology > Artificial Intelligence > Machine Learning > Performance Analysis > Accuracy (0.70)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning > Gradient Descent (0.54)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.46)
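The abstract above says IDP-SGD modifies DP-SGD's data sampling and gradient noising so that each example can carry its own privacy budget. A minimal sketch of that idea is below; the function name, signature, and the choice to encode individualization via per-example sampling probabilities are illustrative assumptions, not the paper's exact mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def idp_sgd_step(per_sample_grads, sample_probs, clip_norm, noise_mult):
    """One DP-SGD-style step with individualized sampling (sketch).

    Each example i enters the batch with its own probability sample_probs[i]
    (a looser budget can translate into a higher sampling rate). Selected
    gradients are clipped to clip_norm, summed, and Gaussian noise scaled by
    noise_mult * clip_norm is added before averaging.
    """
    selected = rng.random(len(per_sample_grads)) < sample_probs
    total = np.zeros_like(per_sample_grads[0])
    for g, take in zip(per_sample_grads, selected):
        if take:
            norm = np.linalg.norm(g)
            total += g * min(1.0, clip_norm / max(norm, 1e-12))  # per-sample clip
    noise = rng.normal(0.0, noise_mult * clip_norm, size=total.shape)
    return (total + noise) / max(selected.sum(), 1)
```

With all sampling probabilities equal this reduces to ordinary DP-SGD's clip-and-noise step; the individualization lives entirely in `sample_probs` (and, in the paper's other variant, in per-example noise scaling).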
Individualized PATE: Differentially Private Machine Learning with Individual Privacy Guarantees
Boenisch, Franziska, Mühl, Christopher, Rinberg, Roy, Ihrig, Jannis, Dziedzic, Adam
Applying machine learning (ML) to sensitive domains requires privacy protection of the underlying training data through formal privacy frameworks, such as differential privacy (DP). Yet, usually, the privacy of the training data comes at the cost of the resulting ML models' utility. One reason for this is that DP uses one uniform privacy budget epsilon for all training data points, which has to align with the strictest privacy requirement encountered among all data holders. In practice, different data holders have different privacy requirements, and data points of data holders with lower requirements can contribute more information to the training process of the ML models. To address this need, we propose two novel methods based on the Private Aggregation of Teacher Ensembles (PATE) framework to support the training of ML models with individualized privacy guarantees. We formally describe the methods, provide a theoretical analysis of their privacy bounds, and experimentally evaluate their effect on the final model's utility using the MNIST, SVHN, and Adult income datasets. Our empirical results show that the individualized privacy methods yield ML models of higher accuracy than the non-individualized baseline. Thereby, we improve the privacy-utility trade-off in scenarios in which different data holders consent to contribute their sensitive data at different individual privacy levels.
- North America > Canada > Ontario > Toronto (0.14)
- Europe > Germany > Berlin (0.04)
- North America > United States > New York (0.04)
- Asia (0.04)
- Research Report > New Finding (0.66)
- Research Report > Experimental Study (0.45)
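The PATE framework that the abstract above builds on labels public data by privately aggregating the votes of an ensemble of teacher models: per-class vote counts get Laplace noise before the plurality label is released. A minimal sketch of that noisy-max aggregation step, with an illustrative function name and signature, is:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_aggregate(teacher_votes, num_classes, lap_scale):
    """PATE-style noisy-max aggregation (sketch).

    teacher_votes holds one predicted class index per teacher. Per-class vote
    counts are perturbed with Laplace noise of scale lap_scale, and the noisy
    plurality label is returned; larger lap_scale means stronger privacy.
    """
    counts = np.bincount(teacher_votes, minlength=num_classes).astype(float)
    counts += rng.laplace(0.0, lap_scale, size=num_classes)
    return int(np.argmax(counts))
```

The individualized variants described in the paper would adjust how each data holder's points influence the teachers; the mechanics of that adjustment are specified in the paper itself, not in this sketch.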
Amazon installs AI-powered cameras in UK delivery vans
Last year, it was reported that Amazon planned to use AI-equipped cameras to surveil delivery drivers on their routes. Now, the company has started installing such cameras on its vans in the UK, according to The Telegraph. The move has drawn criticism from privacy groups, who called it "excessive" and "creepy." Amazon will use a pair of cameras to record footage both inside the van and out toward the road. They're designed to detect road violations or poor driver practices and give an audio alert, while collecting data Amazon can use later to evaluate drivers.
- Transportation > Freight & Logistics Services (0.73)
- Transportation > Ground > Road (0.53)
Facebook to shutter its facial recognition system, citing 'societal concerns'
Facebook is shutting down its facial recognition program and deleting more than 1 billion users' faceprints, a company official said Tuesday. The move means more than one-third of Facebook's daily active users – about 640 million people – who have opted into the social network's facial recognition option no longer will be automatically recognized in photos and videos, said Jerome Pesenti, vice president of artificial intelligence at Meta, the newly named parent company of Facebook, in a blog post. Also affected: Facebook's automatic alt text system, which uses facial recognition and artificial intelligence to give those who are blind or visually impaired descriptions of images that let them know when they or a friend are in an image. Facebook is taking this action, Pesenti said, because "the many specific instances where facial recognition can be helpful need to be weighed against growing concerns about the use of this technology as a whole." In addition to societal concerns about how facial recognition may be used, "regulators are still in the process of providing a clear set of rules governing its use," he said.
- North America > United States > Oregon (0.05)
- North America > United States > New Hampshire (0.05)
- North America > United States > California (0.05)
- Europe > France (0.05)
- Information Technology > Communications > Social Media (1.00)
- Information Technology > Artificial Intelligence > Vision > Face Recognition (1.00)
You could finally control your Facebook data if UK law is passed
Britons might soon be able to request that their embarrassing social media posts be taken down and records of their existence wiped, according to new proposals outlined today. The new bill will transfer the European Union's General Data Protection Regulation into UK law, as well as making a few additions and amendments. It's currently possible to delete any of your own posts manually, but that doesn't necessarily remove the information from social media companies' databases. According to Facebook's terms and conditions, "some things can only be deleted when you permanently delete your account." While not all requests for deletion will be granted – companies can decline on the grounds of freedom of expression, or when the information is of scientific or historical importance – those involving information posted by or collected from children will nearly always be honoured.