After 14 years, Steam finally gets some decent privacy settings

Mashable

Hardcore gamers won't have to worry anymore about people finding out they've played The Sims for 200 hours over the past two weeks. Valve announced new privacy settings for Steam users Tuesday, allowing people to change how much information others can see in their profiles and who is allowed to see it. Valve also said that in the future, an "invisible" feature will be added that will let you appear offline while still using the service's online features. You can find the new privacy settings pretty easily: go to your profile page, click the Edit Profile button in the top right, then select My Privacy Settings on the right side of the page. You can set your profile to be viewable by the public, visible only to your friends, or private.


Local Differential Privacy for Evolving Data

Neural Information Processing Systems

There are now several large-scale deployments of differential privacy used to collect statistical information about users. However, these deployments periodically recollect the data and recompute the statistics using algorithms designed for a single use. As a result, these systems do not provide meaningful privacy guarantees over long time scales. Moreover, existing techniques to mitigate this effect do not apply in the "local model" of differential privacy that these systems use. In this paper, we introduce a new technique for local differential privacy that makes it possible to maintain up-to-date statistics over time, with privacy guarantees that degrade only in the number of changes in the underlying distribution rather than the number of collection periods.
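The abstract does not spell out the mechanism, but the basic local-model primitive such deployments build on is randomized response: each user perturbs their own bit before it ever leaves the device, and the aggregator debiases the noisy reports. A minimal sketch under that assumption (function names and parameters are illustrative, not from the paper):

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (e^eps + 1),
    otherwise report its flip; this satisfies eps-local DP."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def estimate_frequency(reports, epsilon: float) -> float:
    """Debias the noisy reports to estimate the true fraction of 1-bits."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    # E[observed] = p * f + (1 - p) * (1 - f); invert this linear map.
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

# Example: 10,000 users, 30% of whom hold the sensitive bit.
true_bits = [1 if random.random() < 0.3 else 0 for _ in range(10_000)]
reports = [randomized_response(b, epsilon=1.0) for b in true_bits]
print(estimate_frequency(reports, epsilon=1.0))  # close to 0.3
```

Rerunning this collection every period is exactly what costs privacy linearly in the number of periods; the paper's contribution is to make that cost scale with the number of changes in the underlying distribution instead.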


Differentially Private Bagging: Improved utility and cheaper privacy than subsample-and-aggregate

Neural Information Processing Systems

Differential privacy is a popular and well-studied notion of privacy. In the era of big data, privacy concerns are becoming ever more prevalent, and differential privacy is increasingly turned to as a solution. A popular method for ensuring differential privacy of a classifier is known as subsample-and-aggregate, in which the dataset is divided into distinct chunks, a model is learned on each chunk, and the models' outputs are then aggregated. This approach allows for easy analysis of the model on the data, so differential privacy can be easily applied. In this paper, we extend this approach by dividing the data several times (rather than just once) and learning models on each chunk within each division.
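As a rough illustration of the baseline the paper improves on, here is a sketch of subsample-and-aggregate for a binary classifier, using a PATE-style noisy majority vote as the aggregation step (the choice of LogisticRegression and all names are illustrative assumptions, not the paper's method):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def subsample_and_aggregate(X, y, x_query, n_chunks=10, epsilon=1.0):
    """Partition the data into disjoint chunks, train one model per chunk,
    and release a noisy majority vote over the chunk models' predictions."""
    idx = np.random.permutation(len(X))
    votes = np.zeros(2)
    for chunk in np.array_split(idx, n_chunks):
        # Assumes each chunk contains examples of both classes.
        model = LogisticRegression().fit(X[chunk], y[chunk])
        votes[model.predict(x_query.reshape(1, -1))[0]] += 1
    # Any one individual sits in exactly one chunk, so changing them can
    # move at most one vote between the two bins; Laplace(2/eps) noise on
    # the counts makes the released argmax eps-differentially private.
    noisy = votes + np.random.laplace(scale=2.0 / epsilon, size=2)
    return int(np.argmax(noisy))
```

The paper's bagging variant repeats the partitioning several times, so each individual appears in one chunk per division rather than one chunk overall; per the abstract, the resulting analysis yields better utility at a lower privacy cost.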


Are You Afraid of Data? Balancing Privacy and Data Monetization - CPO Magazine

#artificialintelligence

In light of the many news headlines and scandals over data privacy today, many companies are afraid to use customer data because of concerns over potential privacy violations. There is also a growing concern about being legally compliant yet still making customers unhappy or uncomfortable, much like what happened with Target in 2012. Target legally used its customers' data to create targeted ads, but the personal nature of the ads still upset customers. Target wasn't doing anything wrong with data monetization, yet it still negatively impacted customers. Many companies opt not to use data out of fear, but that caution comes at a huge cost in lost revenue.


Privacy Odometers and Filters: Pay-as-you-Go Composition

Neural Information Processing Systems

In this paper we initiate the study of adaptive composition in differential privacy when the length of the composition, and the privacy parameters themselves can be chosen adaptively, as a function of the outcome of previously run analyses. This case is much more delicate than the setting covered by existing composition theorems, in which the algorithms themselves can be chosen adaptively, but the privacy parameters must be fixed up front. Indeed, it isn't even clear how to define differential privacy in the adaptive parameter setting. We proceed by defining two objects which cover the two main use cases of composition theorems. A privacy filter is a stopping time rule that allows an analyst to halt a computation before his pre-specified privacy budget is exceeded.
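As a toy illustration of the filter idea under basic (linear) composition only — the paper's actual filters handle richer composition and are tighter — one might sketch it like this (class and method names are mine, not the paper's):

```python
class PrivacyFilter:
    """A stopping rule that halts analyses before the cumulative privacy
    cost exceeds a fixed, pre-specified budget (basic composition only)."""

    def __init__(self, budget: float):
        self.budget = budget
        self.spent = 0.0
        self.halted = False

    def try_spend(self, epsilon: float) -> bool:
        """Return True if an analysis costing `epsilon` may run; once the
        budget would be exceeded, halt permanently."""
        if self.halted or self.spent + epsilon > self.budget:
            self.halted = True
            return False
        self.spent += epsilon
        return True

# The analyst may pick each epsilon adaptively, based on earlier results:
filt = PrivacyFilter(budget=1.0)
for eps in (0.3, 0.5, 0.4):          # the third request would overspend
    print(eps, filt.try_spend(eps))  # True, True, False
```

The point of the paper's setting is visible even in this sketch: the sequence of epsilons is not fixed up front, so the guarantee must hold for every adaptively chosen sequence the filter admits.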