After 14 years, Steam finally gets some decent privacy settings

Mashable

Hardcore gamers won't have to worry anymore about people finding out they've played The Sims for 200 hours over the past two weeks. Valve announced new privacy settings for Steam users Tuesday, allowing people to change how much information others can see in their profiles and who is allowed to see it. Valve also said that in the future, an "invisible" feature will be added that will allow you to appear offline but still use the service's online features. You can find the new privacy settings pretty easily: go to your profile page, click the Edit Profile button in the top right, then select My Privacy Settings on the right side of the page. From there, you can set your profile to be viewable by the public or only by your friends, or set it to private.


Local Differential Privacy for Evolving Data

Neural Information Processing Systems

There are now several large scale deployments of differential privacy used to collect statistical information about users. However, these deployments periodically recollect the data and recompute the statistics using algorithms designed for a single use. As a result, these systems do not provide meaningful privacy guarantees over long time scales. Moreover, existing techniques to mitigate this effect do not apply in the "local model" of differential privacy that these systems use. In this paper, we introduce a new technique for local differential privacy that makes it possible to maintain up-to-date statistics over time, with privacy guarantees that degrade only in the number of changes in the underlying distribution rather than the number of collection periods.
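For readers unfamiliar with the local model: each user randomizes their own data before it reaches the collector, so no raw values are ever held centrally. The sketch below shows the classic randomized-response primitive that such deployments build on; it is illustrative only (function names are our own, and it is not the paper's evolving-data protocol). It also makes the abstract's motivation concrete: each fresh report spends another epsilon of budget, which is exactly the degradation over repeated collection periods that the paper aims to avoid.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the user's true bit with probability e^eps / (1 + e^eps);
    otherwise report the flipped bit. Each report is epsilon-locally-DP."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if random.random() < p_truth else 1 - bit

def estimate_frequency(noisy_reports, epsilon: float) -> float:
    """Debias the aggregated noisy bits to estimate the true fraction of 1s.
    E[mean(reports)] = (2p - 1) * f + (1 - p); solve for f."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    observed = sum(noisy_reports) / len(noisy_reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

# Example: 10,000 users, 30% of whom hold bit 1, with epsilon = 1.0.
true_bits = [1] * 3000 + [0] * 7000
reports = [randomized_response(b, 1.0) for b in true_bits]
print(estimate_frequency(reports, 1.0))  # close to 0.30
```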


Differentially Private Bagging: Improved utility and cheaper privacy than subsample-and-aggregate

Neural Information Processing Systems

Differential privacy is a popular and well-studied notion of privacy. In the era of big data, privacy concerns are becoming ever more prevalent, and differential privacy is being turned to as one solution. A popular method for ensuring differential privacy of a classifier is known as subsample-and-aggregate, in which the dataset is divided into distinct chunks, a model is learned on each chunk, and the models' outputs are then aggregated. This approach allows for easy analysis of the model on the data, and thus differential privacy can be easily applied. In this paper, we extend this approach by dividing the data several times (rather than just once) and learning models on each chunk within each division.
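As a rough sketch of the subsample-and-aggregate baseline the paper builds on (not its repeated-division bagging extension): `train_fn` is an assumed, user-supplied trainer that returns a predict function, and the Laplace-noised vote is one standard way to privatize the aggregation step. Because each record lands in exactly one chunk, changing one record can flip at most one vote per test point.

```python
import numpy as np

def subsample_and_aggregate(X, y, X_test, n_chunks, epsilon, train_fn):
    """Sketch of subsample-and-aggregate with a noisy majority vote.
    Assumes integer class labels 0..K-1 and that train_fn(X_c, y_c)
    returns a callable mapping one feature row to a class label."""
    rng = np.random.default_rng()
    idx = rng.permutation(len(X))
    chunks = np.array_split(idx, n_chunks)
    # One model per disjoint chunk: any single record influences
    # exactly one model, hence at most one vote per test point.
    models = [train_fn(X[c], y[c]) for c in chunks]

    n_classes = int(y.max()) + 1
    predictions = []
    for x in X_test:
        votes = np.bincount([m(x) for m in models], minlength=n_classes)
        # Flipping one vote moves two counts by 1 (L1 sensitivity 2),
        # so Laplace noise with scale 2/epsilon makes each released
        # vote histogram epsilon-DP; the argmax is post-processing.
        # Answering many test points composes and spends more budget.
        noisy_votes = votes + rng.laplace(scale=2.0 / epsilon, size=n_classes)
        predictions.append(int(np.argmax(noisy_votes)))
    return predictions
```

In practice `train_fn` could, for instance, fit a scikit-learn classifier on the chunk and return `lambda x: clf.predict([x])[0]`; the sketch deliberately leaves the learner abstract, since the privacy argument depends only on the disjoint partitioning and the noisy aggregation.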


Role of AI in Data Protection and Privacy Strategies

#artificialintelligence

Privacy issues sit at the forefront of online activity, business actions, and government decisions. This is largely in response to the breaches, scandals, and personal data leaks that have eroded confidence in technology and information systems. The National Security Telecommunications Advisory Committee's (NSTAC) Report to the President on a Cybersecurity Moonshot says that privacy is a crucial component of cybersecurity and that we must flip the narrative to restore the trust Americans place in information systems. To achieve this, by 2028, Americans need to be "guaranteed" that technological advancements will no longer threaten privacy but will instead enhance privacy assurance through the safety and security of their personal data. One critical element in future technology advancements and online security is the increased development of artificial intelligence (AI).


Are You Afraid of Data? Balancing Privacy and Data Monetization - CPO Magazine

#artificialintelligence

In light of the many news headlines and scandals over data privacy today, many companies are afraid to use customer data because of concerns over potential privacy violations. There is also a growing concern over being legally compliant but still making customers unhappy or uncomfortable, much like what happened with Target in 2012. Target legally used its customers' data to create targeted ads, but the personal nature of those ads still upset customers: the company did nothing wrong with data monetization, yet the effort damaged customer trust. Many companies opt not to use data out of such fear, but that caution comes at a significant cost in revenue.