In this paper we initiate the study of adaptive composition in differential privacy when the length of the composition, and the privacy parameters themselves, can be chosen adaptively as a function of the outcomes of previously run analyses. This case is much more delicate than the setting covered by existing composition theorems, in which the algorithms themselves can be chosen adaptively but the privacy parameters must be fixed up front. Indeed, it is not even clear how to define differential privacy in the adaptive parameter setting. We proceed by defining two objects that cover the two main use cases of composition theorems. A privacy filter is a stopping-time rule that allows an analyst to halt a computation before a pre-specified privacy budget is exceeded. A privacy odometer allows the analyst to track realized privacy loss as he goes, without needing to pre-specify a privacy budget. We show that, unlike the case in which privacy parameters are fixed, in the adaptive parameter setting these two use cases are distinct. We show that there exist privacy filters with bounds comparable (up to constants) to existing privacy composition theorems. We also give a privacy odometer that nearly matches non-adaptive privacy composition theorems, but is sometimes worse by a small asymptotic factor. Moreover, we show that this loss is inherent: any valid privacy odometer in the adaptive parameter setting must lose this factor, which establishes a formal separation between the filter and odometer use cases.
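To make the privacy-filter interface concrete, here is a minimal sketch of a filter based on *basic* composition, i.e. simply summing the (ε, δ) parameters of each analysis. This is an illustration of the halting behavior only; the filters constructed in the paper achieve tighter, advanced-composition-style bounds. The class name and method names are hypothetical, not from the paper.

```python
class BasicPrivacyFilter:
    """Illustrative privacy filter using basic composition: it permits
    another (eps, delta)-DP analysis only while the running sums of
    epsilons and deltas stay within the pre-specified budget."""

    def __init__(self, eps_budget: float, delta_budget: float):
        self.eps_budget = eps_budget
        self.delta_budget = delta_budget
        self.eps_spent = 0.0
        self.delta_spent = 0.0

    def try_spend(self, eps: float, delta: float) -> bool:
        """Record the spend and return True if the next analysis fits
        within budget; return False (halt) if it would exceed it."""
        if (self.eps_spent + eps > self.eps_budget
                or self.delta_spent + delta > self.delta_budget):
            return False  # HALT: budget would be exceeded
        self.eps_spent += eps
        self.delta_spent += delta
        return True


f = BasicPrivacyFilter(eps_budget=1.0, delta_budget=1e-6)
assert f.try_spend(0.4, 0.0)      # allowed: 0.4 of 1.0 spent
assert f.try_spend(0.5, 0.0)      # allowed: 0.9 of 1.0 spent
assert not f.try_spend(0.2, 0.0)  # would exceed 1.0, so the filter halts
```

The key difference from an odometer is that the budget is fixed up front here; an odometer would instead report a valid running upper bound on the privacy loss spent so far, without any pre-committed budget.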
The United States is the world's leading exporter of IT services, and Europe is the largest customer base for such services. It is therefore in the core interest of the U.S. to ensure that these services can be used in Europe. Privacy laws, however, have the potential to obstruct their export. Unsurprisingly, the U.S. government and American businesses were heavily engaged in the privacy reform debate in the EU. They aimed to avoid overreaching regulation and to ensure that American businesses would no longer have to deal with 28 fragmented national laws.
Over the past several weeks, many people have been bombarded with emails about data privacy from major corporations such as Twitter and Facebook. There's a reason all these businesses are updating their privacy policies, and though you may be tempted to trash those emails, they carry news of real change. The companies sending them have until May 25 to comply with a new privacy law enacted by the European Union, known as the General Data Protection Regulation (GDPR). The E.U. rules limit how companies can use and process the personal data of consumers, giving ordinary people more control over their own information. Under the GDPR, corporations must explicitly ask for permission to collect your data, they are required to tell you what that data is used for if you ask, and they must give you the right to permanently delete that information.
In my last post I covered reasons why providing information security and privacy training, along with ongoing awareness reminders, is so important. Now I want to cover three important facts to keep in mind to make your education efforts effective, as well as to meet associated legal requirements. I've believed and practiced this for a very long time! In fact, I created my training packages (such as Security Search and my online SIMBUS training modules) and have provided my on-site and live online training with this very concept in mind. Participants in training and awareness programs MUST be able to see how the issues relate to them in order to pay attention, truly understand the security and privacy issues, and then carry those lessons into their daily work activities.