Can AI Learn to Forget?
Machine learning has emerged as a valuable tool for spotting patterns and trends that might otherwise escape humans. The technology, which can build elaborate models based on everything from personal preferences to facial recognition, is used widely to understand behavior and make informed predictions. Yet for all the gains, there is also plenty of pain. A major problem with machine learning is that once an algorithm or model exists, expunging individual records or chunks of data is extraordinarily difficult. In most cases, it is necessary to retrain the entire model, sometimes with no assurance that the retrained model will not continue to incorporate the suspect data in some way, says Gautam Kamath, an assistant professor in the David R. Cheriton School of Computer Science at the University of Waterloo in Canada.
Adaptive Machine Unlearning
Varun Gupta, Christopher Jung, Seth Neel, Aaron Roth, Saeed Sharifi-Malvajerdi, Chris Waites
Data deletion algorithms aim to remove the influence of deleted data points from trained models at a cheaper computational cost than fully retraining those models. However, for sequences of deletions, most prior work in the non-convex setting gives valid guarantees only for sequences that are chosen independently of the models that are published. If people choose to delete their data as a function of the published models (because they don't like what the models reveal about them, for example), then the update sequence is adaptive. In this paper, we give a general reduction from deletion guarantees against adaptive sequences to deletion guarantees against non-adaptive sequences, using differential privacy and its connection to max information. Combined with ideas from prior work that give guarantees for non-adaptive deletion sequences, this leads to extremely flexible algorithms able to handle arbitrary model classes and training methodologies, giving strong provable deletion guarantees for adaptive deletion sequences. We show in theory how prior work for non-convex models fails against adaptive deletion sequences, and use this intuition to design a practical attack against the SISA algorithm of Bourtoule et al. [2021] on CIFAR-10, MNIST, and Fashion-MNIST.
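To make the setting concrete, below is a minimal toy sketch, not the authors' implementation or the actual SISA system, of the sharded "train per shard, delete by retraining one shard" idea, together with a toy adaptive requester that chooses which point to delete by inspecting the published ensemble. The shard learner (per-class centroids), the shard count, and the adaptive selection rule are all illustrative assumptions; the point is only to show the difference between deletion requests fixed in advance and requests that depend on the published models.

```python
# Toy sketch of SISA-style sharded training and unlearning (illustrative
# only; the sub-model, shard count, and adaptive rule are assumptions).
import numpy as np

rng = np.random.default_rng(0)

def train_shard(X, y):
    """Toy sub-model: one centroid per class (stand-in for any learner)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def shard_vote(model, x):
    """Predicted class of a single shard sub-model: nearest centroid."""
    return min(model, key=lambda c: np.linalg.norm(x - model[c]))

def predict(models, x):
    """Published ensemble prediction: majority vote over shard sub-models."""
    votes = [shard_vote(m, x) for m in models]
    return max(set(votes), key=votes.count)

# Synthetic data split round-robin into shards (assumption: 4 shards).
X = rng.normal(size=(200, 5))
y = (X[:, 0] > 0).astype(int)
n_shards = 4
shard_idx = [list(range(i, len(X), n_shards)) for i in range(n_shards)]
models = [train_shard(X[idx], y[idx]) for idx in shard_idx]

def delete_point(i):
    """Unlearn point i by retraining only the shard that contained it."""
    s = i % n_shards
    shard_idx[s].remove(i)
    models[s] = train_shard(X[shard_idx[s]], y[shard_idx[s]])

# Non-adaptive deletions: indices fixed before any model is published.
for i in [3, 17, 42]:
    delete_point(i)

# Adaptive deletions: the requester looks at the *published* models and
# deletes the remaining point that the ensemble currently agrees on most,
# a stand-in for "people delete because of what the model reveals".
def adaptive_request():
    remaining = [i for idx in shard_idx for i in idx]
    return max(
        remaining,
        key=lambda i: sum(shard_vote(m, X[i]) == y[i] for m in models),
    )

for _ in range(3):
    delete_point(adaptive_request())
```

In the non-adaptive case, which shard gets retrained is statistically independent of the published models, which is what prior guarantees rely on; in the adaptive loop, each request is a function of the current ensemble, which is the regime the paper's differential-privacy-based reduction is designed to handle.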