Reconstruction Attacks on Machine Unlearning: Simple Models are Vulnerable
Martin Bertran, Shuai Tang, Michael Kearns, Jamie Morgenstern, Aaron Roth, Zhiwei Steven Wu
arXiv.org Artificial Intelligence
As model training on personal data becomes commonplace, there has been a growing literature on data protection in machine learning (ML), which includes at least two thrusts:

Data Privacy. The primary concern regarding data privacy in ML applications is that models might inadvertently reveal details about the individual data points used in their training. This type of privacy risk can manifest in various ways, ranging from membership inference attacks [27], which only seek to confirm whether a specific individual's data was used in the training, to more severe reconstruction attacks [10], which attempt to recover entire data records of numerous individuals. To address these risks, algorithms that adhere to differential privacy standards [12] provide proven safeguards, specifically limiting the ability to infer information about individual training data.

Machine Unlearning. Proponents of data autonomy have advocated for individuals to have the right to decide how their data is used, including the right to retroactively ask that their data and its influence be removed from any model trained on it. Data deletion, or machine unlearning, refers to technical approaches that allow such removal of influence [15, 4]. The idea is that, after an individual's data is deleted, the resulting model should be in the state it would have been in had the model originally been trained without that individual's data. The primary focus of this literature has been on achieving or approximating this condition for complex models in ways that are more computationally efficient than full retraining (see e.g.
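The exact-unlearning criterion described above, that the post-deletion model match the model retrained from scratch without the deleted point, can be illustrated with a minimal sketch. This uses ordinary least squares on synthetic data purely for illustration; it is not the paper's experimental setup, and `fit_ols`/`unlearn` are hypothetical helper names:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 20 points, 3 features, linear target with small noise.
X = rng.normal(size=(20, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=20)

def fit_ols(X, y):
    """Ordinary least squares via the normal equations."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def unlearn(X, y, i):
    """Exact unlearning of point i: retrain on the dataset with i removed."""
    keep = np.arange(len(y)) != i
    return fit_ols(X[keep], y[keep])

# The unlearned model coincides with a model that never saw point 7.
theta_unlearned = unlearn(X, y, 7)
theta_retrained = fit_ols(np.delete(X, 7, axis=0), np.delete(y, 7))
print(np.allclose(theta_unlearned, theta_retrained))
```

Retraining from scratch, as here, trivially satisfies the criterion; the literature the abstract refers to seeks cheaper approximations of this same target state for large models.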
May-30-2024