Federated Unlearning Made Practical: Seamless Integration via Negated Pseudo-Gradients

Mora, Alessio, Mazzocca, Carlo, Montanari, Rebecca, Bellavista, Paolo

arXiv.org Artificial Intelligence 

Abstract--The right to be forgotten is a fundamental principle of privacy-preserving regulations and extends to Machine Learning (ML) paradigms such as Federated Learning (FL). While FL enhances privacy by enabling collaborative model training without sharing private data, trained models still retain the influence of training data. Recently proposed Federated Unlearning (FU) methods often rely on assumptions that are impractical for real-world FL deployments, such as storing client update histories or requiring access to a publicly available dataset. To address these constraints, this paper introduces a novel method that leverages negated Pseudo-gradients Updates for Federated Unlearning (PUF). Our approach only uses standard client model updates, which are employed during regular FL rounds, and interprets them as pseudo-gradients. When a client needs to be forgotten, we apply the negation of their pseudo-gradients, appropriately scaled, to the global model. Unlike state-of-the-art mechanisms, PUF seamlessly integrates with FL workflows, incurs no additional computational and communication overhead beyond standard FL rounds, and supports concurrent unlearning requests. We extensively evaluated the proposed method on two well-known benchmark image classification datasets (CIFAR-10 and CIFAR-100) and a real-world medical imaging dataset for segmentation (ProstateMRI), using three different neural architectures: two residual networks and a vision transformer. The experimental results across various settings demonstrate that PUF achieves state-of-the-art forgetting effectiveness and recovery time, without relying on any additional assumptions.

IN today's digital landscape, privacy has become a major concern, as reflected by the emergence of robust regulatory frameworks worldwide [1]. The European Union (EU) has consistently emphasized the importance of protecting personal data, exemplified by the introduction of the General Data Protection Regulation (GDPR) in 2016 [2].
Most recently, in May 2024, the EU enacted Regulation 2024/1183 [3], establishing the European Digital Identity Framework that empowers individuals with fine-grained control over their information. One of the key rights in these regulations is the right to be forgotten, which allows individuals to request the deletion of their previously shared data. Similar rights are central to other major privacy laws worldwide, such as the California Consumer Privacy Act (CCPA) [4], whose right to delete grants California residents the on-demand removal of personal data held by businesses.

Alessio Mora, Rebecca Montanari, and Paolo Bellavista are with the Department of Computer Science and Engineering, University of Bologna, Bologna, Italy (e-mail: {name.surname}@unibo.it).
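The core operation described in the abstract, treating a client's model update as a pseudo-gradient and applying its scaled negation to the global model, can be sketched as follows. This is a minimal illustrative example using NumPy; the function names, the flat-vector model representation, and the `scale` parameter are our own simplifications, not the paper's implementation.

```python
import numpy as np

def client_pseudo_gradient(global_weights: np.ndarray,
                           client_weights: np.ndarray) -> np.ndarray:
    """Pseudo-gradient: the update a client sends in a standard FL round,
    i.e., the difference between its locally trained model and the global
    model it received."""
    return client_weights - global_weights

def unlearn_client(global_weights: np.ndarray,
                   pseudo_gradient: np.ndarray,
                   scale: float = 1.0) -> np.ndarray:
    """Forget a client by applying its negated pseudo-gradient,
    scaled by `scale`, to the current global model."""
    return global_weights - scale * pseudo_gradient

# Toy example with a two-parameter "model".
w_global = np.array([0.5, -0.2])
w_client = np.array([0.8, 0.1])  # client model after local training
g = client_pseudo_gradient(w_global, w_client)
w_forgotten = unlearn_client(w_global, g, scale=1.0)
```

Because only the standard per-round client update is needed, this step adds no communication or computation beyond a regular FL round, which is the practicality argument the abstract makes.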