Inclusive, Differentially Private Federated Learning for Clinical Data
Santhosh Parampottupadam, Melih Coşğun, Sarthak Pati, Maximilian Zenk, Saikat Roy, Dimitrios Bounias, Benjamin Hamm, Sinem Sav, Ralf Floca, Klaus Maier-Hein
arXiv.org Artificial Intelligence
Federated Learning (FL) offers a promising approach for training clinical AI models without centralizing sensitive patient data. However, its real-world adoption is hindered by challenges related to privacy, resource constraints, and regulatory compliance. Existing Differential Privacy (DP) approaches often apply uniform noise, which disproportionately degrades model performance, even for highly compliant institutions. In this work, we propose a novel compliance-aware FL framework that enhances DP by adaptively adjusting noise based on quantifiable client compliance scores. Additionally, we introduce a compliance scoring tool based on key healthcare and security standards to promote secure, inclusive, and equitable participation across diverse clinical settings. Extensive experiments on public datasets demonstrate that integrating under-resourced, less compliant clinics with highly regulated institutions yields accuracy improvements of up to 15% over traditional FL. This work advances FL by balancing privacy, compliance, and performance, making it a viable solution for real-world clinical workflows in global healthcare.
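The core mechanism described above, scaling each client's DP noise inversely with its compliance score, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the linear score-to-multiplier schedule, the bounds `sigma_min`/`sigma_max`, and the clipped-Gaussian mechanism are assumptions for demonstration, since the abstract does not specify the exact mapping.

```python
import numpy as np

def adaptive_noise_multiplier(compliance_score, sigma_min=0.5, sigma_max=2.0):
    """Map a compliance score in [0, 1] to a DP noise multiplier.

    Hypothetical linear schedule: fully compliant clients (score 1.0)
    receive the least noise, non-compliant clients (score 0.0) the most.
    """
    score = float(np.clip(compliance_score, 0.0, 1.0))
    return sigma_max - score * (sigma_max - sigma_min)

def privatize_update(update, clip_norm, compliance_score, rng=None):
    """Clip a client's model update to `clip_norm` (standard Gaussian
    mechanism preprocessing) and add noise scaled by the client's
    compliance-dependent multiplier."""
    rng = rng or np.random.default_rng()
    update = np.asarray(update, dtype=float)
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    sigma = adaptive_noise_multiplier(compliance_score)
    noise = rng.normal(0.0, sigma * clip_norm, size=clipped.shape)
    return clipped + noise
```

Under this sketch, a fully compliant hospital would have its updates perturbed with multiplier 0.5 while an under-resourced clinic with score 0.0 would receive multiplier 2.0, allowing both to participate in the same round at different noise levels.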
Oct-14-2025