FedPURIN: Programmed Update and Reduced INformation for Sparse Personalized Federated Learning

Xie, Lunchen, He, Zehua, Shi, Qingjiang

arXiv.org Artificial Intelligence

Personalized Federated Learning (PFL) has emerged as a critical research frontier addressing the data heterogeneity issue across distributed clients. Novel model architectures and collaboration mechanisms are engineered to accommodate statistical disparities while producing client-specific models. Parameter decoupling is a promising paradigm for maintaining model performance in PFL frameworks. However, the communication efficiency of many existing methods remains suboptimal: they sustain substantial communication burdens that impede practical deployment. To bridge this gap, we propose Federated Learning with Programmed Update and Reduced INformation (FedPURIN), a novel framework that strategically identifies critical parameters for transmission through an integer programming formulation. This mathematically grounded strategy is seamlessly integrated into a sparse aggregation scheme, achieving a significant communication reduction while preserving efficacy. Comprehensive evaluations on standard image classification benchmarks under varied non-IID conditions demonstrate performance competitive with state-of-the-art methods, coupled with quantifiable communication reduction through sparse aggregation. The framework establishes a new paradigm for communication-efficient PFL, particularly advantageous for edge intelligence systems operating with heterogeneous data sources.

Introduction

Federated learning (FL), a powerful distributed machine learning scheme, has been well studied to handle the growing trend toward harnessing abundant data on ubiquitous edge devices [1]. This framework has been successfully applied in various domains, including computer vision [2, 3], healthcare [4, 5], finance [6, 7], and ubiquitous IoT applications [8, 9, 10].
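To make the idea of budget-constrained parameter selection with sparse aggregation concrete, here is a minimal sketch. It is not FedPURIN's actual integer program (which is not reproduced in this abstract); it assumes a simple cardinality-constrained selection that maximizes the total magnitude of transmitted update entries, whose closed-form solution is top-k magnitude selection, followed by a per-coordinate average over only the clients that transmitted each coordinate. Function names `sparsify_update` and `sparse_aggregate` are illustrative, not from the paper.

```python
import numpy as np

def sparsify_update(update, budget):
    """Keep only the `budget` largest-magnitude entries of a client's
    model update and zero the rest. This greedy top-k rule solves the
    toy integer program
        max_m  sum_i m_i * |u_i|  s.t.  sum_i m_i <= budget, m_i in {0,1},
    a stand-in for the paper's (unspecified here) formulation.
    Returns the sparsified update and the binary transmission mask."""
    flat = update.ravel()
    mask = np.zeros(flat.shape, dtype=bool)
    if budget > 0:
        top = np.argpartition(np.abs(flat), -budget)[-budget:]
        mask[top] = True
    sparse = np.where(mask, flat, 0.0)
    return sparse.reshape(update.shape), mask.reshape(update.shape)

def sparse_aggregate(updates, masks):
    """Server side: average each coordinate over only the clients that
    actually transmitted it, leaving untransmitted coordinates at zero."""
    summed = np.stack(updates).sum(axis=0)
    counts = np.stack(masks).sum(axis=0)
    return np.where(counts > 0, summed / np.maximum(counts, 1), 0.0)

# Two clients each transmit their 2 largest-magnitude entries.
u1 = np.array([0.1, -2.0, 0.5, 3.0])
u2 = np.array([1.5, 0.2, -0.3, 2.5])
s1, m1 = sparsify_update(u1, budget=2)
s2, m2 = sparsify_update(u2, budget=2)
agg = sparse_aggregate([s1, s2], [m1, m2])
```

With a budget of 2 out of 4 entries per client, each upload shrinks by half; coordinates transmitted by both clients are averaged normally, while those transmitted by only one client take that client's value.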