FedPoP: Federated Learning Meets Proof of Participation

Devriş İşler, Elina van Kempen, Seoyeon Hwang, Nikolaos Laoutaris

arXiv.org Artificial Intelligence 

Abstract--Federated learning (FL) offers privacy-preserving, distributed machine learning, allowing clients to contribute to a global model without revealing their local data. As models increasingly serve as monetizable digital assets, the ability to prove participation in their training becomes essential for establishing ownership. In this paper, we address this emerging need by introducing FedPoP, a novel FL framework that enables non-linkable proof of participation while preserving client anonymity and privacy, without requiring either extensive computation or a public ledger. FedPoP is designed to integrate seamlessly with existing secure aggregation protocols, ensuring compatibility with real-world FL deployments. We provide a proof-of-concept implementation and an empirical evaluation under realistic client dropouts. In our prototype, FedPoP introduces 0.97 seconds of per-round overhead atop securely aggregated FL and enables a client to prove its participation/contribution to a model held by a third party in 0.0612 seconds. These results indicate that FedPoP is practical for real-world deployments that require auditable participation without sacrificing privacy.

Federated learning (FL) [1] has become an innovative distributed machine learning paradigm wherein private data holders (a.k.a. clients) collaboratively train a model without sharing their raw data. The most common FL setting involves three parties: a server, which initializes a model and aggregates the local model updates sent by clients; a large number of clients, who collaboratively train the model; and a service provider, which deploys the model to serve its users. In a nutshell, an FL system consists of iterative aggregation rounds in which 1) the server sends the global model parameters to the clients; 2) each client trains the model on its own private data and transmits the updated parameters to the server; and 3) the server aggregates the clients' updated parameters into a new global model using an aggregation procedure (e.g., FedAvg [1], FedQV [2]).
The final global model is delivered to the service provider when the training is completed.
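The three-step round structure described above can be sketched in a few lines. The following is a minimal illustration of FedAvg-style training, not the paper's implementation: the one-parameter linear model, the synthetic client data, and the function names (`local_train`, `fed_avg`) are all assumptions made for the sake of the example.

```python
import random

def local_train(global_params, client_data, lr=0.1):
    # Step 2: an illustrative local update -- one gradient step of
    # least-squares loss for a 1-parameter linear model y = w * x.
    w = global_params[0]
    grad = sum(2 * (w * x - y) * x for x, y in client_data) / len(client_data)
    return [w - lr * grad]

def fed_avg(updates, weights):
    # Step 3: weighted average of client parameter vectors (FedAvg [1]),
    # weighting each client by its local dataset size.
    total = sum(weights)
    dim = len(updates[0])
    return [sum(w * u[i] for u, w in zip(updates, weights)) / total
            for i in range(dim)]

# Hypothetical clients, each holding private (x, y) samples of y = 2x.
random.seed(0)
clients = [[(x, 2 * x) for x in (random.random() for _ in range(20))]
           for _ in range(5)]

global_model = [0.0]  # the server initializes the global model
for rnd in range(100):
    # Step 1: the server sends global_model to every client.
    updates = [local_train(global_model, data) for data in clients]
    global_model = fed_avg(updates, [len(d) for d in clients])

print(global_model[0])  # converges toward the true weight 2.0
```

In a real deployment the per-client updates would be high-dimensional model gradients protected by a secure aggregation protocol, so the server only learns the aggregate, never an individual client's update.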
