d37eb50d868361ea729bb4147eb3c1d8-AuthorFeedback.pdf
We thank all the reviewers for their valuable comments and appreciation of the ideas and results presented in the paper. We summarize the main questions from the reviewers and address them separately below.

To Reviewer #1
Q1: Network connectivity is presumably known . . . it seems all the graphs considered are com-
We note that the network connectivity is not assumed to be known.

To Reviewer #3
Q1: Scope of the paper/Missing related work. " and "FedNAS" are about
We can add an explanation to clarify the MTL scope of the paper.
P4L: Privacy Preserving Peer-to-Peer Learning for Infrastructureless Setups
Ioannis Arapakis, Panagiotis Papadopoulos, Kleomenis Katevas, Diego Perino
Distributed (or federated) learning enables users to train machine learning models on their own devices, sharing only the gradients of their models, usually in a differentially private way (at a cost in utility). Although such a strategy provides better privacy guarantees than the traditional centralized approach, it requires users to blindly trust a centralized infrastructure that may also become a bottleneck as the number of users grows. In this paper, we design and implement P4L: a privacy-preserving peer-to-peer learning system that lets users participate in an asynchronous, collaborative learning scheme without requiring any sort of infrastructure or relying on differential privacy. Our design uses strong cryptographic primitives to preserve both the confidentiality and the utility of the shared gradients, together with a set of peer-to-peer mechanisms for fault tolerance, user churn, and proximity and cross-device communications. Extensive simulations under different network settings and ML scenarios on three real-life datasets show that P4L delivers performance competitive with the baselines while remaining resilient to different poisoning attacks. We implement P4L, and experimental results show that its performance overhead and power consumption are minimal (less than 3 mAh of discharge).
- Information Technology > Communications (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)
- Information Technology > Data Science > Data Mining > Big Data (0.61)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.46)
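The abstract does not spell out which cryptographic primitives P4L uses to keep shared gradients confidential while preserving their utility. As an illustration only, one common building block for this kind of infrastructureless gradient sharing is additive secret sharing: each gradient is split into random shares that individually reveal nothing, yet sum back to the original value. The sketch below (all names and the fixed-point encoding are our assumptions, not the paper's scheme) shows the idea:

```python
import random

PRIME = 2**61 - 1  # field modulus for share arithmetic
SCALE = 10**6      # fixed-point scaling so float gradients fit in the field

def share(gradient, n_peers):
    """Split a gradient vector into n additive shares.

    Any n-1 shares are uniformly random and reveal nothing about the
    gradient; only the sum of all n shares reconstructs it.
    """
    shares = [[0] * len(gradient) for _ in range(n_peers)]
    for i, g in enumerate(gradient):
        fixed = int(round(g * SCALE)) % PRIME
        parts = [random.randrange(PRIME) for _ in range(n_peers - 1)]
        parts.append((fixed - sum(parts)) % PRIME)  # shares sum to `fixed`
        for p in range(n_peers):
            shares[p][i] = parts[p]
    return shares

def reconstruct(shares):
    """Sum the shares element-wise and map back to signed floats."""
    out = []
    for i in range(len(shares[0])):
        total = sum(s[i] for s in shares) % PRIME
        if total > PRIME // 2:  # interpret large field elements as negatives
            total -= PRIME
        out.append(total / SCALE)
    return out

grad = [0.25, -1.5, 3.0]
print(reconstruct(share(grad, 3)))  # → [0.25, -1.5, 3.0]
```

Because the shares are additively homomorphic, peers can sum their shares locally and reconstruct only the aggregate gradient, which is the property a peer-to-peer aggregation scheme like the one described needs; the paper's actual construction should be consulted for the primitives it really employs.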