Buffer is All You Need: Defending Federated Learning against Backdoor Attacks under Non-iids via Buffering

Lyu, Xingyu, Wang, Ning, Xiao, Yang, Li, Shixiong, Li, Tao, Chen, Danjue, Chen, Yimin

arXiv.org Artificial Intelligence

Abstract: Federated Learning (FL) is a popular paradigm that enables clients to jointly train a global model without sharing raw data. However, FL is known to be vulnerable to backdoor attacks due to its distributed nature. We propose FLBuff to defend against backdoor attacks even under non-iid data distributions. The main challenge for such defenses is that non-iids bring benign and malicious updates closer together in representation space, making them harder to separate. FLBuff is inspired by our insight that non-iids can be modeled as omni-directional expansion in representation space, whereas backdoor attacks are uni-directional. Comprehensive evaluations demonstrate that FLBuff consistently outperforms state-of-the-art defenses.
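
To make the geometric intuition concrete, below is a minimal toy sketch in Python. It is not the FLBuff implementation; all names, dimensions, and parameters are illustrative assumptions. It only shows why omni-directional (benign, non-iid) updates and uni-directional (backdoor) updates behave differently in representation space: benign updates point in scattered directions and have low average pairwise cosine similarity, while backdoor updates cluster around one direction and have high similarity.

import numpy as np

# Toy illustration (assumed setup, not FLBuff): benign client updates spread
# omni-directionally under non-iids, backdoor updates drift uni-directionally.
rng = np.random.default_rng(0)
dim = 64                      # assumed dimensionality of the representation space
n_benign, n_malicious = 20, 5

# Benign updates: random directions around the origin (omni-directional expansion).
benign = rng.normal(size=(n_benign, dim))

# Malicious updates: clustered around one fixed backdoor direction (uni-directional).
backdoor_dir = rng.normal(size=dim)
malicious = backdoor_dir + 0.1 * rng.normal(size=(n_malicious, dim))

def mean_pairwise_cosine(x):
    # Average pairwise cosine similarity among row vectors.
    x = x / np.linalg.norm(x, axis=1, keepdims=True)
    sims = x @ x.T
    iu = np.triu_indices(len(x), k=1)
    return sims[iu].mean()

print(f"benign    mean pairwise cosine: {mean_pairwise_cosine(benign):.3f}")    # near 0
print(f"malicious mean pairwise cosine: {mean_pairwise_cosine(malicious):.3f}")  # near 1

A defense could exploit this contrast to separate the two populations; how FLBuff actually does so via buffering is described in the paper itself, not in this sketch.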