Review for NeurIPS paper: Election Coding for Distributed Learning: Protecting SignSGD against Byzantine Attacks

Neural Information Processing Systems 

Summary and Contributions: This paper addresses the problem of designing first-order optimization methods that are both communication-efficient and robust to Byzantine workers. In particular, the paper focuses on an existing variant of SignSGD, namely SignSGD with majority voting (SignSGD-MV), which is already communication-efficient by design. The paper proposes a new coding-theoretic approach to make SignSGD-MV robust to Byzantine workers. In the standard SignSGD-MV method, each of the n workers computes a gradient estimate based on the data partition assigned to it and sends the sign of that gradient estimate to the master node. The master node takes the coordinate-wise majority of the signed gradient estimates received from all the workers to obtain the final signed gradient estimate.
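For concreteness, the SignSGD-MV aggregation step described above can be sketched as follows. This is a minimal illustration, not code from the paper; the function name `signsgd_mv_aggregate` and the example gradients are hypothetical.

```python
import numpy as np

def signsgd_mv_aggregate(gradients):
    """Coordinate-wise majority vote over workers' sign vectors.

    gradients: list of per-worker gradient estimates (1-D arrays of equal length).
    Returns the aggregated sign vector the master uses as the update direction.
    """
    # Each worker transmits only the sign of its local gradient estimate
    # (1 bit per coordinate), which is what makes the scheme communication-efficient.
    signs = np.sign(np.stack(gradients))      # shape: (n_workers, d)
    # The master takes the coordinate-wise majority of the received signs.
    return np.sign(signs.sum(axis=0))

# Hypothetical example with n = 3 workers and d = 4 coordinates:
g = [np.array([0.5, -1.0,  2.0, -0.1]),
     np.array([0.2,  1.0, -3.0, -0.4]),
     np.array([-0.7, -2.0, 1.0, -0.2])]
print(signsgd_mv_aggregate(g))  # majority sign per coordinate: [ 1. -1.  1. -1.]
```

A single Byzantine worker that flips its signs can change the outcome of any coordinate where the honest majority is slim, which is the vulnerability the paper's coding scheme targets.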