Review for NeurIPS paper: Watch out! Motion is Blurring the Vision of Your Deep Neural Networks

Neural Information Processing Systems 

This paper presents a novel adversarial attack method based on motion blur. The method generates visually natural motion-blurred images that can fool DNNs for visual recognition. The paper is well written, and the proposed methods are convincing. One reviewer is convinced of the paper's merits and suggests a clear acceptance. The second and third reviewers consider the paper above the acceptance threshold, finding the problem very interesting and the approach clear.