Evading Real-Time Person Detectors by Adversarial T-shirt

Kaidi Xu, Gaoyuan Zhang, Sijia Liu, Quanfu Fan, Mengshu Sun, Hongge Chen, Pin-Yu Chen, Yanzhi Wang, Xue Lin

arXiv.org Artificial Intelligence 

It is known that deep neural networks (DNNs) can be vulnerable to adversarial attacks. So-called physical adversarial examples deceive DNN-based decision makers by attaching adversarial patches to real objects. However, most existing work on physical adversarial attacks focuses on static objects such as glasses frames, stop signs, and images attached to cardboard. In this work, we propose the adversarial T-shirt, a robust physical adversarial example for evading person detectors even as it deforms with a moving person's pose changes. To the best of our knowledge, this is the first work to model the effect of deformation when designing physical adversarial examples for non-rigid objects such as T-shirts. We show that the proposed method achieves 79% and 63% attack success rates in the digital and physical worlds, respectively, against YOLOv2. In contrast, the state-of-the-art physical attack method for fooling a person detector achieves only a 27% attack success rate. Furthermore, by leveraging min-max optimization, we extend our method to the ensemble attack setting against the object detectors YOLOv2 and Faster R-CNN simultaneously.
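As a rough illustration of the ensemble setting mentioned above, the min-max attack can be phrased as a saddle-point problem over the adversarial perturbation \delta and a vector of detector weights w; the per-detector losses \phi_i and the probability-simplex constraint below are a sketch based on standard min-max ensemble attacks, not details stated in this abstract:

\min_{\delta} \; \max_{w \in \mathcal{P}} \; \sum_{i=1}^{K} w_i \, \phi_i(\delta),
\qquad \mathcal{P} = \{\, w : \mathbf{1}^\top w = 1,\; w \ge 0 \,\},

where K = 2 here (YOLOv2 and Faster R-CNN) and \phi_i(\delta) is the attack loss against detector i. The inner maximization adaptively up-weights whichever detector is currently harder to fool, so the optimized patch does not overfit to a single model.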
