Revisiting Adversarial Patches for Designing Camera-Agnostic Attacks against Person Detection
Hui Wei, Zhixiang Wang
Neural Information Processing Systems
Physical adversarial attacks can deceive deep neural networks (DNNs), leading to erroneous predictions in real-world scenarios. To uncover potential security risks, attacks on the safety-critical task of person detection have garnered significant attention. However, we observe that existing attack methods overlook the pivotal role of the camera, which captures real-world scenes and converts them into digital images, in the physical adversarial attack workflow. This oversight makes these attacks unstable and hard to reproduce. In this work, we revisit patch-based attacks against person detectors and introduce a camera-agnostic physical adversarial attack to mitigate this limitation. Specifically, we construct a differentiable camera Image Signal Processing (ISP) proxy network to compensate for the physical-to-digital transition gap.
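The abstract does not spell out the proxy's architecture, but the transformation it approximates is a camera ISP's RAW-to-sRGB mapping. As a rough illustration only, the sketch below implements a fixed, differentiable-friendly ISP pipeline (white balance, color correction, gamma) in NumPy; all stage parameters (`wb_gains`, the color matrix, the gamma exponent) are hypothetical values chosen for the example, not taken from the paper, whose proxy is a learned network.

```python
import numpy as np

def isp_proxy(raw, wb_gains=(1.8, 1.0, 1.6), gamma=1.0 / 2.2):
    """Toy stand-in for a camera ISP proxy.

    Maps a demosaiced RAW image (H x W x 3, floats in [0, 1]) to an
    sRGB-like output via three smooth (hence differentiable) stages.
    The actual method learns this mapping with a network; this fixed
    pipeline only illustrates the kind of transform being approximated.
    """
    img = raw * np.asarray(wb_gains)      # per-channel white balance
    ccm = np.array([[ 1.6, -0.4, -0.2],   # hypothetical 3x3 color
                    [-0.3,  1.5, -0.2],   # correction matrix (rows
                    [-0.1, -0.4,  1.5]])  # roughly sum to 1.0)
    img = img @ ccm.T                     # color correction
    img = np.clip(img, 0.0, 1.0)          # keep values displayable
    img = img ** gamma                    # gamma compression
    return img
```

In an attack pipeline of this kind, such a proxy would sit between the rendered physical patch and the detector, so gradients can flow from the detection loss back through the simulated camera to the patch pixels.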
May-28-2025, 11:23:22 GMT
- Country:
- Asia > China (0.14)
- Europe > Germany (0.14)
- North America
- Canada (0.14)
- United States (0.14)
- Genre:
- Research Report > Experimental Study (0.93)
- Industry:
- Information Technology > Security & Privacy (1.00)