Adversarial Attacks for Drift Detection
Fabian Hinder, Valerie Vaquet, Barbara Hammer
Data from the real world is often subject to continuous changes known as concept drift [1, 2, 3]. Such drift can be caused by seasonal changes, changing demands, the aging of sensors, and similar effects. Concept drift not only poses a problem for maintaining high performance in learning models [2, 3] but also plays a crucial role in system monitoring [1]. In the latter case, detecting concept drift enables the identification of anomalous behavior. Examples include machine malfunctions or failures, network security, environmental changes, and critical infrastructure, where such behavior is identified by detecting irregular shifts [4, 1, 5]. In these contexts, the ability to detect drift robustly is essential. Beyond problems such as noise and sampling error, which challenge all statistical methods, drift detection faces a particular difficulty when the drift follows patterns that evade detection. In this work, we study such drifts, which we refer to as "drift adversarials". Analogous to adversarial attacks, drift adversarials exploit weaknesses in the detection methods and thus allow significant concept drift to occur without triggering an alarm, posing a major problem for monitoring systems.
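To make the idea concrete, the following is a minimal sketch (not the construction from the paper) of how an adversarial drift can evade a standard window-based detector: a Kolmogorov-Smirnov test comparing adjacent windows never sees a significant local shift, even though the stream's mean moves by roughly one standard deviation overall. All names and parameters here (`drifting_stream`, `window`, `alpha`, `step`) are illustrative assumptions.

```python
# Minimal sketch: a two-window Kolmogorov-Smirnov drift detector and a
# slow drift whose per-window shift stays below the detection threshold
# while the cumulative shift is large. Parameters are illustrative, not
# taken from the paper.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
window = 200      # samples per window (assumed)
alpha = 0.01      # significance level of the KS test (assumed)
n_windows = 50
step = 0.02       # per-window mean shift, chosen small enough to evade the test


def drifting_stream(step, n_windows, window, rng):
    """Yield consecutive windows of a stream whose mean creeps upward."""
    for i in range(n_windows):
        yield rng.normal(loc=i * step, scale=1.0, size=window)


alarms = 0
prev = None
for cur in drifting_stream(step, n_windows, window, rng):
    if prev is not None:
        # Compare adjacent windows only -- the weakness being exploited:
        # each local comparison looks stationary.
        _, p = ks_2samp(prev, cur)
        if p < alpha:
            alarms += 1
    prev = cur

total_shift = (n_windows - 1) * step
print(f"total mean shift: {total_shift:.2f}, alarms raised: {alarms}")
# Typically prints ~0 alarms despite a cumulative shift of ~1 std. dev.
```

The weakness exploited in this toy setup is locality: each pairwise comparison is statistically indistinguishable from stationarity, so a slow cumulative drift passes unnoticed even though the overall distribution has changed substantially.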
Nov-25-2024