Reimagining Anomalies: What If Anomalies Were Normal?
Liznerski, Philipp, Varshneya, Saurabh, Calikus, Ece, Fellenz, Sophie, Kloft, Marius
Deep learning-based methods have achieved a breakthrough in image anomaly detection, but their complexity makes it considerably harder to understand why an instance is predicted to be anomalous. We introduce a novel explanation method that generates multiple counterfactual examples for each anomaly, capturing diverse concepts of anomalousness. A counterfactual example is a modification of the anomaly that is perceived as normal by the anomaly detector. The method provides a high-level semantic explanation of the mechanism that triggered the anomaly detector, allowing users to explore "what-if scenarios." Qualitative and quantitative analyses across various image datasets show that the method, applied to state-of-the-art anomaly detectors, yields high-quality semantic explanations.
Feb-22-2024
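To illustrate the idea of a counterfactual example described in the abstract, the following is a minimal sketch of one common approach: gradient descent that nudges an anomalous input toward a nearby point the detector scores as normal. The toy distance-to-centroid detector, the regularization weight `lam`, and the threshold are illustrative assumptions, not the paper's actual method or detector.

```python
import numpy as np

# Toy anomaly detector: score = squared distance to the normal-data centroid.
# This is an illustrative stand-in, NOT the paper's detector.
centroid = np.array([0.0, 0.0])

def anomaly_score(x):
    return float(np.sum((x - centroid) ** 2))

def counterfactual(x, lam=0.1, lr=0.1, steps=200, threshold=0.5):
    """Descend on score(z) + lam * ||z - x||^2 until z looks normal.

    The penalty term keeps the counterfactual close to the original
    anomaly, so the change explains what made the input anomalous.
    """
    x = np.asarray(x, dtype=float)
    z = x.copy()
    for _ in range(steps):
        # Gradient of score(z) + lam * ||z - x||^2 (closed form for this toy)
        grad = 2 * (z - centroid) + 2 * lam * (z - x)
        z -= lr * grad
        if anomaly_score(z) < threshold:
            break
    return z

anomaly = np.array([4.0, 3.0])   # an input flagged as anomalous
cf = counterfactual(anomaly)
print(anomaly_score(anomaly))    # high score: detector flags it
print(anomaly_score(cf))         # below threshold: the "what-if" normal version
```

Inspecting the difference `cf - anomaly` then shows which features had to change for the detector to accept the input, which is the "what-if scenario" the paper's explanations expose at a semantic level.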