HO-FMN: Hyperparameter Optimization for Fast Minimum-Norm Attacks
Mura, Raffaele, Floris, Giuseppe, Scionis, Luca, Piras, Giorgio, Pintor, Maura, Demontis, Ambra, Giacinto, Giorgio, Biggio, Battista, Roli, Fabio
Gradient-based attacks are a primary tool to evaluate the robustness of machine-learning models. However, many attacks tend to provide overly optimistic evaluations, as they use fixed loss functions, optimizers, step-size schedulers, and default hyperparameters. In this work, we tackle these limitations by proposing a parametric variation of the well-known fast minimum-norm attack algorithm, whose loss, optimizer, step-size scheduler, and hyperparameters can be dynamically adjusted. We re-evaluate 12 robust models, showing that our attack finds smaller adversarial perturbations without requiring any additional tuning. This also enables reporting adversarial robustness as a function of the perturbation budget, providing a more complete evaluation than that offered by fixed-budget attacks, while remaining efficient. We release our open-source code at https://github.com/pralab/HO-FMN.
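The parametric design described in the abstract can be pictured with a short sketch. The following is a minimal illustration, not the authors' implementation: the names run_attack, the cfg dictionary, and the component registries are hypothetical, and the norm-minimization and projection machinery of the real FMN attack is omitted. It only shows how the loss, optimizer, and step-size scheduler become plug-in choices rather than fixed defaults.

import torch
import torch.nn.functional as F

def logit_diff_loss(logits, labels):
    # True-class logit minus the highest other logit; minimizing this
    # drives the sample across the decision boundary.
    true = logits.gather(1, labels.unsqueeze(1)).squeeze(1)
    masked = logits.scatter(1, labels.unsqueeze(1), float("-inf"))
    return (true - masked.max(dim=1).values).sum()

# Both losses are written so that *minimizing* them pushes toward
# misclassification.
LOSSES = {
    "neg_ce": lambda logits, y: -F.cross_entropy(logits, y),
    "logit_diff": logit_diff_loss,
}
OPTIMIZERS = {"sgd": torch.optim.SGD, "adam": torch.optim.Adam}
SCHEDULERS = {"cosine": torch.optim.lr_scheduler.CosineAnnealingLR}

def run_attack(model, x, y, cfg, steps=100):
    # Optimize a perturbation delta with the configured components.
    delta = torch.zeros_like(x, requires_grad=True)
    opt = OPTIMIZERS[cfg["optimizer"]]([delta], lr=cfg["step_size"])
    sched = SCHEDULERS[cfg["scheduler"]](opt, T_max=steps)
    loss_fn = LOSSES[cfg["loss"]]
    for _ in range(steps):
        opt.zero_grad()
        logits = model(torch.clamp(x + delta, 0, 1))
        loss_fn(logits, y).backward()
        opt.step()
        sched.step()
    return torch.clamp(x + delta, 0, 1).detach()

Because every component is looked up from a registry and parameterized by cfg, the whole attack configuration becomes a single object that a hyperparameter optimizer can search over, which is the premise of the work above.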
Improving Fast Minimum-Norm Attacks with Hyperparameter Optimization
Floris, Giuseppe, Mura, Raffaele, Scionis, Luca, Piras, Giorgio, Pintor, Maura, Demontis, Ambra, Biggio, Battista
Evaluating the adversarial robustness of machine-learning models using gradient-based attacks is challenging. In this work, we show that hyperparameter optimization can improve fast minimum-norm attacks by automating the selection of the loss function, the optimizer, and the step-size scheduler, along with the corresponding hyperparameters. Our extensive evaluation on several robust models demonstrates the improved efficacy of fast minimum-norm attacks when combined with hyperparameter optimization. We release our open-source code at https://github.com/pralab/HO-FMN.
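The automated selection itself can be driven by any black-box tuner. Below is an illustrative sketch using Optuna, which is an assumed choice here, not necessarily the search procedure used in HO-FMN. Each trial samples a configuration for the hypothetical run_attack from the sketch above and scores it by the median L2 norm of the perturbations it finds, since a stronger minimum-norm attack yields smaller perturbations; model, x_val, and y_val are assumed to be a classifier and a held-out validation batch.

import optuna

def objective(trial):
    # Sample one attack configuration per trial.
    cfg = {
        "loss": trial.suggest_categorical("loss", ["neg_ce", "logit_diff"]),
        "optimizer": trial.suggest_categorical("optimizer", ["sgd", "adam"]),
        "scheduler": "cosine",
        "step_size": trial.suggest_float("step_size", 1e-3, 1.0, log=True),
    }
    x_adv = run_attack(model, x_val, y_val, cfg)      # sketch from above
    norms = (x_adv - x_val).flatten(1).norm(dim=1)    # per-sample L2 norms
    return norms.median().item()                      # smaller = stronger attack

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)

Minimizing the median perturbation norm is one reasonable objective for a minimum-norm attack; a refinement would be to score only samples the attack actually misclassifies.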