
Appendix

Neural Information Processing Systems

Details regarding the datasets used in the experiments are included in Table 2. For Yang et al. [2020], we progressively doubled the number of regions searched, which is the only adjustable hyperparameter. To make this figure, we ran all the experiments (all attacks, datasets, and choices of hyperparameters) on a server with 40 cores of Intel(R) Xeon(R) Gold 6230 CPU @ 2.10GHz. This outcome is seemingly more perplexing than the previous one. We explain it for different values of m, namely the small-m and the large-m regions.





Adversarial Examples for $k$-Nearest Neighbor Classifiers Based on Higher-Order Voronoi Diagrams

Sitawarin, Chawin, Kornaropoulos, Evgenios M., Song, Dawn, Wagner, David

arXiv.org Machine Learning

Adversarial examples are a widely studied phenomenon in machine learning models. While most of the attention has been focused on neural networks, other practical models also suffer from this issue. In this work, we propose an algorithm for evaluating the adversarial robustness of $k$-nearest neighbor classification, i.e., finding a minimum-norm adversarial example. Diverging from previous proposals, we take a geometric approach by performing a search that expands outwards from a given input point. On a high level, the search radius expands to the nearby Voronoi cells until we find a cell that classifies differently from the input point. To scale the algorithm to a large $k$, we introduce approximation steps that find perturbations with smaller norm, compared to the baselines, in a variety of datasets. Furthermore, we analyze the structural properties of a dataset where our approach outperforms the competition.
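To make the search problem concrete, the following is a minimal sketch of the task the paper addresses: finding a small-norm perturbation that flips a k-NN prediction. This is not the paper's Voronoi-cell expansion algorithm (GeoAdEx); it is a naive baseline that, for each differently-labeled training point, binary-searches along the straight segment toward that point for the smallest step that changes the predicted label. All function names and the dataset here are illustrative assumptions.

```python
import numpy as np

def knn_predict(X, y, q, k=1):
    """Predict the label of query q under k-NN with Euclidean distance."""
    d = np.linalg.norm(X - q, axis=1)
    idx = np.argsort(d)[:k]
    vals, counts = np.unique(y[idx], return_counts=True)
    return vals[np.argmax(counts)]

def naive_min_norm_attack(X, y, x, k=1, iters=40):
    """Naive baseline (NOT the paper's algorithm): for each training point z
    with a different label, binary-search along the segment from x to z for
    the smallest step that flips the k-NN label; return the smallest
    perturbation found over all candidates."""
    y0 = knn_predict(X, y, x, k)
    best = None
    for z in X[y != y0]:
        lo, hi = 0.0, 1.0  # at hi = 1 the query coincides with z itself
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if knn_predict(X, y, x + mid * (z - x), k) != y0:
                hi = mid  # label already flipped: try a smaller step
            else:
                lo = mid  # not flipped yet: move further toward z
        delta = hi * (z - x)
        if best is None or np.linalg.norm(delta) < np.linalg.norm(best):
            best = delta
    return best
```

The baseline's candidates are restricted to segments toward training points, so it generally overestimates the true minimum norm; the geometric approach in the paper instead searches the adjacent Voronoi cells directly, which is why it can find perturbations with smaller norm.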