OpenOOD: Benchmarking Generalized Out-of-Distribution Detection

Jingkang Yang, Pengyun Wang, Dejian Zou, Zitang Zhou, Kunyuan Ding, Wenxuan Peng, Haoqi Wang, Guangyao Chen, Bo Li, Yiyou Sun, Xuefeng Du, Kaiyang Zhou, Wayne Zhang, Dan Hendrycks, Yixuan Li, Ziwei Liu

arXiv.org Artificial Intelligence 

Out-of-distribution (OOD) detection is vital to safety-critical machine learning applications and has thus been extensively studied, with a plethora of methods developed in the literature. However, the field currently lacks a unified, strictly formulated, and comprehensive benchmark, which often results in unfair comparisons and inconclusive results. From the problem-setting perspective, OOD detection is closely related to neighboring fields including anomaly detection (AD), open set recognition (OSR), and model uncertainty, since methods developed for one domain are often applicable to the others. To help the community improve evaluation and advance the field, we build a unified, well-structured codebase called OpenOOD, which implements over 30 methods developed in relevant fields and provides a comprehensive benchmark under the recently proposed generalized OOD detection framework. Through a comprehensive comparison of these methods, we are gratified to find that the field has progressed significantly over the past few years, with both preprocessing methods and orthogonal post-hoc methods showing strong potential. We invite readers to use our OpenOOD codebase to develop new methods and contribute. The full experimental results are available in this table.
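To give a concrete sense of the post-hoc family the benchmark covers, the sketch below implements the classic maximum softmax probability (MSP) baseline in plain NumPy. This is an illustrative sketch only, not OpenOOD's actual API: the toy logits and the 5th-percentile thresholding rule are invented here for demonstration.

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over the last axis."""
    shifted = logits - logits.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)

def msp_score(logits: np.ndarray) -> np.ndarray:
    """Maximum softmax probability: higher means more in-distribution (ID)."""
    return softmax(logits).max(axis=-1)

# Toy classifier logits (hypothetical values for illustration):
# confident predictions suggest ID; near-uniform logits suggest OOD.
id_logits = np.array([[8.0, 0.5, 0.2], [7.5, 1.0, 0.3]])
ood_logits = np.array([[1.1, 0.9, 1.0], [0.8, 1.2, 1.0]])

scores_id = msp_score(id_logits)    # close to 1.0
scores_ood = msp_score(ood_logits)  # closer to 1/num_classes

# Flag inputs whose score falls below a threshold chosen on held-out ID data,
# e.g. the 5th percentile of ID scores (roughly 95% true-positive rate on ID).
threshold = np.percentile(scores_id, 5)
print("ID scores: ", scores_id)
print("OOD scores:", scores_ood)
print("Flagged as OOD:", scores_ood < threshold)
```

Post-hoc methods like this one require no retraining, which is why they compose naturally with the preprocessing methods the abstract highlights: the two families intervene at different stages of the pipeline.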
