Class Unbiasing for Generalization in Medical Diagnosis
Zuo, Lishi, Mak, Man-Wai, Yi, Lu, Tu, Youzhi
–arXiv.org Artificial Intelligence
Lishi Zuo, Man-Wai Mak, Lu Yi, and Youzhi Tu
Dept. of Electrical and Electronic Engineering, Hong Kong Polytechnic University, Hong Kong SAR, China
E-mail: lishi.zuo@connect.polyu.hk

Abstract -- Medical diagnosis might fail due to bias. In this work, we identify class-feature bias: a model's reliance on features that are strongly correlated with only a subset of classes, leading to biased performance and poor generalization on the remaining classes. We aim to train a class-unbiased model (Cls-unbias) that mitigates class imbalance and class-feature bias simultaneously. Specifically, we propose a class-wise inequality loss that promotes equal contributions to the classification loss from positive-class and negative-class samples. To enhance the effectiveness of the inequality loss under class imbalance, we optimize a class-wise group distributionally robust optimization (group-DRO) objective, a class-weighted training objective that up-weights underperforming classes. On both synthetic and real-world datasets, we empirically demonstrate that class-feature bias degrades model performance and that the proposed method effectively mitigates both class-feature bias and class imbalance, thereby improving the model's generalization ability.
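The two objectives named in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the standard exponentiated-gradient form of group DRO for the class re-weighting, and an absolute-difference penalty between per-class positive- and negative-sample losses for the inequality loss. The function names and the step size `eta` are illustrative choices.

```python
import math

def group_dro_step(class_losses, weights, eta=0.1):
    """One class-wise group-DRO re-weighting step.

    Exponentiated-gradient update: classes with a higher current loss
    receive a larger weight, so underperforming classes are up-weighted.
    `eta` is an assumed step size.
    """
    w = [wc * math.exp(eta * lc) for wc, lc in zip(weights, class_losses)]
    total = sum(w)
    return [wc / total for wc in w]

def inequality_loss(pos_losses, neg_losses):
    """Assumed form of the class-wise inequality loss.

    For each class, `pos_losses[c]` is the mean classification loss over
    that class's positive samples and `neg_losses[c]` over its negative
    samples; the penalty drives the two contributions toward equality.
    """
    return sum(abs(p - n) for p, n in zip(pos_losses, neg_losses)) / len(pos_losses)

# Example: class 0 currently has a much higher loss than class 1,
# so the DRO step shifts weight toward class 0.
w = group_dro_step([1.0, 0.1], [0.5, 0.5])
```

In a full training loop, the re-weighted classification loss and the inequality penalty would be combined (e.g. summed with a trade-off coefficient) and minimized jointly; that combination is a design choice not specified by the abstract.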
Sep-3-2025
- Country:
- Asia > China
- Gansu Province > Lanzhou (0.04)
- Hong Kong (0.44)
- Genre:
- Research Report > New Finding (0.93)
- Industry:
- Health & Medicine
- Diagnostic Medicine > Imaging (0.46)
- Therapeutic Area > Oncology (0.46)