Detecting and Monitoring Bias for Subgroups in Breast Cancer Detection AI
Kundu, Amit Kumar, Doo, Florence X., Patil, Vaishnavi, Varshney, Amitabh, Jaja, Joseph
arXiv.org Artificial Intelligence
Early breast cancer detection (BCD) through mammography screening continues to be a major focus in radiology, as it plays a critical role in reducing mortality rates (Coleman, 2017; Ginsburg et al., 2020). Although artificial intelligence (AI) models can help radiologists evaluate mammograms (Sahu et al., 2023; Evans et al., 2013; Maxwell, 1999), training such models faces the challenge of limited datasets that may not fully represent all subgroups or cover variations in data distributions. Historically, certain racial groups have faced barriers to healthcare access because of many socio-economic factors (Azin et al., 2023; Hershman et al., 2005; Hussain-Gambles et al., 2004). This lack of access can result in datasets that do not adequately represent these groups, potentially causing AI models to exhibit biases against them. Even with seemingly balanced datasets, subtle biases may persist in the collected data due to systemic inequalities in the quality of healthcare (Obermeyer et al., 2019). Among these groups, African American patients are often underrepresented in both breast imaging and broader healthcare datasets (Yedjou et al., 2019; Newman and Kaljee, 2017).
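One common way to quantify the kind of subgroup bias described above is to compare a classifier's sensitivity (true positive rate) across patient subgroups. The sketch below is purely illustrative and is not the paper's method: the helper names, the subgroup labels, and the synthetic data are all assumptions introduced for this example.

```python
# Illustrative sketch (not the paper's method): measuring a simple
# subgroup sensitivity gap for a binary cancer-detection classifier.
# All data below are synthetic placeholders.

def true_positive_rate(y_true, y_pred):
    """Sensitivity: fraction of actual positives correctly flagged."""
    positives = [p for t, p in zip(y_true, y_pred) if t == 1]
    return sum(positives) / len(positives) if positives else 0.0

def subgroup_tpr_gap(y_true, y_pred, groups):
    """Per-subgroup sensitivity and the largest pairwise gap."""
    rates = {}
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        rates[g] = true_positive_rate([y_true[i] for i in idx],
                                      [y_pred[i] for i in idx])
    return rates, max(rates.values()) - min(rates.values())

# Synthetic example: two hypothetical subgroups, "A" and "B".
y_true = [1, 1, 0, 1, 1, 0, 1, 1]
y_pred = [1, 0, 0, 1, 1, 0, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

rates, gap = subgroup_tpr_gap(y_true, y_pred, groups)
# A large gap indicates the model misses cancers more often in one
# subgroup than another, even if overall accuracy looks acceptable.
```

A monitoring pipeline could compute such a gap on each evaluation batch and flag drift when it exceeds a chosen threshold.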
Feb-14-2025