Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers
Google and startups like Qure.ai, Aidoc, and DarwinAI are developing AI and machine learning systems that classify chest X-rays to help identify conditions like fractures and collapsed lungs. Several hospitals, including Mount Sinai, have piloted computer vision algorithms that analyze scans from patients with the novel coronavirus. But research from the University of Toronto, the Vector Institute, and MIT reveals that the chest X-ray datasets used to train diagnostic models are imbalanced, biasing the models against certain gender, socioeconomic, and racial groups.

Partly because of a reluctance to release code, datasets, and techniques, much of the data used today to train AI algorithms for diagnosing diseases may perpetuate inequalities. A team of U.K. scientists, for example, found that almost all eye disease datasets come from patients in North America, Europe, and China, meaning algorithms that diagnose eye disease are less certain to work well for racial groups from underrepresented countries.
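The kind of disparity the researchers describe is commonly surfaced by comparing a classifier's true-positive rate across demographic subgroups. Below is a minimal, hypothetical sketch of such an audit; the function name and toy data are illustrative assumptions, not the study's actual methodology or dataset.

```python
from collections import Counter

def subgroup_tpr(labels, groups, preds):
    """Per-group true-positive rate for a binary classifier.

    labels, groups, preds are parallel lists: ground-truth label (0/1),
    a demographic attribute (e.g. sex), and the model's prediction (0/1).
    A large TPR gap between groups is one signal of the bias described above.
    """
    tp = Counter()   # true positives per group
    pos = Counter()  # actual positives per group
    for y, g, p in zip(labels, groups, preds):
        if y == 1:
            pos[g] += 1
            if p == 1:
                tp[g] += 1
    return {g: tp[g] / pos[g] for g in pos}

# Toy data (hypothetical): the model misses more positive cases in group "F".
labels = [1, 1, 1, 1, 1, 1, 0, 0]
groups = ["M", "M", "M", "F", "F", "F", "M", "F"]
preds  = [1, 1, 1, 1, 0, 0, 0, 0]
rates = subgroup_tpr(labels, groups, preds)
# rates["M"] is 1.0 while rates["F"] is about 0.33: a large TPR gap
```

In practice such audits are run on held-out test sets with patient demographics joined from hospital records; the gap, not the absolute rate, is the warning sign.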
Oct-21-2020, 18:45:21 GMT