Scientists voice concerns, call for transparency and reproducibility in AI research
In an article published in Nature on October 14, 2020, scientists at Princess Margaret Cancer Centre, University of Toronto, Stanford University, Johns Hopkins, Harvard School of Public Health, the Massachusetts Institute of Technology, and other institutions challenge scientific journals to hold computational researchers to higher standards of transparency, and call on their colleagues to share their code, models, and computational environments in publications.

"Scientific progress depends on the ability of researchers to scrutinize the results of a study and reproduce the main finding to learn from," says Dr. Benjamin Haibe-Kains, Senior Scientist at Princess Margaret Cancer Centre and first author of the article. "But in computational research, it's not yet a widespread criterion for the details of an AI study to be fully accessible. This is detrimental to our progress."

The authors voiced their concern about the lack of transparency and reproducibility in AI research after a Google Health study by McKinney et al., published in a prominent scientific journal in January 2020, claimed that an artificial intelligence (AI) system could outperform human radiologists in both robustness and speed in breast cancer screening.