The Download: inaccurate welfare algorithms, and training AI for free

MIT Technology Review 

The news: An algorithm funded by the World Bank to determine which families in Jordan should receive financial assistance likely excludes people who should qualify, an investigation by Human Rights Watch has found.

Why it matters: The organization identified several fundamental problems with the algorithmic system that result in bias and inaccuracies. It ranks families applying for aid from least poor to poorest using a secret calculus that assigns weights to 57 socioeconomic indicators. Applicants say the calculus does not reflect reality and oversimplifies people's economic situations.

The bigger picture: AI ethics researchers are calling for more scrutiny of the increasing use of algorithms in welfare systems.
