The Download: inaccurate welfare algorithms, and training AI for free
The news: An algorithm funded by the World Bank to determine which families should get financial assistance in Jordan likely excludes people who should qualify, an investigation from Human Rights Watch has found.

Why it matters: The organization identified several fundamental problems with the algorithmic system that resulted in bias and inaccuracies. It ranks families applying for aid from least poor to poorest using a secret calculus that assigns weights to 57 socioeconomic indicators. Applicants say the calculus does not reflect reality and oversimplifies people's economic situations.

The bigger picture: AI ethics researchers are calling for more scrutiny of the increasing use of algorithms in welfare systems.
Jun-13-2023, 12:10:00 GMT