Suspicion Machines

The researchers found that the algorithm Rotterdam uses to investigate some of its 30,000 welfare recipients discriminates on the basis of ethnicity, age, gender, and parenthood. Governments around the world are experimenting with predictive algorithms in ways that are largely invisible to the public. What limited reporting exists on this topic has focused mainly on predictive policing and risk assessments in criminal justice systems. But in one area, even more far-reaching experiments are underway on vulnerable populations with almost no scrutiny: fraud detection systems, ranging from complex machine learning models to crude spreadsheets, are widely deployed across welfare states.
