Inside a Misfiring Government Data Machine
Last week, WIRED published a series of in-depth, data-driven stories about a problematic algorithm the Dutch city of Rotterdam deployed with the aim of rooting out benefits fraud. In partnership with Lighthouse Reports, a European organization that specializes in investigative journalism, WIRED gained access to the inner workings of the algorithm under freedom-of-information laws and explored how it evaluates who is most likely to commit fraud. We found that the algorithm discriminates based on ethnicity and gender, unfairly giving women and minorities higher risk scores, which can lead to investigations that cause significant damage to claimants' personal lives.

An interactive article digs into the guts of the algorithm, taking you through two hypothetical examples to show that while race and gender are not among the factors fed into the algorithm, other data, such as a person's Dutch language proficiency, can act as a proxy that enables discrimination. The project shows how algorithms designed to make governments more efficient, and which are often heralded as fairer and more data-driven, can covertly amplify societal biases.
March 16, 2023