In May 2010, prompted by a series of high-profile scandals, the mayor of New Orleans asked the US Department of Justice to investigate the city's police department (NOPD). Ten months later, the DOJ delivered its blistering analysis: during the period under review, from 2005 onward, the NOPD had repeatedly violated constitutional and federal law. It used excessive force, disproportionately against black residents; targeted racial minorities, non-native English speakers, and LGBTQ individuals; and failed to address violence against women. The problems, said assistant attorney general Thomas Perez at the time, were "serious, wide-ranging, systemic and deeply rooted within the culture of the department." Despite these disturbing findings, only a year later the city entered a secret partnership with data-mining firm Palantir to deploy a predictive policing system.
It's no secret by now that artificial intelligence has a white guy problem. One could say the same of almost any industry, but the tech world is singular in how rapidly it shapes the future. As has been widely publicized, the unconscious biases of white developers proliferate on the internet, mapping our social structures and behaviors onto code and reproducing the imbalances and injustices of the real world. There was the case of black people being labeled as gorillas by an image-recognition system; the computer system that rejected an Asian man's passport photo because it read his eyes as closed; and the controversy surrounding predictive policing algorithms deployed in cities like Chicago and New Orleans, which let police officers pinpoint individuals the software deems predisposed to crime, giving rise to accusations of profiling. Earlier this year, the release of Google's Arts and Culture app, which lets users match their faces with historical paintings, produced less than nuanced results for Asian and African-American users alike.