
The hidden work created by artificial intelligence programs

#artificialintelligence

Artificial intelligence is often framed in terms of headline-grabbing technology and dazzling promise. But some of the workers who enable these programs -- the people who do things like code data, flag pictures, or work to integrate the programs into the workplace -- are often overlooked or undervalued. "This is a common pattern in the social studies of technology," said Madeleine Clare Elish, SM '10, a senior research scientist at Google. "A focus on new technology, the latest innovation, comes at the expense of the humans who are working to actually allow that innovation to function in the real world." Speaking at the recent EmTech Digital conference hosted by MIT Technology Review, Elish and other researchers said artificial intelligence programs often fail to account for the humans who incorporate AI systems into existing workflows, the workers doing behind-the-scenes labor to make the programs run, and the people who are negatively affected by AI outcomes.


Moral Crumple Zones: Cautionary Tales in Human-Robot Interaction (pre-print) by M. C. Elish :: SSRN

#artificialintelligence

As debates about the policy and ethical implications of AI systems grow, it will be increasingly important to accurately locate who is responsible when agency is distributed in a system and control over an action is mediated through time and space. Analyzing several high-profile accidents involving complex and automated socio-technical systems and the media coverage that surrounded them, I introduce the concept of a moral crumple zone to describe how responsibility for an action may be misattributed to a human actor who had limited control over the behavior of an automated or autonomous system. Just as the crumple zone in a car is designed to absorb the force of impact in a crash, the human in a highly complex and automated system may become simply a component--accidentally or intentionally--that bears the brunt of the moral and legal responsibilities when the overall system malfunctions. While the crumple zone in a car is meant to protect the human driver, the moral crumple zone protects the integrity of the technological system, at the expense of the nearest human operator. The concept is both a challenge to and an opportunity for the design and regulation of human-robot systems.


Situating Methods in the Magic of Big Data and Artificial Intelligence by M. C. Elish, danah boyd :: SSRN

#artificialintelligence

"Big Data" and "artificial intelligence" have captured the public imagination and are profoundly shaping social, economic, and political spheres. Through an interrogation of the histories, perceptions, and practices that shape these technologies, we problematize the myths that animate the supposed "magic" of these systems. In the face of an increasingly widespread blind faith in data-driven technologies, we argue for grounding machine learning-based practices and untethering them from hype and fear cycles. One path forward is to develop a rich methodological framework for addressing the strengths and weaknesses of data analysis in practice. By provocatively reimagining machine learning as computational ethnography, we invite practitioners to prioritize methodological reflection and to recognize that all knowledge work is situated practice.