How Cities Are Reining in Out-of-Control Policing Tech

Slate

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society. Since the turn of the 21st century, local law enforcement departments have stocked up on unprecedentedly invasive surveillance tech for monitoring their communities, with little to no oversight. But a counterrevolution is brewing. On May 1, the Oakland, California, City Council unanimously adopted the Surveillance and Community Safety Ordinance, the nation's strongest law governing how police acquire and use surveillance technologies. While Oakland may be at the head of the pack here, it isn't alone: In the past few years, local governments across the country have become laboratories for developing new approaches to reining in policing tech, and the solutions they've been cooking up are starting to spread to more cities and towns that are concerned about overreaching or secret police surveillance of their communities.


Australia needs to face up to the dangers of facial recognition technology

#artificialintelligence

In the 20 years of the "war on terror", Australia has led from the front in expanding law enforcement powers and ramping up surveillance at the expense of public rights and freedoms. Amid the seemingly endless barrage of national security legislation and surveillance that creeps into every aspect of our personal lives, more and more of our public spaces have been smothered by surveillance cameras and facial recognition technology. Corporations large and small, towns and cities, and federal and state government departments and agencies have deployed these systems, snooping on us all wherever we go without any of us getting a say. State and federal law enforcement officers are accessing these technologies without any oversight. As anti-police protests spread around the world, tools and processes that exacerbate racist bias within law enforcement and judicial systems, along with the wasteful spending and abuses of power that come with them, have fallen under renewed scrutiny. Once again, Australia is lagging behind the debate.


An Algorithmic Equity Toolkit for Technology Audits by Community Advocates and Activists

arXiv.org Artificial Intelligence

A wave of recent scholarship documenting the discriminatory harms of algorithmic systems has spurred widespread interest in algorithmic accountability and regulation. Yet effective accountability and regulation are stymied by a persistent lack of resources supporting public understanding of algorithms and artificial intelligence. Through interactions with a US-based civil rights organization and their coalition of community organizations, we identify a need for (i) heuristics that aid stakeholders in distinguishing between types of analytic and information systems in lay language, and (ii) risk assessment tools for such systems that begin by making algorithms more legible. The present work delivers a toolkit to achieve these aims. This paper both presents the Algorithmic Equity Toolkit (AEKit) as an artifact and details how our participatory process shaped its design. Our work fits within human-computer interaction scholarship as a demonstration of the value of HCI methods and approaches to problems in the area of algorithmic transparency and accountability.


Odd Numbers -- Real Life

#artificialintelligence

Algorithms increasingly govern our social world, transforming data into scores or rankings that decide who gets credit, jobs, dates, policing, and much more. The field of "algorithmic accountability" has arisen to highlight the problems with such methods of classifying people, and it has great promise: cutting-edge work in critical algorithm studies applies social theory to current events; law and policy experts seem to publish new articles daily on how artificial intelligence shapes our lives; and a growing community of researchers has developed a field known as "Fairness, Accountability, and Transparency in Machine Learning." The social scientists, attorneys, and computer scientists promoting algorithmic accountability aspire to advance knowledge and promote justice. But what should such "accountability" more specifically consist of? At a two-day, interdisciplinary roundtable on AI ethics I recently attended, such questions featured prominently, and humanists, policy experts, and lawyers engaged in a free-wheeling discussion about topics ranging from robot arms races to computationally planned economies.


California passed a law boosting police transparency on cellphone surveillance. Here's why it's not working

Los Angeles Times

California passed a law requiring police to be more transparent about cellphone surveillance. A year later, it is still difficult to gauge how law enforcement agencies are gathering information.