Amazon Announces 1-Year Moratorium On Police Use Of Its Facial-Recognition Technology

NPR Technology

When Amazon introduced its facial recognition technology to law enforcement agencies, the tech giant likened it to magic. Now Amazon is telling police to stop using it for the next year. It's a move that signals the impact protests over police brutality are having on the tech industry. And we should note Amazon is a financial supporter of NPR. BOBBY ALLYN, BYLINE: Amazon for years has offered a service called Rekognition to police departments.
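For readers unfamiliar with the service named above, the sketch below shows how a developer would typically call Rekognition's CompareFaces operation through the AWS SDK for Python (boto3). The region, bucket, and object names are placeholders invented for illustration; this is a sketch of the public API, not of any specific police deployment.

```python
import boto3

# Rekognition client; the region is a placeholder chosen for the example.
rekognition = boto3.client("rekognition", region_name="us-east-1")

def compare_faces(source_key: str, target_key: str, bucket: str, threshold: float = 80.0):
    """Return similarity scores for faces in the target image that match the source face."""
    response = rekognition.compare_faces(
        SourceImage={"S3Object": {"Bucket": bucket, "Name": source_key}},
        TargetImage={"S3Object": {"Bucket": bucket, "Name": target_key}},
        SimilarityThreshold=threshold,
    )
    # Each entry in FaceMatches carries a Similarity score in percent.
    return [match["Similarity"] for match in response["FaceMatches"]]

if __name__ == "__main__":
    # "probe.jpg", "gallery.jpg", and "example-bucket" are hypothetical S3 objects.
    print(compare_faces("probe.jpg", "gallery.jpg", bucket="example-bucket"))
```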


Biased AI Programs Could Cause Discrimination

#artificialintelligence

Joy Buolamwini was a graduate student at MIT a few years ago when she was working on an art and science project called the Aspire Mirror. The setup was supposed to use readily available facial recognition software to project images onto people's faces. But the software couldn't detect the face of Buolamwini, who is African American, unless she put on a white mask. She tells the story in more detail in a TED talk. As she encountered other examples of what's become known as algorithmic bias, Buolamwini decided to conduct a more rigorous review.
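To give a sense of what "readily available" face detection looks like in practice, here is a minimal sketch using OpenCV's bundled Haar cascade detector. This is an assumed stand-in, not the software Buolamwini actually used, and the input file name is hypothetical; when the detector misses a face it simply returns no bounding boxes, which is the failure mode described above.

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade, an off-the-shelf detector.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

def detect_faces(image_path: str):
    """Return bounding boxes (x, y, w, h) for faces found in the image."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # detectMultiScale returns an empty result when no face is detected.
    return detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if __name__ == "__main__":
    boxes = detect_faces("selfie.jpg")  # hypothetical input image
    print(f"Detected {len(boxes)} face(s): {list(map(tuple, boxes))}")
```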


California could become first to limit facial recognition technology; police aren't happy

USATODAY - Tech Top Stories

San Francisco supervisors approved a ban on police using facial recognition technology, making it the first city in the U.S. with such a restriction. SAN FRANCISCO – A routine traffic stop goes dangerously awry when a police officer's body camera uses its built-in facial recognition software to misidentify a motorist as a convicted felon. At best, lawsuits are launched. That imaginary scenario is what some California lawmakers are trying to avoid by supporting Assembly Bill 1215, the Body Camera Accountability Act, which would ban the use of facial recognition software in police body cameras, a national first if it passes a Senate vote this summer and is signed by Gov. Gavin Newsom. State law enforcement officials here do not currently employ the technology to scan people in officers' line of sight.


Ending Racial Biases in Face Recognition AI – Kairos – Medium

#artificialintelligence

This resonates with me very personally as a minority founder in the face recognition space. So deeply, in fact, that I wrote about my thoughts in an October 2016 article titled "Kairos' Commitment to Your Privacy and Facial Recognition Regulations," in which I acknowledged the scale of the problem and stated Kairos' position on the importance of correcting it.


Joy Buolamwini: How Does Facial Recognition Software See Skin Color?

NPR Technology

Facial analysis technology often fails to recognize people with darker skin tones. Joy Buolamwini says this bias can lead to detrimental results, and she urges her colleagues to create more inclusive code. As a "poet of code," computer scientist Joy Buolamwini founded the Algorithmic Justice League to fight inequality in computation. Her graduate research at the MIT Media Lab focuses on algorithmic and coded bias in machine learning. Buolamwini is a Fulbright Fellow, an Astronaut Scholar, a Rhodes Scholar, and a Google Anita Borg Scholar.
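Buolamwini's research audits how error rates differ across demographic groups. As a rough illustration of that kind of audit, the sketch below computes a per-subgroup misclassification rate from labeled predictions; the group names and toy data are invented for the example and are not drawn from her studies.

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the misclassification rate for each demographic subgroup.

    `records` is an iterable of (group, true_label, predicted_label) tuples;
    the fields and grouping are illustrative, not an actual benchmark.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += 1
        if prediction != truth:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

if __name__ == "__main__":
    # Toy data: a classifier that errs far more often on one subgroup.
    sample = [
        ("darker-skinned women", "female", "male"),
        ("darker-skinned women", "female", "female"),
        ("lighter-skinned men", "male", "male"),
        ("lighter-skinned men", "male", "male"),
    ]
    print(error_rates_by_group(sample))
    # {'darker-skinned women': 0.5, 'lighter-skinned men': 0.0}
```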