Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person's gender, according to new research released Thursday. Researchers at the M.I.T. Media Lab also said Amazon's Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology's use by police and in public venues, including airports and schools. Amazon's system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said. The problem, A.I. researchers and engineers say, is that the vast sets of images these systems are trained on skew heavily toward white men.
Face recognition is a stark example of a technology being deployed faster than society and the law can adapt with new norms and rules. It lets governments and private enterprises track citizens anywhere there is a camera, even if they are not carrying any devices. And because people in public generally have no legal expectation of privacy and can be photographed or recorded, the technology has the potential to be more intrusive than phone tracking, the legality of which the U.S. Supreme Court will soon decide. Only two states, Texas and Illinois, limit private companies' ability to track people via their faces.
The facial-recognition cameras installed near the bounce houses at the Warehouse, an after-school recreation center in Bloomington, Indiana, are aimed low enough to scan the face of every parent, teenager and toddler who walks in. The center's director, David Weil, learned earlier this year of the surveillance system from a church newsletter, and within six weeks he had bought his own, believing it promised a security breakthrough that was both affordable and cutting-edge. Since last month, the system has logged thousands of visitors' faces – alongside their names, phone numbers and other personal details – and checked them against a regularly updated blacklist of sex offenders and unwanted guests. The system's Israeli developer, Face-Six, also promotes it for use in prisons and drones. "Some parents still think it's kind of '1984,' " said Weil, whose 21-month-old granddaughter is among the scanned.
Facial-recognition technology is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph. When the person in the photo is a white man, the software is right 99 percent of the time. But the darker the skin, the more errors arise -- up to nearly 35 percent for images of darker-skinned women, according to a new study that breaks fresh ground by measuring how the technology works on people of different races and genders. These disparate results, calculated by Joy Buolamwini, a researcher at the Massachusetts Institute of Technology Media Lab, show how biases in the real world can seep into artificial intelligence, the computer systems that underpin facial recognition.
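The study's headline numbers are per-subgroup error rates: run a gender classifier over a benchmark of faces labeled by gender and skin type, then compare how often each subgroup is misclassified. A minimal sketch of that bookkeeping is below; the records and subgroup names are invented for illustration and are not Buolamwini's actual benchmark data.

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the misclassification rate for each demographic subgroup.

    records: iterable of (group, true_label, predicted_label) tuples.
    Returns a dict mapping each group to its error rate.
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += 1
        if prediction != truth:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical audit records: (subgroup, actual gender, predicted gender)
sample = [
    ("lighter_male", "M", "M"), ("lighter_male", "M", "M"),
    ("lighter_male", "M", "M"),
    ("darker_female", "F", "M"), ("darker_female", "F", "F"),
    ("darker_female", "F", "F"),
]
rates = error_rates_by_group(sample)
```

With these made-up records, the classifier is perfect on one subgroup and wrong a third of the time on the other; the study's finding is that real commercial systems show gaps of roughly this shape.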