Pattern Recognition


Slate's Mistakes for the Week of March 18

Slate

In a March 21 Slatest, Mark Joseph Stern misstated that the April 2019 Wisconsin Supreme Court election could give Democratic justices a majority. That opportunity will not arise until the 2020 election. Due to an editing error, a March 20 Future Tense Newsletter incorrectly stated that the National Institute of Standards and Technology has been using nonconsensually obtained images to train its Facial Recognition Verification Testing program. The NIST does not develop or train facial recognition systems. It provides independent government evaluations of prototype face recognition technologies.


Being able to walk around without being tracked by facial recognition could be a thing of the past

Daily Mail

Walking around without being constantly identified by AI could soon be a thing of the past, legal experts have warned. The use of facial recognition software could signal the end of civil liberties if the law doesn't change as quickly as advancements in technology, they say. Software already being trialled around the world could soon be adopted by companies and governments to constantly track you wherever you go. Shop owners are already using facial recognition to track shoplifters and could soon be sharing information across a broad network of databases, potentially globally. Previous research has found that the technology isn't always accurate, misidentifying women and people with darker skin tones more often than others.


MoviePass founder tests app that awards free tickets to users spied on by facial recognition cameras

Daily Mail

The co-founder of MoviePass has developed a new idea to get people to the theater. Called PreShow, the app would let users earn free movie tickets if they agree to watch blocks of advertisements between 15 and 20 minutes long. There's also another, creepier, twist to the proposed app: it unlocks only with facial recognition, and it uses the same technology to track your gaze and make sure you're actually watching the ads, according to CNET. PreShow is being developed by MoviePass co-founder Stacy Spikes, who stepped down as CEO of the beleaguered ticketing company in 2016.


What's in a face?

MIT News

Our brains are incredibly good at processing faces, and even have specific regions specialized for this function. But which face dimensions do we observe? Do we perceive general properties first, then look at the details? Or are dimensions such as gender and other identity details decoded interdependently? In a study published in Nature Communications, neuroscientists at the McGovern Institute for Brain Research measured the brain's response to faces in real time, and found that the brain first decodes properties such as gender and age before drilling down to the specific identity of the face itself.


In the Face of Danger, We're Turning to Surveillance

WIRED

When school began in Lockport, New York, this past fall, the halls were lined not just with posters and lockers, but cameras. Over the summer, the school district installed a brand-new $4 million facial recognition system in the town's eight schools, from elementary to high school. The system scans the faces of students as they roam the halls, looking for faces that have been uploaded and flagged as dangerous. "Any way that we can improve safety and security in schools is always money well spent," David Lowry, president of the Lockport Education Association, told the Lockport Union-Sun & Journal. Rose Eveleth is an Ideas contributor at WIRED and the creator and host of Flash Forward, a podcast about possible (and not so possible) futures.


US officials train facial recognition tech with photos of dead people and immigrants, report claims

Daily Mail

A unit of the U.S. Department of Commerce has been using photos of immigrants, abused children and dead people to train facial recognition systems, a worrying new report has detailed. The National Institute of Standards and Technology (NIST) oversees a database, called the Facial Recognition Verification Testing program, that 'depends' on these types of controversial images, according to Slate. Scientists from Dartmouth College, the University of Washington and Arizona State University discovered the practice and laid out their findings in new research set to be reviewed for publication later this year. The Facial Recognition Verification Testing program was first established in 2017 as a way for companies, academic researchers and designers to evaluate their facial recognition technologies.


The Government Uses Images of Abused Children and Dead People to Test Facial Recognition Tech

Slate

If you thought IBM using "quietly scraped" Flickr images to train facial recognition systems was bad, it gets worse. Our research, which will be reviewed for publication this summer, indicates that the U.S. government, researchers, and corporations have used images of immigrants, abused children, and dead people to test their facial recognition systems, all without consent. The very group the U.S. government has tasked with regulating the facial recognition industry is perhaps the worst offender when it comes to using images sourced without the knowledge of the people in the photographs. The National Institute of Standards and Technology, a part of the U.S. Department of Commerce, maintains the Facial Recognition Verification Testing program, the gold standard test for facial recognition technology. This program helps software companies, researchers, and designers evaluate the accuracy of their facial recognition programs by running their software through a series of challenges against large groups of images (data sets) that contain faces from various angles and in various lighting conditions.
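The excerpt describes benchmark testing in outline: run a matcher over labeled image pairs and score how often it accepts genuine matches versus impostors. As a minimal sketch of that kind of evaluation (not NIST's actual protocol; the scores, labels, and threshold below are invented for illustration):

```python
def verification_accuracy(scores, same_person, threshold):
    """Score a face matcher on labeled image pairs.

    scores: similarity score the matcher assigned to each pair
    same_person: True if the pair really shows the same person
    threshold: scores at or above this count as a 'match'
    Returns (true accept rate, false accept rate).
    """
    accepts = [s >= threshold for s in scores]
    genuine = [a for a, same in zip(accepts, same_person) if same]
    impostor = [a for a, same in zip(accepts, same_person) if not same]
    tar = sum(genuine) / len(genuine)    # genuine pairs correctly accepted
    far = sum(impostor) / len(impostor)  # impostor pairs wrongly accepted
    return tar, far

# Invented similarity scores for six image pairs:
scores = [0.91, 0.80, 0.55, 0.40, 0.85, 0.30]
labels = [True, True, True, False, False, False]
tar, far = verification_accuracy(scores, labels, threshold=0.6)
```

Real benchmarks report the true accept rate at a fixed false accept rate, and vary the pool of faces across angles and lighting conditions, but the accounting is essentially this.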


Deep learning and the future of facial recognition - Kognitio

#artificialintelligence

Deep learning enables machines to learn and solve complex problems, using algorithms inspired by the human brain, without any human intervention. Deep learning algorithms need data to learn, and lots of it! But that's no problem, because we generate approximately 2.6 quintillion bytes a day. Facial recognition uses images of an individual's face captured from photos or videos. The distances between the eyes, nose, mouth and jaw are measured.
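The distance-measuring step the excerpt describes can be sketched in a few lines. This is purely illustrative, not any real system's pipeline: the landmark names and coordinates are invented, and real systems detect many more landmarks automatically from the image.

```python
import math

# Hypothetical (x, y) landmark coordinates; invented for illustration.
LANDMARKS = ("left_eye", "right_eye", "nose", "mouth", "jaw")

def distance(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def feature_vector(points):
    """Pairwise landmark distances, normalized by the inter-eye
    distance so the description ignores image scale."""
    eye_dist = distance(points["left_eye"], points["right_eye"])
    return [distance(points[p], points[q]) / eye_dist
            for i, p in enumerate(LANDMARKS)
            for q in LANDMARKS[i + 1:]]

def similarity(face_a, face_b):
    """Negative sum of absolute feature differences; 0 means the
    two faces have identical normalized landmark distances."""
    return -sum(abs(x - y) for x, y in
                zip(feature_vector(face_a), feature_vector(face_b)))
```

Because the features are distances normalized by eye spacing, the same face photographed at a different position or scale yields the same vector; deep-learning systems replace these hand-picked measurements with features learned from data.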


Facial Recognition's 'Dirty Little Secret': Millions of Online Photos Scraped Without Consent

#artificialintelligence

Legal experts warn people's online photos are being used without permission to power facial-recognition technology that could eventually be used for surveillance. Said New York University School of Law's Jason Schultz, "This is the dirty little secret of [artificial intelligence] training sets. Researchers often just grab whatever images are available in the wild." IBM recently released a set of nearly 1 million photos culled from the image-hosting site Flickr, annotated to describe subjects' appearance, allegedly to help reduce bias in facial recognition; although IBM said Flickr users can opt out of the database, deleting photos is almost impossible.


IBM's photo-scraping scandal shows what a weird bubble AI researchers live in

#artificialintelligence

On Tuesday, NBC published a story with a gripping headline: "Facial recognition's 'dirty little secret': Millions of online photos scraped without consent." I linked to it in our last Algorithm issue, but it's worth a revisit today. The story highlights a recent data set released by IBM with 1 million pictures of faces, intended to help develop fairer face recognition algorithms. It turns out, NBC found, that those faces were scraped directly from the online photo-hosting site Flickr, without the permission of the subjects or photographers. For some of you, this practice will immediately feel creepy and weird.