Privacy fears as MILLIONS of photos used to train facial recognition AI without users' consent

Daily Mail

Many facial recognition systems are being trained using millions of online photos uploaded by everyday people and, more often than not, the photos are being used without those users' consent, an NBC News investigation has found. In one worrying case, IBM scraped almost a million photos from unsuspecting users on Flickr to build its facial recognition database. The practice not only raises privacy concerns, but also fuels fears that the systems could one day be used to disproportionately target minorities. IBM's database, called 'Diversity in Faces,' was released in January as part of the company's efforts to 'advance the study of fairness and accuracy in facial recognition technology.' The database was released following a study from MIT Media Lab researcher Joy Buolamwini, which found that popular facial recognition services from Microsoft, IBM and Face++ vary in accuracy based on gender and race.
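For a sense of what "vary in accuracy based on gender and race" means in practice, the sketch below computes a system's accuracy separately per demographic subgroup, the kind of disaggregated evaluation the MIT study performed. The records and group labels here are illustrative assumptions, not figures from the study itself.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted_label, true_label) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    # Accuracy per subgroup rather than one aggregate number,
    # which is what exposes the disparity.
    return {g: correct[g] / total[g] for g in total}

# Hypothetical gender-classification results showing the disparity pattern:
results = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "male", "female"),   # misclassified
    ("darker-skinned female", "female", "female"),
]
print(accuracy_by_group(results))
# {'lighter-skinned male': 1.0, 'darker-skinned female': 0.5}
```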


Your social media photos could be training facial recognition AI without your consent

Mashable

If your face has ever appeared in a photo on Flickr, it could currently be training facial recognition technology without your permission. As per a report by NBC News, IBM has been using around one million images from the image-hosting platform to train its facial recognition AI, without the permission of the people in the photos. In January, IBM revealed its new "Diversity in Faces" dataset with the goal of making facial recognition systems fairer and better at identifying a diverse range of faces -- AI algorithms have had difficulty in the past recognising women and people of colour. Considering the potential uses of facial recognition technology, whether it be for hardcore surveillance, finding missing persons, detecting celebrity stalkers, social media image tagging, or unlocking your phone or house, many people might not want their face used for this type of AI training -- particularly if it involves pinpointing people by gender or race. IBM's dataset drew upon a huge collection of around 100 million Creative Commons-licensed images, known as the YFCC100M dataset and released by Flickr's former owner, Yahoo, for research purposes -- there are many CC image databases used for academic research into facial recognition, or fun comparison projects.
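To illustrate the mechanics the report describes, here is a minimal sketch of how a Creative Commons image dump can be filtered into a training set by licence alone. The file layout and column names ("license_url", "photo_url") are hypothetical, not the real YFCC100M schema; the point is that the filter checks the photo's licence, never the consent of the people pictured.

```python
import csv

# Licences treated as usable in this sketch (an assumed, abbreviated list).
PERMISSIVE_LICENSES = {
    "https://creativecommons.org/licenses/by/2.0/",
    "https://creativecommons.org/licenses/by-sa/2.0/",
}

def select_cc_photos(metadata_path):
    """Yield download URLs for photos under a permissive CC licence.

    Note: the filter checks the photo's licence, not whether the
    people *in* the photo consented to this use.
    """
    with open(metadata_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            if row.get("license_url") in PERMISSIVE_LICENSES:
                yield row["photo_url"]

if __name__ == "__main__":
    # "photo_metadata.tsv" is a hypothetical metadata dump.
    for url in select_cc_photos("photo_metadata.tsv"):
        print(url)
```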


Being able to walk around without being tracked by facial recognition could be a thing of the past

Daily Mail

Walking around without being constantly identified by AI could soon be a thing of the past, legal experts have warned. The use of facial recognition software could signal the end of civil liberties if the law doesn't change as quickly as advancements in technology, they say. Software already being trialled around the world could soon be adopted by companies and governments to constantly track you wherever you go. Shop owners are already using facial recognition to track shoplifters and could soon be sharing information across a broad network of databases, potentially globally. Previous research has found that the technology isn't always accurate, misidentifying women and individuals with darker skin tones more often than other groups.
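To make concrete how tracking across shared databases typically works, the sketch below matches a face embedding against a watchlist using cosine similarity and a fixed threshold. The embeddings, names, and threshold value are illustrative assumptions; production systems use learned embeddings of much higher dimension and tune the threshold per deployment.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.8  # assumed value; real systems tune this per deployment

def find_matches(probe, watchlist):
    """Return watchlist names whose embedding matches the probe face."""
    return [
        name for name, embedding in watchlist.items()
        if cosine_similarity(probe, embedding) >= MATCH_THRESHOLD
    ]

# Hypothetical 3-dimensional embeddings standing in for real face vectors:
watchlist = {"person_a": [0.1, 0.9, 0.2], "person_b": [0.7, 0.1, 0.6]}
print(find_matches([0.12, 0.88, 0.25], watchlist))  # ['person_a']
```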


IBM's Facial Recognition Database: Dangers of Hyperbole

#artificialintelligence

I'm recovering from the hyperventilating hyperbole in the reportage of IBM's labeling of a dataset of facial photographs and making it available to researchers to reduce bias in facial recognition. NBC News went with a headline that read: "Facial recognition's 'dirty little secret': Millions of online photos scraped without consent." That might merit a "pants on fire" rating if it were in the realm of political reporting. The photos were not "scraped." The NBC story linked to IBM's discussion of its work which, in turn, identified the dataset that it used.


Transgender YouTubers had their videos grabbed to train facial recognition software

#artificialintelligence

About five or six years ago, one of Karl Ricanek's students showed him a video on YouTube. It was a time lapse of a person undergoing hormone replacement therapy, or HRT, in order to transition genders. "At the time, we were working on facial recognition," Ricanek, a professor of computer science at the University of North Carolina at Wilmington, tells The Verge. He says he and his students were always trying to find ways to break the systems they worked on, and that this video seemed like a particularly tricky challenge. "We were like, 'Wow there's no way the current technology could recognize this person [after they transitioned].'"