Microsoft has discreetly pulled a facial recognition database from its site that contained 10 million images of some 100,000 people. The internet giant took down the database after a Financial Times investigation revealed that it had been used by companies and military researchers to train facial recognition systems around the world. The public dataset, called 'MS Celeb,' included images of 'celebrities' pulled from the internet, but also contained photos of 'arguably private individuals,' often without their knowledge or consent, the FT found. Microsoft, which referred to MS Celeb as the largest publicly available facial recognition dataset in the world, said the database was meant for use by academic researchers. The images were harvested from the web under the Creative Commons license, which allows images to be reused for academic and educational purposes.
A unit of the U.S. Department of Commerce has been using photos of immigrants, abused children and dead people to train its facial recognition systems, a worrying new report has detailed. The National Institute of Standards and Technology (NIST) oversees a database, called the Facial Recognition Verification Testing program, that 'depends' on these types of controversial images, according to Slate. Scientists from Dartmouth College, the University of Washington and Arizona State University discovered the practice and laid out their findings in new research set to be reviewed for publication later this year. The Facial Recognition Verification Testing program was first established in 2017 as a way for companies, academic researchers and designers to evaluate their facial recognition technologies.
Walking around without being constantly identified by AI could soon be a thing of the past, legal experts have warned. The use of facial recognition software could signal the end of civil liberties if the law doesn't change as quickly as advancements in technology, they say. Software already being trialled around the world could soon be adopted by companies and governments to constantly track you wherever you go. Shop owners are already using facial recognition to track shoplifters and could soon be sharing information across a broad network of databases, potentially globally. Previous research has found that the technology isn't always accurate, misidentifying women and people with darker skin at higher rates.
Many facial recognition systems are being trained using millions of online photos uploaded by everyday people and, more often than not, the photos are being taken without users' consent, an NBC News investigation has found. In one worrying case, IBM scraped almost a million photos from unsuspecting users on Flickr to build its facial recognition database. The practice not only raises privacy concerns, but also fuels fears that the systems could one day be used to disproportionately target minorities. IBM's database, called 'Diversity in Faces,' was released in January as part of the company's efforts to 'advance the study of fairness and accuracy in facial recognition technology.' The database was released following a study from MIT Media Lab researcher Joy Buolamwini, which found that popular facial recognition services from Microsoft, IBM and Face++ vary in accuracy based on gender and race.
California lawmakers have passed a bill that bans law enforcement from using facial recognition technology gathered by body cameras – in a bid to end privacy abuse. The bill, signed by Governor Gavin Newsom, will go into effect in 2020 and last for three years. The motion also prohibits cops from using biometric surveillance, including other forms of identification that can be captured from body camera videos. The bill is the first of its kind in the US and recognizes that 'the use of facial recognition and other biometric surveillance is the functional equivalent of requiring every person to show a personal photo identification card at all times in violation of recognized constitutional rights.'