Detecting fake face images created by both humans and machines

#artificialintelligence

Researchers at the State University of New York, Korea have recently explored new ways to detect both machine- and human-created fake images of faces. In their paper, published in the ACM Digital Library, the researchers used ensemble methods to detect images created by generative adversarial networks (GANs) and employed pre-processing techniques to improve the detection of images created by humans using Photoshop. Over the past few years, significant advancements in image processing and machine learning have enabled the generation of fake, yet highly realistic, images. However, these images could also be used to create fake identities, make fake news more convincing, bypass image detection algorithms, or fool image recognition tools. "Fake face images have been a topic of research for quite some time now, but studies have mainly focused on photos made by humans, using Photoshop tools," Shahroz Tariq, one of the researchers who carried out the study, told Tech Xplore.
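The blurb above mentions "ensemble methods" for detecting GAN-generated faces but does not describe the paper's exact models or features. As an illustrative sketch only, the snippet below shows the general ensemble idea with scikit-learn: several different detectors are trained to separate real from fake images, and a majority vote over their predictions gives the final real/fake decision. The feature vectors, labels, and choice of base classifiers here are placeholder assumptions, not the paper's method.

```python
# Illustrative sketch of majority-vote ensemble detection (not the paper's
# actual pipeline). Synthetic feature vectors stand in for image features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy stand-in data: 200 feature vectors (e.g. flattened image statistics),
# labeled 0 = real face, 1 = GAN-generated face.
X = rng.normal(size=(200, 16))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # toy separable labels

# Three different base detectors; "hard" voting takes the majority class.
ensemble = VotingClassifier(
    estimators=[
        ("logreg", LogisticRegression(max_iter=1000)),
        ("forest", RandomForestClassifier(n_estimators=50, random_state=0)),
        ("svm", SVC()),
    ],
    voting="hard",
)
ensemble.fit(X, y)
preds = ensemble.predict(X)  # one real/fake decision per input image
```

In practice the base models for this task would be convolutional networks rather than classical classifiers, but the voting structure is the same: disagreements among detectors trained on different cues are resolved by the majority.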


Amazon should stop selling facial recognition software to police, ACLU and other rights groups say

USATODAY - Tech Top Stories

An image from the product page of Amazon's Rekognition service, which provides image and video facial and item recognition and analysis. SAN FRANCISCO – Two years ago, Amazon built a facial and image recognition product that allows customers to cheaply and quickly search a database of images and look for matches. One of the groups it targeted as potential users of this service was law enforcement. At least two signed on: the Washington County Sheriff's Office outside of Portland, Ore., and the Orlando Police Department in Florida. Now the ACLU and civil rights groups are demanding that Amazon stop selling the software tool, called Rekognition, to police and other government entities because they fear it could be used to unfairly target protesters, immigrants and any person just going about their daily business.


Amazon needs to come clean about racial bias in its algorithms

#artificialintelligence

Yesterday, Amazon's quiet Rekognition program became very public, as new documents obtained by the ACLU of Northern California showed the system partnering with the city of Orlando and police camera vendors like Motorola Solutions for an aggressive new real-time facial recognition service. Amazon insists that the service is a simple object-recognition tool and will only be used for legal purposes. But even if we take the company at its word, the project raises serious concerns, particularly around racial bias. Facial recognition systems have long struggled with higher error rates for women and people of color -- error rates that can translate directly into more stops and arrests for marginalized groups. And while some companies have responded with public bias testing, Amazon hasn't shared any data on the issue, if it's collected data at all.


Amazon's 'Rekognition' software is being used by police departments

Daily Mail - Science & tech

Amazon is drawing the ire of the American Civil Liberties Union (ACLU) and other privacy advocates after an investigation found that it has been marketing powerful facial recognition tools to police. The tool, called 'Rekognition', was first released in 2016, but Amazon has since been selling it on the cheap to several police departments around the country, listing the Washington County Sheriff's Office in Oregon and the city of Orlando, Florida among its customers. The ACLU and other organizations are now calling on Amazon to stop marketing the product to law enforcement, saying they could use the technology to 'easily build a system to automate the identification and tracking of anyone'. Police appear to be using Rekognition to check photographs of unidentified suspects against a database of mug shots from the county jail. But privacy advocates have been concerned about expanding the use of facial recognition to body cameras worn by officers or safety and traffic cameras that monitor public areas, allowing police to identify and track people in real time.


ACLU, other rights groups urge Amazon to not sell face-recognition tech to police

The Japan Times

SEATTLE – The American Civil Liberties Union and other privacy activists are asking Amazon to stop marketing a powerful facial recognition tool to police, saying law enforcement agencies could use the technology to "easily build a system to automate the identification and tracking of anyone." The tool, called Rekognition, is already being used by at least one agency -- the Washington County Sheriff's Office in Oregon -- to check photographs of unidentified suspects against a database of mug shots from the county jail, which is a common use of such technology around the country. But privacy advocates have been concerned about expanding the use of facial recognition to body cameras worn by officers or safety and traffic cameras that monitor public areas, allowing police to identify and track people in real time. The tech giant's entry into the market could vastly accelerate such developments, the privacy advocates fear, with potentially dire consequences for minorities who are already arrested at disproportionate rates, immigrants who may be in the country illegally or political protesters. "People should be free to walk down the street without being watched by the government," the groups wrote in a letter to Amazon on Tuesday.