Amazon is recording people's faces and saving the footage to identify them at a later date, in a move likely to concern privacy-conscious users. The site wants to use five-second snippets of video to record a person's face if they wish to become a seller on its online marketplace, and it may use its facial recognition technology, called Rekognition, to identify users. The move is believed to be a bid to crack down on fake sellers and counterfeit goods being peddled on the site. The verification method was first spotted by a Vietnamese seller, who revealed that the programme requires users to open their webcam and record themselves before they can become verified.
Facial recognition technology was already seeping into everyday life -- from your photos on Facebook to police scans of mugshots -- when Joy Buolamwini noticed a serious glitch: some of the software couldn't detect dark-skinned faces like hers, and at one point she had to wear a white mask so that the software could detect her face at all. That revelation spurred the Massachusetts Institute of Technology researcher to launch a project that is having an outsize influence on the debate over how artificial intelligence should be deployed in the real world. Her research has uncovered racial and gender bias in facial analysis tools sold by companies such as Amazon: her tests found much higher error rates in classifying the gender of darker-skinned women than of lighter-skinned men.
MIT researchers believe they've figured out a way to keep facial recognition software from being biased. They developed an algorithm that not only scans for faces but also evaluates the training data supplied to it: it scans the training data for biases and eliminates any it perceives, resulting in a more balanced dataset. 'We've learned in recent years that AI systems can be unfair, which is dangerous when they're increasingly being used to do everything from predict crime to determine what news we consume,' MIT's Computer Science & Artificial Intelligence Laboratory said in a statement.
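To make the idea concrete, here is a toy sketch of dataset rebalancing by oversampling under-represented groups. This is not MIT's actual algorithm (which learns the structure of the data itself rather than relying on labels); the group tags and function names below are hypothetical illustrations of the general principle of equalising group representation in training data.

```python
import random
from collections import defaultdict

def rebalance(samples, group_of):
    """Oversample so every group appears as often as the largest group.

    samples  -- list of training examples
    group_of -- function mapping a sample to its (hypothetical) group label
    """
    groups = defaultdict(list)
    for s in samples:
        groups[group_of(s)].append(s)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # Randomly duplicate members until the group reaches the target size.
        balanced.extend(random.choices(members, k=target - len(members)))
    return balanced

# Usage: a skewed toy dataset of image IDs tagged with a demographic group.
data = [("img%d" % i, "lighter_male") for i in range(8)] + \
       [("img%d" % i, "darker_female") for i in range(2)]
balanced = rebalance(data, group_of=lambda s: s[1])
# Both groups now contribute 8 samples each.
```

Simple oversampling like this can reduce the skew that produces biased error rates, at the cost of repeating examples; more sophisticated approaches resample based on learned features rather than explicit labels.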
Research suggests Amazon's facial analysis algorithms have struggled with gender and racial bias. The MIT Media Lab found Rekognition had no trouble correctly pinpointing the gender of lighter-skinned men, but it classified women as men almost a fifth of the time, and darker-skinned women as men on almost one out of three occasions. IBM and Microsoft software performed better than Amazon's tool; Microsoft's solution mistakenly classified darker-skinned women as men 1.5 percent of the time.
NEW YORK - Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto. Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop, fearing that the service makes Amazon vulnerable to lawsuits. The researchers said that in their tests, Amazon's technology labeled darker-skinned women as men 31 percent of the time, while lighter-skinned women were misidentified 7 percent of the time.