Privacy fears as MILLIONS of photos used to train facial recognition AI without users' consent

Daily Mail

Many facial recognition systems are being trained using millions of online photos uploaded by everyday people and, more often than not, the photos are being taken without users' consent, an NBC News investigation has found. In one worrying case, IBM scraped almost a million photos from unsuspecting users on Flickr to build its facial recognition database. The practice not only raises privacy concerns, but also fuels fears that the systems could one day be used to disproportionately target minorities. IBM's database, called 'Diversity in Faces,' was released in January as part of the company's efforts to 'advance the study of fairness and accuracy in facial recognition technology.' The database was released following a study from MIT Media Lab researcher Joy Buolamwini, which found that popular facial recognition services from Microsoft, IBM and Face++ vary in accuracy based on gender and race.


Amazon's facial recognition software mistakes women for men, and performs worst on darker-skinned women

Daily Mail

Amazon's controversial facial recognition software, Rekognition, is facing renewed criticism after a new study from the MIT Media Lab found that it may have gender and racial biases. In particular, the software performed worse when identifying the gender of women, especially darker-skinned women. When the software was presented with a number of female faces, it incorrectly labeled 19 percent of them as male.


MIT develops algorithm that can 'de-bias' facial recognition software

Daily Mail

MIT researchers believe they've figured out a way to keep facial recognition software from being biased. To do this, they developed an algorithm that knows to scan for faces, but also evaluates the training data supplied to it. The algorithm scans for biases in the training data and eliminates any that it perceives, resulting in a more balanced dataset. 'We've learned in recent years that AI systems can be unfair, which is dangerous when they're increasingly being used to do everything from predict crime to determine what news we consume,' MIT's Computer Science & Artificial Intelligence Laboratory said in a statement.
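The core idea of rebalancing a skewed training set can be illustrated with a toy resampling sketch. This is not MIT's actual algorithm (which learns the structure of the data automatically); it is a minimal, hand-labeled version, and the names `rebalance` and `group_of` are made up for illustration. It simply undersamples over-represented groups so every group contributes equally:

```python
import random
from collections import defaultdict

def rebalance(samples, group_of, seed=0):
    """Undersample each group to the size of the smallest one,
    yielding a dataset with equal group representation."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for s in samples:
        groups[group_of(s)].append(s)
    # Every group is cut down to the smallest group's size.
    target = min(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(rng.sample(members, target))
    rng.shuffle(balanced)
    return balanced

# Example: a toy dataset skewed 4:1 toward one group
data = ([("img%d" % i, "light") for i in range(80)] +
        [("img%d" % i, "dark") for i in range(20)])
balanced = rebalance(data, group_of=lambda s: s[1])
```

After rebalancing, each group contributes 20 samples, so a model trained on `balanced` no longer sees one group four times as often as the other.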


Microsoft improves facial recognition software following backlash

Daily Mail

Microsoft has updated its facial recognition technology in an attempt to make it less 'racist'. It follows a study published in March that criticised the technology for recognising the gender of people with lighter skin tones more accurately than that of people with darker skin. The system was found to perform best on males with lighter skin and worst on females with darker skin. The problem largely comes down to the data being used to train the AI system not containing enough images of people with darker skin tones. Experts from the computing firm say their tweaks have significantly reduced these errors, by up to 20 times for people with darker faces.


Facial recognition AI built into police body cameras could lead to FALSE ARRESTS, experts warn

Daily Mail

Body cameras worn by police in the US could soon have in-built facial recognition software, sparking 'serious concerns' among civil liberties groups. The controversial technology, branded 'categorically unethical', would automatically scan and identify every single person law enforcement interacts with. It is intended to help officers track down suspects more effectively, but experts are worried it could lead to false arrests and suffer from in-built racial and other biases. If developed, the equipment could become a regular sight on the streets of cities across the world. The manufacturer behind the move has now brought together a panel of experts to discuss the implications of the 'Minority Report'-style technology.