"Computers have been getting better and better at seeing movement on video. How is it that they read lips, follow a dancing girl or copy an actor making faces?"
– from Andrew Blake. Introduction to Active Contours and Visual Dynamics. Visual Dynamics Group, Department of Engineering Science, University of Oxford
Facial recognition technology is all around us: it's at concerts, airports, and apartment buildings. But its use by law enforcement agencies and courtrooms raises particular concerns about privacy, fairness, and bias, according to Jennifer Lynch, the Surveillance Litigation Director at the Electronic Frontier Foundation. Studies have shown that several of the major facial recognition systems are inaccurate. Amazon's software misidentified 28 members of Congress and matched them with criminal mugshots, and these inaccuracies tend to be far worse for people of color and women.
Jimmy Gomez is a California Democrat, a Harvard graduate and one of the few Hispanic lawmakers serving in the US House of Representatives. But to Amazon's facial recognition system, he looks like a potential criminal. Gomez was one of 28 US Congress members falsely matched with mugshots of people who've been arrested, as part of a test of Amazon's Rekognition program that the American Civil Liberties Union ran last year. Nearly 40 percent of the false matches by Amazon's tool, which is being used by police, involved people of color. This is part of a CNET special report exploring the benefits and pitfalls of facial recognition.
China's rapidly evolving surveillance technologies have snared their share of fugitives in recent years. Most of these cases have involved facial recognition cameras, which can detect individual facial features regardless of glasses, hats or masks. There were the 80-odd wanted suspects picked out of crowds of tens of thousands of fans at concerts given by Jacky Cheung, a legendary Hong Kong pop star. In April this year, a student wanted on suspicion of killing his mother was caught after being on the lam for almost four years; he was nabbed within 10 minutes of entering Chongqing airport, local media outlet Southern Metropolis Daily reported.
California could soon become the largest state to ban the use of facial recognition technology in law enforcement body cameras, a significant milestone in the regulation of the burgeoning technology. The State Assembly on Thursday passed AB 1215, a bill that would impose a three-year moratorium on the technology, garnering praise from privacy and civil liberties advocates. The legislation now heads to Gov. Gavin Newsom's desk.
A majority of Americans trust law enforcement agencies to use facial recognition technology responsibly, and say it's OK to use the controversial tool to keep public spaces secure, according to a new survey. However, the Pew Research Center survey also found that acceptance of the technology for law enforcement does not extend to other areas: only 36 percent of respondents supported its responsible use by technology companies, and just 18 percent by advertisers.
Google's Nest Hub Max has caused quite a stir. Google's latest smart display brings with it a controversial new feature that's always watching. Face Match, introduced on the Google Nest Hub Max, uses the smart display's front-facing camera as a security feature and a way to participate in video calls. It also shows you your photos, texts, calendar details and so on when it recognizes your face. This mode of facial recognition sounds simple enough at first.
In this three-part blog series, Elizabeth and Jen will be focusing on the ethics of AI and data. So often we hear that we have to be data-driven, but that is not enough: we need to be human-centred. Across the series, we will look at important topics such as DeepFakes and bias in AI, why these phenomena happen, and what we can do about them. At the beginning of 2019, Alexandria Ocasio-Cortez made headlines by stating that algorithms can perpetuate racism.
We worried that these databases would contain bad data or bad assumptions, and in particular that they might inadvertently and unconsciously encode the existing prejudices and biases of our societies and fix them into machinery. We worried people would screw up. That is, we worried what would happen if these systems didn't work, and we worried what would happen if they did work. We're now having much the same conversation about AI in general (or, more properly, machine learning), and especially about face recognition, which has only become practical because of machine learning. And we're worrying about the same things: we worry what happens if it doesn't work, and we worry what happens if it does work.
Apple has launched its latest iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max featuring new cameras, improved screens, faster processors and longer battery life. Announced at Apple's headquarters in Cupertino, California on Thursday, the iPhone 11 series of smartphones carries on where the iPhone XS, XS Max and XR left off last year. The new iPhones feature similar designs and screen sizes with slim bezels and the Face ID notch at the top, which was first introduced in 2017 with the iPhone X. A variety of new colours will also be available, with a new matte finish on the back of the iPhone 11 Pro. The iPhone 11 Pro will be available in two sizes, with either a 5.8-inch (147mm) or 6.5-inch (165mm) screen.