WASHINGTON – Apple will let you unlock the iPhone X with your face, a move likely to bring facial recognition to the masses, along with concerns over how the technology may be used for nefarious purposes.

Apple's newest device, set to go on sale on Friday, is designed to be unlocked with a facial scan, with a number of privacy safeguards: the facial data will be stored only on the phone, not in any external databases.

Unlocking one's phone with a face scan may offer added convenience and security for iPhone users, according to Apple, which claims the "neural engine" behind Face ID cannot be tricked by a photo or a hacker. While other devices have offered facial recognition, Apple is the first to pack the technology for a three-dimensional scan into a handheld phone.

But despite Apple's safeguards, privacy activists fear that widespread use of facial recognition would "normalize" the technology and open the door to broader use of a largely unregulated tool by law enforcement, marketers and others.
About a week ago, Stanford University researchers posted online a study on the latest dystopian AI: they'd made a machine learning algorithm that essentially works as gaydar. After training it on tens of thousands of photographs from a dating site, the researchers found the algorithm could, for example, guess with 81 percent accuracy whether a white man in a photograph was gay. They wanted to protect gay people. "[Our] findings expose a threat to the privacy and safety of gay men and women," wrote Michal Kosinski and Yilun Wang in the paper. They built the bomb so they could alert the public about its dangers.
Wikipedia is a community-created encyclopedia that contains information about notable people from different countries, epochs and disciplines, and aims to document the world's knowledge from a neutral point of view. However, the narrow diversity of the Wikipedia editor community has the potential to introduce systemic biases, such as gender bias, into its content. In this paper we tackle a subproblem of this larger challenge by presenting and applying a computational method for assessing gender bias on Wikipedia along multiple dimensions. We find that while women are covered and featured well in many Wikipedia language editions, the way women are portrayed differs starkly from the way men are portrayed. We hope our work contributes to raising awareness of gender biases online, and in particular to drawing attention to the different levels at which gender biases can manifest themselves on the web.
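As a rough illustration of what an analysis along a single dimension (coverage) might look like, the Python sketch below computes the share of biographies about women per Wikipedia language edition from a small sample. The records and the female_share_by_edition helper are hypothetical stand-ins introduced here for illustration; they do not reproduce the paper's actual data, dimensions or method.

from collections import Counter

# Hypothetical sample: each record is (language edition, gender of biography subject).
biographies = [
    ("en", "female"), ("en", "male"), ("en", "male"),
    ("de", "male"), ("de", "female"),
    ("ru", "male"), ("ru", "male"), ("ru", "female"),
]

def female_share_by_edition(records):
    """Return {edition: fraction of sampled biographies whose subject is a woman}."""
    totals = Counter(lang for lang, _ in records)
    women = Counter(lang for lang, gender in records if gender == "female")
    return {lang: women[lang] / totals[lang] for lang in totals}

if __name__ == "__main__":
    for lang, share in sorted(female_share_by_edition(biographies).items()):
        print(f"{lang}: {share:.0%} of sampled biographies are about women")

A coverage ratio of this kind captures only whether women appear at all; the portrayal differences the abstract emphasizes would require further analysis, for example of the language used in the articles themselves.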