Over the last few days the #faceappchallenge has taken over social media. This "challenge" involves downloading a selfie-editing tool called FaceApp and using one of its filters to digitally age your face. You then post the photo of your wizened old self on the internet and everyone laughs uproariously. You get a small surge of dopamine from gathering a few online likes before existential ennui sets in once again. On Monday, as the #faceappchallenge went viral, Joshua Nozzi, a software developer, warned people to "BE CAREFUL WITH FACEAPP", claiming that the app uploads users' photos without asking. Some media outlets picked up this claim and privacy concerns about the app began to mount. Concern escalated further when people started to point out that FaceApp is Russian. "The app that you're willingly giving all your facial data to says the company's location is in Saint-Petersburg, Russia," tweeted the New York Times's Charlie Warzel. And we all know what those Russians are like, don't we? They want to harvest your data for nefarious purposes.
The viral face-transforming FaceApp climbed to the top of the App Store on Wednesday. The growing popularity of FaceApp, a photo filter app that delights smartphone users with its ability to transform the features of any face, like tacking on years of wrinkles, has prompted Democratic Sen. Chuck Schumer to call for a federal investigation into the Russia-based company over what he says are potential national security and privacy risks to millions of Americans. "It would be deeply troubling if the sensitive personal information of U.S. citizens was provided to a hostile foreign power actively engaged in cyber hostilities against the United States," Schumer said in a letter to the FBI and the Federal Trade Commission. "I ask that the FTC consider whether there are adequate safeguards in place to prevent the privacy of Americans using this application, including government personnel and military service members, from being compromised," the senator wrote.
FaceApp by Wireless Lab OOO can make you look old, young, or even like the opposite gender. Seems like nearly everyone on Twitter is accepting the #FaceAppChallenge by posting aged photos of themselves. They are using FaceApp, an app available on Apple's App Store and the Google Play store, which lets you apply filters to your photos to transform your appearance: to look younger or older, or more masculine or feminine. The results can be shared on Twitter, Facebook and other social media sites. FaceApp, which uses artificial intelligence to create "neural face transformations," first gained prominence in spring 2017.
Artificial intelligence technology is advancing and bringing opportunities for society but also profound challenges for individual freedom. AI is a powerful enabler of surveillance technology, such as facial recognition, and many countries are grappling with appropriate rules for use, weighing the security benefits against privacy risks. Authoritarian regimes, however, lack strong institutional mechanisms to protect individual privacy (a free and independent press, civil society, an independent judiciary), and the result is the widespread use of AI for surveillance and repression. This dynamic is most acute in China, where the Chinese government is pioneering new uses of AI to monitor and control its population. China has already begun to export this technology along with laws and norms for illiberal uses to other nations.
We'll be talking about everyone's favorite topic at the moment: facial recognition. First San Francisco, then Somerville ... now Oakland: Oakland, California, has become the third US city to ban its local government from using facial recognition technology, after its council passed an ordinance this week. Council member Rebecca Kaplan submitted the ordinance for city officials to consider in June. The document describes the shortcomings of the technology and why it should be banned. "The City of Oakland should reject the use of this flawed technology on the following basis: 1) systems rely on biased datasets with high levels of inaccuracy; 2) a lack of standards around the use and sharing of this technology; 3) the invasive nature of the technology; 4) and the potential abuses of data by our government that could lead to persecution of minority groups," according to the ordinance.