
AI can now create fake porn, making revenge porn even more complicated

#artificialintelligence

In January this year, a new app was released that gives users the ability to swap out faces in a video with a different face obtained from another photo or video – similar to Snapchat's "face swap" feature. It's an everyday version of the kind of high-tech computer-generated imagery (CGI) we see in the movies. You might recognise it from the cameo of a young Princess Leia in the 2016 Star Wars film Rogue One, which combined the body of another actor with footage from the first Star Wars film, created 39 years earlier. Now, anyone with a high-powered computer, a graphics processing unit (GPU) and time on their hands can create realistic fake videos – known as "deepfakes" – using artificial intelligence (AI). The problem is that these same tools are accessible to those who seek to create non-consensual pornography of friends, work colleagues, classmates, ex-partners and complete strangers – and post it online. Read more: The picture of who is affected by 'revenge porn' is more complex than we first thought. In December 2017, Motherboard broke the story of a Reddit user known as "deepfakes", who used AI to swap the faces of actors in pornographic videos with the faces of well-known celebrities.


The Purge of AI-Assisted Fake Porn Has Begun

#artificialintelligence

It was only a matter of time before more sophisticated fake porn videos surfaced online. But a crackdown on this super-realistic fake porn is already beginning. Reddit and Gfycat, two popular platforms where users have been uploading the fake porn, have begun to eradicate the manipulated smut, which is often so convincing that it blurs the contours of reality itself. This type of fake porn, also referred to as deepfakes, involves mapping someone else's face onto a porn star's body. While fake porn has existed for years, free and more powerful tools using artificial intelligence now afford trolls a way to create far more realistic videos, provided they have enough images of their victim to recreate a scene.


Fake porn is the new fake news, and the internet isn't ready

Engadget

Ever since Facebook finally admitted to having a fake news problem, it's been trying to fix it. It hired thousands of people to help block fake ads, pledged to work with third-party fact-checking organizations and is busy building algorithms to detect fake news. But even as it attempts to fight back against fraudulent ads and made-up facts, another potential fake news threat looms on the horizon: Artificially generated fake video. Motherboard recently uncovered a disturbing new trend on Reddit, where users create AI-generated pornographic clips by swapping other people's faces onto porn stars. The outlet first reported on the phenomenon a month ago when Reddit user "deepfakes" posted a video of Gal Gadot's face swapped onto a porn star's body (he's since created more fake porn with other celebrities).


AI fake porn could cast any of us

#artificialintelligence

In the case of revenge porn, people often ask: if the photos were never taken in the first place, how could ex-partners – or hackers who steal nude photos – post them? We are now in the age of fake porn. Fake, as in, famous people's faces – or, for that matter, anybody's face – near-seamlessly stitched onto porn videos. As Motherboard reports, you can now find what appears to be actress Jessica Alba's face on porn performer Melanie Rios' body, actress Daisy Ridley's face on another porn performer's body and Emma Watson's face on an actress's nude body, all on Celeb Jihad – a celebrity porn site that regularly posts celebrity nudes, including stolen/hacked ones. The word "appears" is key.