Celebrity Deepfake Porn Cases Will Be Investigated by Meta Oversight Board

WIRED 

As AI tools become increasingly sophisticated and accessible, so too does one of their worst applications: non-consensual deepfake pornography. While much of this content is hosted on dedicated sites, it is increasingly finding its way onto social platforms. Today, the Meta Oversight Board announced that it is taking on cases that could force the company to reckon with how it deals with deepfake porn. The board, an independent body that can issue both binding decisions and recommendations to Meta, will focus on two deepfake porn cases, both involving celebrities whose images were altered to create explicit content. In one case, concerning an unnamed American celebrity, deepfake porn depicting the celebrity was removed from Facebook after it had already been flagged elsewhere on the platform.
