UK regulator wants to ban apps that can make deepfake nude images of children

Engadget 

The UK's Children's Commissioner is calling for a ban on AI deepfake apps that create nude or sexual images of children, according to a new report. The report states that such "nudification" apps have become so prevalent that many girls have stopped posting photos on social media. And while creating or uploading CSAM is illegal, the apps used to create deepfake nude images remain legal. "Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone -- a stranger, a classmate, or even a friend -- could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps," said Children's Commissioner Dame Rachel de Souza.
