Commissioner calls for ban on apps that make deepfake nude images of children
Artificial intelligence "nudification" apps that create deepfake sexual images of children should be immediately banned, amid growing fears among teenage girls that they could fall victim, the children's commissioner for England is warning.

Girls said they had stopped posting images of themselves on social media for fear that generative AI tools could be used to digitally remove their clothes or sexualise them, according to the commissioner's report on the tools, which draws on children's experiences.

Although it is illegal to create or share a sexually explicit image of a child, the technology that enables their creation remains legal, the report noted.

"Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone – a stranger, a classmate, or even a friend – could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps," the commissioner, Dame Rachel de Souza, said.
Apr-28-2025, 08:00:56 GMT