Tech firms will have 48 hours to remove abusive images under new law

BBC News

Tech platforms would have to remove intimate images which have been shared without consent within 48 hours, under a proposed UK law. The government said tackling intimate image abuse should be treated with the same severity as child sexual abuse material (CSAM) and terrorist content. Failure to abide by the rules could result in companies being fined up to 10% of their global sales or having their services blocked in the UK. Janaya Walker, interim director of the End Violence Against Women Coalition, said the "welcome and powerful move... rightly places the responsibility on tech companies to act". The proposals are being made through an amendment to the Crime and Policing Bill, which is making its way through the House of Lords.


UK to ban deepfake AI 'nudification' apps

BBC News

The UK government says it will ban so-called nudification apps as part of efforts to tackle misogyny online. New laws - announced on Thursday as part of a wider strategy to halve violence against women and girls - will make it illegal to create and supply AI tools letting users edit images to seemingly remove someone's clothing. The new offences would build on existing rules around sexually explicit deepfakes and intimate image abuse, the government said. "Women and girls deserve to be safe online as well as offline," said Technology Secretary Liz Kendall. "We will not stand by while technology is weaponised to abuse, humiliate and exploit them through the creation of non-consensual sexually explicit deepfakes."


'I don't take no for an answer': how a small group of women changed the law on deepfake porn

The Guardian

Charlotte Owen: 'The Lords were blown away by these brilliant women.'

For Jodie*, watching the conviction of her best friend, and knowing she helped secure it, felt at first like a kind of victory. It was certainly more than most survivors of deepfake image-based abuse could expect. They had met as students and bonded over their shared love of music. In the years since graduation, he'd also become her support system, the friend she reached for each time she learned that her images and personal details had been posted online without her consent.


'Would love to see her faked': the dark world of sexual deepfakes - and the women fighting back

The Guardian

It began with an anonymous email. "I'm genuinely so, so sorry to reach out to you," it read. Beneath the words were three links to an internet forum. "Huge trigger warning … They contain lewd photoshopped images of you." Jodie (not her real name) froze.