 sexual violence


AI-generated 'poverty porn' fake images being used by aid agencies

The Guardian

The charity said it wanted to safeguard "the privacy and dignity of real girls". AI-generated images of extreme poverty, children and sexual violence survivors are flooding stock photo sites and are increasingly being used by leading health NGOs, according to global health professionals who have voiced concern over a new era of "poverty porn". "All over the place, people are using it," said Noah Arnold, who works at Fairpicture, a Swiss-based organisation focused on promoting ethical imagery in global development. "Some are actively using AI imagery, and others, we know that they're experimenting at least."


UN 'gravely alarmed' by deteriorating situation in Sudan's el-Fasher

Al Jazeera

The United Nations secretary-general has called for an immediate ceasefire in Sudan's Darfur region after a deadly drone attack on Friday killed more than 70 worshippers in el-Fasher, expressing "grave" alarm about the "rapidly deteriorating situation". "The fighting must stop now," Antonio Guterres said in a statement issued by his spokesperson on Saturday, urging the warring parties to engage in dialogue and provide humanitarian corridors as the brutal civil war wracking the nation enters its third year. El-Fasher, the capital of the North Darfur region, remains the last major stronghold of the government-backed Sudanese Armed Forces (SAF) and its allies across Darfur. It has been under siege for more than a year by the paramilitary Rapid Support Forces (RSF), which launched a renewed offensive to capture the city in recent weeks. Humanitarian organisations have raised alarm about growing hunger in the city as hundreds of thousands of people remain trapped without access to food, medicine and other essentials.


At least 38 killed in drone attack on Sudan's el-Fasher: Activists

Al Jazeera

Sudanese paramilitaries have attacked the city of el-Fasher, killing at least 38 people, according to local activists, while international rights groups accuse the fighters of widespread sexual violence. The local resistance committee, a volunteer group coordinating aid in el-Fasher, said on Sunday that the paramilitary Rapid Support Forces (RSF) targeted the centre of the capital of North Darfur state "with four high-explosive missiles". The massacre followed an earlier drone attack on the city's Saudi Hospital on Friday, which killed nine people and wounded 20, forcing doctors to halt operations. World Health Organization (WHO) chief Tedros Adhanom Ghebreyesus described attacks on healthcare facilities across Sudan as "deplorable" in a post on X on Saturday. The RSF and Sudan's army have been locked in a power struggle since mid-April 2023, creating one of the world's worst humanitarian crises, with tens of thousands killed and more than 11 million displaced.


Predicting Femicide in Veracruz: A Fuzzy Logic Approach with the Expanded MFM-FEM-VER-CP-2024 Model

Medel-Ramírez, Carlos, Medel-López, Hilario

arXiv.org Artificial Intelligence

The article focuses on the urgent issue of femicide in Veracruz, Mexico, and the development of the MFM_FEM_VER_CP_2024 model, a mathematical framework designed to predict femicide risk using fuzzy logic. This model addresses the complexity and uncertainty inherent in gender-based violence by formalizing risk factors such as coercive control, dehumanization, and the cycle of violence. These factors are mathematically modeled through membership functions that assess the degree of risk associated with various conditions, including personal relationships and specific acts of violence. The study enhances the original model by incorporating new rules and refining existing membership functions, which significantly improve the model's predictive accuracy.
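The abstract does not reproduce the model's actual rules, but the core mechanism it describes, membership functions mapping each risk factor onto a degree in [0, 1] and fuzzy rules combining them, can be sketched minimally. The factor names, thresholds, and the min (AND) combination below are illustrative assumptions, not details taken from the MFM_FEM_VER_CP_2024 model itself.

```python
def ramp(x, low, high):
    """Membership degree rising linearly from 0 at `low` to 1 at `high`."""
    return max(0.0, min(1.0, (x - low) / (high - low)))

def risk_rule(coercive_control, dehumanization):
    """One Mamdani-style AND rule: the rule fires to the degree that
    BOTH factor memberships hold, i.e. their minimum."""
    high_control = ramp(coercive_control, 0.3, 0.8)   # hypothetical thresholds
    dehumanizing = ramp(dehumanization, 0.2, 0.7)     # hypothetical thresholds
    return min(high_control, dehumanizing)
```

A full system of this kind would aggregate many such rules and defuzzify the result into a single risk score; the sketch shows only the membership-and-rule step the abstract names.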


Google, Apple, and Discord Let Harmful AI 'Undress' Websites Use Their Sign-On Systems

WIRED

Major technology companies, including Google, Apple, and Discord, have been enabling people to quickly sign up to harmful "undress" websites, which use AI to remove clothes from real photos to make victims appear "nude" without their consent. More than a dozen of these deepfake websites have been using login buttons from the tech companies for months. A WIRED analysis found 16 of the biggest so-called undress and "nudify" websites using the sign-in infrastructure of Google, Apple, Discord, Twitter, Patreon, and Line. This approach lets people easily create accounts on the deepfake websites, lending them a veneer of credibility, before they pay for credits and generate images. While bots and websites that create nonconsensual intimate images of women and girls have existed for years, their number has increased with the introduction of generative AI.


We're Completely Unprepared for the Deepfake Porn Boom

Slate

Last week, A.I.-generated nude images of pop superstar Taylor Swift were produced and distributed without her consent. They circulated throughout the internet, with a single post on X (née Twitter) garnering 45 million views before the site took it down. Deepfakes, as they've come to be called in recent years, often target female celebrities, but with the rise of A.I., it's easier than ever for everyday people (almost always women) to be targeted. Last year, more than 143,000 deepfake porn videos were created, according to one estimate from the independent researcher Genevieve Oh, more than in all previous years combined. That number will, in all likelihood, only continue to rise.


North Carolina police search for suspect who allegedly followed, groped victim in a residence hall

FOX News

The University of North Carolina at Chapel Hill released photos of a suspect who allegedly committed a sexual assault at one of the campus's residence halls on Monday night. At about 10:40 p.m. on Monday, police sent an alert to students, faculty and staff, saying they were investigating a report of a groping or sexual assault at McClinton Residence Hall. The incident occurred at about 6:10 p.m.; according to the preliminary investigation, the suspect followed the victim into the building's lobby and stairwell. UNC Police are searching for a man who allegedly followed a student into a residence hall, groped her and left on Oct. 1, 2023.


Australia urges dating apps to improve safety standards after report says 75% of Australian users experience violence

FOX News

Australia's government said Monday the online dating industry must improve safety standards or be forced to make changes through legislation, responding to research that says three in four Australian users suffer some form of sexual violence through the platforms. Communications Minister Michelle Rowland said popular dating companies such as Tinder, Bumble and Hinge have until June 30 to develop a voluntary code of conduct that addresses user safety concerns. The code could include improving engagement with law enforcement, supporting at-risk users, improving safety policies and practices, and providing greater transparency about harms, she said.


A deep-learning approach to early identification of suggested sexual harassment from videos

Shetye, Shreya, Maiti, Anwita, Maiti, Tannistha, Singh, Tarry

arXiv.org Artificial Intelligence

Sexual harassment, sexual abuse, and sexual violence are prevalent problems in this day and age. Women's safety is an important issue that needs to be highlighted and addressed. Given this issue, we have studied each of these concerns and the factors that affect them based on images generated from movies. We have classified the three terms (harassment, abuse, and violence) based on the visual attributes present in images depicting these situations. We identified that factors such as the facial expressions of the victim and perpetrator and unwanted touching had a direct link to identifying scenes containing sexual harassment, abuse and violence. We also studied and outlined how state-of-the-art explicit content detectors such as Google Cloud Vision API and Clarifai API fail to identify and categorise these images. Based on these definitions and characteristics, we have developed a first-of-its-kind dataset from various Indian movie scenes. These scenes are classified as sexual harassment, sexual abuse, or sexual violence and exported in the PASCAL VOC 1.1 format. Our dataset is annotated on the identified relevant features and can be used to develop and train a deep-learning computer vision model to identify these issues. The dataset is publicly available for research and development.
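Because the dataset is exported in PASCAL VOC 1.1, a well-documented XML annotation format, its files can be loaded with the Python standard library alone. The sample annotation below is hypothetical (the filename, class label, and box coordinates are invented for illustration), but the parsing logic follows the standard VOC layout of `<object>` entries with `<name>` and `<bndbox>` children.

```python
import xml.etree.ElementTree as ET

# Hypothetical VOC 1.1 annotation; label and coordinates are illustrative only.
SAMPLE_VOC = """<annotation>
  <filename>scene_0001.jpg</filename>
  <size><width>1280</width><height>720</height><depth>3</depth></size>
  <object>
    <name>sexual_harassment</name>
    <bndbox><xmin>412</xmin><ymin>96</ymin><xmax>870</xmax><ymax>640</ymax></bndbox>
  </object>
</annotation>"""

def parse_voc(xml_text):
    """Return (filename, [(label, (xmin, ymin, xmax, ymax)), ...])
    from one PASCAL VOC annotation document."""
    root = ET.fromstring(xml_text)
    boxes = []
    for obj in root.iter("object"):
        bb = obj.find("bndbox")
        coords = tuple(int(bb.find(k).text)
                       for k in ("xmin", "ymin", "xmax", "ymax"))
        boxes.append((obj.find("name").text, coords))
    return root.find("filename").text, boxes
```

Parsed this way, each image maps to a list of labeled boxes that common detection frameworks can consume directly.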


Tinder now offers criminal background checks, but there's a big problem

The Guardian

As of this week, Tinder users will be able to run criminal background checks on their potential dates. The feature – launched in partnership with Garbo, a background check provider that aims to make public safety information more accessible – is intended to make Tinder users feel safer. But experts who specialize in sexual violence and surveillance have said the move is misguided, and risks amplifying the biases inherent in the criminal justice system. Background checks are blunt tools that gloss over some fundamental nuances, including that most people accused of sexual violence do not interact with the criminal justice system, said Albert Fox Cahn, the founder of the Surveillance Technology Oversight Project. Only 310 of 1,000 sexual assaults are reported to police, according to the anti-sexual violence organization RAINN.