Grok Is Being Used to Mock and Strip Women in Hijabs and Sarees
A substantial share of the AI images generated or edited with Grok target women in religious and cultural clothing. Among the vast and growing library of nonconsensual sexualized edits Grok has produced on request over the past week, many perpetrators asked xAI's bot to add or remove a hijab, a saree, a nun's habit, or other modest religious or cultural garments.

In a review of 500 Grok images generated between January 6 and January 9, WIRED found that around 5 percent of the output depicted a woman who, at users' prompting, had been either stripped of or made to wear religious or cultural clothing. Indian sarees and modest Islamic wear were the most common examples; the output also featured Japanese school uniforms, burqas, and early-20th-century-style bathing suits with long sleeves.

"Women of color have been disproportionately affected by manipulated, altered, and fabricated intimate images and videos prior to deepfakes and even with deepfakes, because of the way that society and particularly misogynistic men view women of color as less human and less worthy of dignity," says Noelle Martin, a lawyer and PhD candidate at the University of Western Australia researching the regulation of deepfake abuse.
Jan-10-2026, 00:23:08 GMT