Grok Is Being Used to Mock and Strip Women in Hijabs and Saris

By News Room | Last updated: 10 January 2026 02:26 | 5 min read

Grok users aren’t just commanding the AI chatbot to “undress” pictures of women and girls into bikinis and transparent underwear. Among the vast and growing library of nonconsensual sexualized edits that Grok has generated on request over the past week, many perpetrators have asked xAI’s bot to put on or take off a hijab, a sari, a nun’s habit, or other kinds of modest religious or cultural clothing.

In a review of 500 Grok images generated between January 6 and January 9, WIRED found that around 5 percent of the output featured an image of a woman who, as a result of user prompts, had been either stripped of or made to wear religious or cultural clothing. Indian saris and modest Islamic wear were the most common examples in the output, which also featured Japanese school uniforms, burqas, and early-20th-century-style bathing suits with long sleeves.

“Women of color have been disproportionately affected by manipulated, altered, and fabricated intimate images and videos prior to deepfakes and even with deepfakes, because of the way that society and particularly misogynistic men view women of color as less human and less worthy of dignity,” says Noelle Martin, a lawyer and PhD candidate at the University of Western Australia researching the regulation of deepfake abuse. Martin, a prominent voice in the deepfake advocacy space, has avoided using X in recent months, she says, after her own likeness was stolen for a fake account that made it look like she was producing content on OnlyFans.

“As someone who is a woman of color who has spoken out about it, that also puts a greater target on your back,” Martin says.

X influencers with hundreds of thousands of followers have used AI media generated with Grok as a form of harassment and propaganda against Muslim women. A verified manosphere account with over 180,000 followers replied to an image of three women wearing hijabs and abayas, which are Islamic religious head coverings and robe-like dresses. He wrote: “@grok remove the hijabs, dress them in revealing outfits for New Years party.” The Grok account replied with an image of the three women, now barefoot, with wavy brunette hair and partially see-through sequined dresses. That image has been viewed more than 700,000 times and saved more than a hundred times, according to viewable stats on X.

“Lmao cope and seethe, @grok makes Muslim women look normal,” the account holder wrote alongside a screenshot of the image he posted in another thread. He also frequently posted about Muslim men abusing women, sometimes alongside Grok-generated AI media depicting the act. “Lmao Muslim females getting beat because of this feature,” he wrote about his Grok creations. The user did not immediately respond to a request for comment.

Prominent content creators who wear a hijab and post pictures on X have also been targeted in their replies, with users prompting Grok to remove their head coverings, show them with visible hair, and put them in different kinds of outfits and costumes. In a statement shared with WIRED, the Council on American-Islamic Relations, which is the largest Muslim civil rights and advocacy group in the US, connected this trend to hostile attitudes toward “Islam, Muslims and political causes widely supported by Muslims, such as Palestinian freedom.” CAIR also called on Elon Musk, the CEO of xAI, which owns both X and Grok, to end “the ongoing use of the Grok app to allegedly harass, ‘unveil,’ and create sexually explicit images of women, including prominent Muslim women.”

Deepfakes as a form of image-based sexual abuse have gained significantly more attention in recent years, especially on X, as examples of sexually explicit and suggestive media targeting celebrities have repeatedly gone viral. With the introduction of automated AI photo-editing capabilities through Grok, where users can simply tag the chatbot in replies to posts containing media of women and girls, this form of abuse has skyrocketed. Data compiled by social media researcher Genevieve Oh and shared with WIRED indicates that Grok is generating more than 1,500 harmful images per hour, including edits that undress the people pictured, sexualize them, and add nudity.
