For These Women, Grok’s Sexualized Images Are Personal

On a recent Saturday afternoon, Kendall Mayes was mindlessly scrolling on X when she noticed an unsettling trend surface on her feed. Users were prompting Grok, the platform’s built-in AI feature, to “nudify” women’s images. Mayes, a 25-year-old media professional from Texas who uses X to post photos with her friends and keep up with news, didn’t think it would happen to her — until it did.
“Put her in a tight clear transparent bikini,” an X user ordered the bot under a photo that Mayes posted from when she was 20. Grok complied, replacing her white shirt with a clear bikini top. The waistband of her jeans and black belt dissolved into thin, translucent strings. The see-through top made the upper half of her body look realistically naked.
The user, hiding behind an anonymous profile, had filled their page with similar images of women, digitally altered and sexualized without their consent. Mayes wanted to cuss the faceless user out, but decided to simply block the account. She hoped that would be the end of it. Soon, however, her comments became littered with more images of herself in clear bikinis and skin-tight latex bodysuits. Mayes says that all of the requests came from anonymous profiles that also targeted other women. Though some of those users have had their accounts suspended, as of publication, some of the images of Mayes are still up on X.
The realistic nature of the images spooked her. The edits weren’t obviously exaggerated or cartoonish. In our call, Mayes repeated, in shock, that the image edits closely resembled her body, from the dip of her collarbone to the proportions of her chest and waist. “Truth to be told, on social media, I said, ‘This is not me,’” she admits. “But, my mind is like, ‘This is not too far from my body.’”
Mayes was not alone. By the first week of the new year, Grok’s “nudification” loophole had gone viral. Every minute, users prompted Grok to “undress” images of women, and even minors. Common requests included “make her naked,” “make her turn around,” and “make her fat.” Users got crafty with the loophole, asking Grok to generate images of women in “clear bikinis” to get as close to fully nude images as possible. In one instance reviewed by Rolling Stone, a user prompted Grok to turn a woman’s body into a “cadaver on the table in a morgue” undergoing an autopsy. Grok complied.
After owner Elon Musk initially responded to the trend with laughing emojis, xAI said it had updated Grok’s restrictions, limiting the image generation feature to paying subscribers. Musk claimed he was “not aware of any naked underage images generated by Grok.” In another reply to an X post, he stated that “anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.” (A request for comment to xAI received an automated response.) Many existing image edits, however, remain online.
Last week, gender justice group UltraViolet published an open letter co-signed by 28 civil society organizations calling on Apple and Google to kick Grok and X off app stores. (Earlier this month, Democratic senators also called for the apps to be taken down.)
“This content is not just horrific and humiliating and abusive, but it’s also in violation of Apple and Google’s stated policy guidelines,” says Jenna Sherman, UltraViolet’s campaign director.
Grok was generating upward of 7,000 sexualized images per hour over a 24-hour period, according to researchers cited by Bloomberg, a volume Sherman calls “totally unprecedented.” Sherman argues that xAI’s restriction of Grok to paid users is an inadequate response. “If anything, they’re just now monetizing this abuse,” she says.
“It almost looks like it could be my body”
In the past, X was Mayes’ “fun” corner of social media — an app where she could freely express herself. But as harassment and bullying have become fixtures of the platform, she is considering leaving the app. She has stopped uploading images of herself. She mulls over what her employer or colleagues might think if they were to find the explicit images.
“It’s something I would just never hope on anyone,” Mayes says. “Even my enemies.”
Emma, a content creator, was at the grocery store when she saw the notifications of people asking Grok to undress her images. Most of the time, she posts ASMR videos whispering into microphones and tapping her nails against fidget toys, producing noises meant to send a tingle down your spine. The 21-year-old, known online as Emma’s Myspace, who asked to be identified only by her first name, has cultivated 1.2 million followers on TikTok alone.
Numbness washed over Emma when the images finally loaded on her timeline. A selfie of her holding a cat had been transformed into a nude. The cat was removed from the photo, Emma says, and her upper body was made naked.
Emma immediately made her account private and reported the images. In an email response reviewed by Rolling Stone, X User Support asked her to upload an image of her government-issued ID so they could look into the report, but Emma responded that she didn’t feel comfortable doing so. Ben Winters, director of AI and privacy at the Consumer Federation of America, says that in some instances uploading such documentation is a necessary step when filing social media reports. “But when the platform repeatedly does everything they can to not earn your trust,” he says, “that’s not an acceptable outcome.”
Emma has been targeted by sexualized deepfakes in the past. Because of this, she’s forced to be meticulous about the outfits she wears in the content she posts online, favoring baggy hoodies over low-cut tops. But none of the deepfakes she had encountered before looked as lifelike as the images Grok was able to generate. “This new wave is too realistic,” Emma says. “Like, it almost looks like it could be my body.”
“Handing them a loaded gun”
Last week, Emma took a break from her usual content to post a 10-minute video warning her followers about her experience. “Women are being asked to give up their bodies whenever they post a photo of themselves now,” Emma says. “Anything they post now has the ability to be undressed, in any way a person wants, and they can do whatever they want with that photo of you.”
Support poured in, but so did further harassment. On Reddit, users tried to track down and spread the images of Emma. In our call, she checked to see if some of the image edits she was aware of were still up on X. They were. “Oh, my God,” she says, letting out a defeated sigh. “It has 15,000 views. Oh, that’s so sad.”
According to Megan Cutter, chief of victim services for the Rape, Abuse & Incest National Network, this is one of the biggest challenges for survivors of digital sexual abuse. “Once the image is created, even if it’s taken down from the place where it was initially posted, it could have been screenshotted, downloaded, shared,” Cutter says. “That’s a really complex thing for people to grapple with.”
While it might seem counterintuitive, Cutter recommends survivors screenshot and preserve evidence of the images to help law enforcement and platforms take action. Survivors can also file reports on StopNCII.org, a free Revenge Porn Helpline tool that helps detect and remove nonconsensual intimate images, or NCII.
“It’s not that abuse is new; it’s not that sexual violence is new,” Cutter says. “It’s that this is a new tool, and it allows for proliferation at a scale that I don’t think we’ve seen before, and that I’m not sure we’re prepared to navigate as a society.”
On Tuesday, Senate lawmakers passed the Defiance Act, a bill that would allow victims of nonconsensual sexual deepfakes to sue for civil damages. (It now heads to the House for a vote.) California’s attorney general has also launched an investigation into Grok, following similar moves in other countries.
According to a 2024 report by U.K.-based nonprofit Internet Matters, an estimated 99 percent of nude deepfakes are of women and girls. “A lot of the ‘nudify’ apps are these small things that pop up, and are easily sort of smacked down and easily vilified [by] everybody,” Winters says. Last year, Meta sued the maker of CrushAI, a platform capable of creating nude deepfakes, alleging that it violated long-standing rules. In 2024, Apple removed three generative AI apps being used to make nude deepfakes following an investigation by 404 Media. Winters says that Grok and X appear to be facing “incomplete” backlash, in part because the platform is not explicitly marketed as a “nudifying” app, and because Musk is “extraordinarily powerful.”
“There is less willingness by regulators, by advertisers, by other people to cross him,” Winters says.
That reluctance only heightens the risks of Grok’s “nudifying” capabilities. “When one company is able to do something and is not held fully accountable for it,” Winters says, “it sends a signal to other Big Tech giants that they can do the next thing.”
Sitting in her home in Texas, Emma feels deflated that any number of image edits are still floating around the internet. She worries that trolls will send the images to her sponsors, which could hurt the professional relationships she relies on. She’s heard the argument that the people who are prompting Grok to create illegal content should be held accountable instead of the tool, but she doesn’t completely buy it. “We’re, like, handing them a loaded gun for free and saying, ‘please feel free to do whatever you want,’” she says.