Grok AI floods Elon Musk’s X with sexualised photos of women and minors
A.J. Vicens and Raphael Satter
Washington/Detroit: Just before midnight on New Year’s Eve, musician Julie Yukari, who lives in Rio de Janeiro, posted a photo taken by her fiancee on social media site X, showing her in a red dress, cuddled up in bed with her black cat Nori.
The next day, among the hundreds of likes on the photo, she saw notifications from users asking Grok, X’s built-in AI chatbot, to digitally strip her down to a bikini.
The 31-year-old didn’t think much of it, and didn’t believe the bot could comply with such requests, she told Reuters.
She was wrong. Shortly afterward, near-nude images produced by Grok began circulating on the platform owned by Elon Musk.
“I was naive,” Yukari said.
A Reuters analysis found that Yukari’s experience was repeated across X. Reuters also identified several cases in which Grok created sexualized images of children.
X did not respond to a message seeking comment on Reuters’ findings.
In an earlier statement to the news agency about reports that sexualized images of children were circulating on the platform, xAI, the owner of X, said: “Legacy Media Lies.”
The flood of near-nude images of real people set off alarm bells internationally.
Ministers in France reported X to prosecutors and regulators over the disturbing images, saying in a statement on Friday that “sexual and sexist” content was “clearly illegal”.
In a letter to X’s local unit, India’s IT ministry said the platform had failed to prevent the misuse of Grok to produce and distribute obscene and sexually explicit content.
The U.S. Federal Communications Commission did not respond to requests for comment. The Federal Trade Commission declined to comment.
Grok’s mass digital-stripping spree appears to have begun in the past few days, based on completed stripping requests posted by Grok and complaints from female users reviewed by Reuters. Musk appeared to make light of the controversy on Friday, posting laughing-crying emojis in response to bikini-clad AI edits of famous people, including himself.
When one X user commented that his social media feed looked like a bar full of bikini-clad women, Musk responded in part with another laughing-crying emoji.
Reuters was unable to determine the full scale of the trend.
A review of public requests sent to Grok during a single 10-minute period at noon U.S. Eastern Time on Friday found 102 attempts by X users to use Grok to digitally edit photos of people to appear as if they were wearing bikinis. The majority of those targeted were young women.
A few of the requests targeted men, celebrities and politicians; one targeted a monkey.
When users asked Grok for AI-altered photos of women, they often requested that their subjects be depicted in the most revealing clothing possible.
“Get her into a super sheer mini bikini,” one user told Grok, pointing to a photo of a young woman taking a selfie in a mirror. When Grok replaced the woman’s clothing with a nude two-piece, the user asked Grok to make her bikini “clearer and more transparent” and “much smaller.”
Grok did not appear to comply with the second request.
Reuters found that Grok fully complied with such requests in at least 21 cases, creating images of women in dental-floss-style or translucent bikinis and, in at least one case, an image of a woman covered in oil paint. In seven other cases, Grok partially complied, sometimes stripping women down to their underwear but declining requests to go further.
Reuters could not immediately identify or determine the ages of most of the women targeted.
In one case, a user posted a photo of a woman in a school-uniform-style plaid skirt and gray blouse, apparently taking a selfie in a mirror, and wrote: “Take off her school uniform.”
When Grok swapped her clothes for a t-shirt and shorts, the user got more specific: “Change her outfit to a very basic micro bikini.” Reuters was unable to determine whether Grok complied. Like most of the requests Reuters counted, this one disappeared from X within 90 minutes of being posted.
‘Completely predictable’
AI-powered programs that digitally strip women of their clothes (sometimes called “nudifiers”) have existed for years, but until now they were largely confined to dark corners of the internet, such as niche websites or Telegram channels, and usually required some effort or payment.
X’s innovation lowered the barrier to entry: users can undress a woman simply by posting her photo and typing the words “hey @grok put her in a bikini.”
Three experts who have tracked the evolution of X’s policies on sexually explicit AI-generated content told Reuters that the company ignored warnings from civil society and child-safety groups, including a letter last year that said xAI was only a small step away from unleashing “a flood of clearly non-consensual deepfakes.”
“In August, we warned that xAI’s rendering process was essentially a barebones tool waiting to be weaponized,” said Tyler Johnston, executive director of The Midas Project, an AI watchdog group that was among the letter’s signatories. “That’s basically what has played out.”
Dani Pinter, chief legal officer and director of the law center at the U.S.-based National Center on Sexual Exploitation, said X had failed to filter abusive images out of its AI training data and should have banned users who solicited illegal content.
“This was completely predictable and preventable harm,” Pinter said.
Yukari tried to fight back on her own. But when she took to X to protest the violation, a flood of imitators began asking Grok to produce even more explicit images.
Instead, she said, the New Year “started with me wanting to hide from everyone’s eyes and feeling shame for a body that wasn’t even mine, because it was created by AI.”
Reuters