Teens Sue xAI Over Sexualized Images Generated by Grok

Earlier this year, Twitter was flooded with non-consensual nude images of people, including children, generated by xAI's Grok. It turns out the problem wasn't limited to Twitter. According to a lawsuit filed Monday in the Northern District of California and first reported by the Washington Post, a group of teenagers is suing Elon Musk's AI company over allegations that a person used xAI's model to generate sexualized images and videos of them. The case is the first instance in which minors have pursued legal action against the companies enabling the generation of non-consensual sexual material.

The class action suit, brought by three plaintiffs, two of them minors, alleges that xAI knowingly designed, marketed, and profited from the use of its image and video generation model, which was used to create sexually explicit material of people, including more than 18 girls who were harassed in the case that ultimately led to this lawsuit. The plaintiffs also allege that xAI failed to implement child sexual abuse material (CSAM) prevention measures that are otherwise considered an industry-standard protection.

At the core of the case against xAI is a truly harrowing situation for these teenage girls, who were reportedly harassed by an individual who spent months generating and distributing sexualized images of them. The perpetrator, who was arrested in December following a police investigation, according to the Washington Post, reportedly took photos and videos from the girls' social media accounts and used them to generate nude and sexually explicit images. Those images were then sold and traded across communities on Discord and Telegram, where they continue to circulate. Some of the girls became aware of the images only after being contacted on social media and told they were being spread. When police arrested the person responsible for making the images, they determined that he had used Grok to create them.

Grok was also used to generate non-consensual sexual images of people on Twitter, an estimated 23,000 photos that appeared to depict children in sexual situations, according to researchers who investigated the posts. At the time those images were spreading on Twitter, xAI (and Twitter) CEO Elon Musk claimed, "I am not aware of any naked underage images generated by Grok. Literally zero," and said, "When asked to generate images, it will refuse to produce anything illegal, as the operating principle for Grok is to obey the laws of any given country or state." At the time, Grok was being used to depict people, including children, in bikinis without their consent. Musk made posts following this trend, including an image depicting a rocket in a bikini, seemingly suggesting he was aware of the trend, whether or not he knew it was being used on images of children.

Weeks later, the company announced that it would add restrictions to image generation and made reference to people who "attempt to abuse the Grok account to violate the law," but it did not directly acknowledge the generation of CSAM. Musk and xAI have also promoted Grok's use for sexually explicit activity via its "Spicy" mode, which can be used for text, image, and video generation.

The class action suit alleges that the company and its CEO were more aware of how the tool was being used than they have let on, claiming they "saw a business opportunity: an opportunity to profit off the sexual predation of real people, including children." xAI did not respond to a request for comment regarding the lawsuit.