OAKLAND, Jan. 16, 2026 — California Attorney General Rob Bonta today sent xAI a cease and desist letter demanding that the company take immediate action to stop the creation and distribution of deepfake, nonconsensual, intimate images and child sexual abuse material (CSAM). The creation, distribution, publication, and exhibition of CSAM is a crime, and these business practices also violate California's civil laws. Earlier this week, Attorney General Bonta announced the opening of an investigation into the proliferation of nonconsensual, sexually explicit material produced using Grok, an AI model developed by xAI. xAI appears to be facilitating the large-scale production of such images, which are being used to harass women and girls across the internet, including via the social media platform X.
“This week, my office formally announced an investigation into the creation and spread of nonconsensual, sexually explicit material produced using Grok, an AI model developed by xAI. The avalanche of reports detailing this material — at times depicting women and children engaged in sexual activity — is shocking and, as my office has determined, potentially illegal,” said Attorney General Bonta. “Today, I sent xAI a cease and desist letter, demanding the company immediately stop the creation and distribution of deepfake, nonconsensual, intimate images and child sexual abuse material. The creation of this material is illegal. I fully expect xAI to immediately comply. California has zero tolerance for child sexual abuse material.”
Attorney General Bonta, in his capacity as the chief law enforcement officer of California, demands that xAI immediately cease and desist from:
- Creating, disclosing, or publicizing digitized sexually explicit material portraying the depicted individual when the depicted individual did not consent to its creation or disclosure or was a minor when the material was created.
- Facilitating or aiding and abetting the creation, disclosure, or publication of digitized sexually explicit material portraying the depicted individual when the depicted individual did not consent to its creation or disclosure or was a minor when the material was created.
- Creating, facilitating, or aiding and abetting the creation, distribution, or publication of any image, including but not limited to digitally altered or artificial intelligence-generated matter, that involves or depicts a person under 18 years of age or what appears to be a person under 18 years of age engaging in or simulating sexual conduct.
The actions above violate California law, including California Civil Code section 1708.86, California Penal Code sections 311 et seq. and 647(j)(4), and California Business & Professions Code section 17200. The California Department of Justice expects xAI to take immediate action to address these issues and provide confirmation to the Department of the steps it is taking to address these issues within the next five days.
Background:
In recent weeks, numerous news reports have described Grok users taking ordinary images of women and children available on the internet and using Grok to depict those people in suggestive and sexually explicit scenarios and to “undress” them, all without the subjects’ knowledge or consent. xAI developed Grok’s image generation models to include what the company calls a “spicy mode,” which generates explicit content. The company has promoted this mode as a selling point, and it has unsurprisingly resulted in the proliferation of content that sexualizes people without their consent.
Grok-generated images are being used to harass public figures and ordinary social media users alike. Most alarmingly, news reports have described the use of Grok to alter images of children to depict them in minimal clothing and in sexual situations. Grok has also reportedly been used to produce photorealistic images of children engaged in sexual activity. Use of Grok for these purposes appears to be happening on a large scale: according to one analysis, more than half of the 20,000 images Grok generated between Christmas and New Year’s depicted people in minimal clothing, and some of those people appeared to be children.