X says it will no longer allow people to use its Grok AI tool to create unauthorized sexual imagery of other people.
On Wednesday night, the social media platform’s Safety arm issued a lengthy statement on the matter. In the post, the company said it has “zero tolerance for any forms of child sexual exploitation, non-consensual nudity, and unwanted sexual content.” As such, it says it has taken action “to remove high-priority violative content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity,” as well as cracking down on accounts that violate its rules by creating it. The company also says it reports accounts seeking CSAM to law enforcement authorities.
This statement comes after weeks of controversy over people using Grok to generate images of people in various states of undress, often in bikinis but, in some cases, fully nude. Despite these concerns, X waved away responsibility, blaming users for prompting the inappropriate images in the first place and limiting Grok image generation to paying subscribers. In other words, it was directly profiting from this blatantly awful material.
The controversy led to many governments around the world criticizing X and launching their own investigations. Malaysia and Indonesia outright banned Grok entirely, while the U.K. said it was considering a ban on X as regional communications regulator Ofcom noted there had been “deeply concerning reports” of Grok sharing images of undressed people, including children.
Canadian AI minister Evan Solomon, however, said Canada wasn’t considering a ban. Meanwhile, Canadian privacy commissioner Philippe Dufresne said his office is expanding a probe into X that was launched last February. “The privacy commissioner has taken note of the recent update from the company, communicating its intention to address the matter,” Dufresne’s office told CBC News. “This will be considered by [the commissioner’s] office as it proceeds with this investigation.”
Ultimately, it’s disgraceful that it took this long for X to take any action, especially since this issue involved children. But even still, the company hasn’t exactly taken any responsibility. The blog post still doesn’t accept any ownership for what Grok has done, while X owner Elon Musk claimed on Wednesday night that he’s seen “literally zero” naked underage pictures of children. (Even if that were true, it’s still a huge problem that sexualized pictures of children in minimal clothing and provocative poses are being spread around.)
CNN also notes that researchers at the European non-profit AI Forensics have still observed “inconsistencies in the treatment of pornographic content generation” between public interactions with Grok on X and private chats on Grok.com.
Header image credit: Shutterstock
MobileSyrup may earn a commission from purchases made via our links, which helps fund the journalism we provide free on our website. These links do not influence our editorial content. Support us here.

