Grok under fire after complaints it undressed minors in photos
Elon Musk's Grok on Friday said it was scrambling to fix flaws in the artificial intelligence tool after users claimed it turned pictures of children or women into erotic images.
"We've identified lapses in safeguards and are urgently fixing them," Grok said in a post on X, formerly Twitter.
"CSAM (Child Sexual Abuse Material) is illegal and prohibited."
Complaints of abuse began hitting X after an "edit image" button was rolled out on Grok in late December.
The button allows users to modify any image on the platform -- with some users deciding to partially or completely remove clothing from women or children in pictures, according to complaints.
Grok maker xAI, run by Musk, replied to an AFP query with a terse, automated response that said: "the mainstream media lies."
The Grok chatbot, however, did respond to an X user who asked it about the matter after noting that a company in the United States could face criminal prosecution for knowingly facilitating, or failing to prevent, the creation or sharing of child pornography.
Media outlets in India reported on Friday that government officials there are demanding X quickly provide details of the measures the company is taking to remove "obscene, nude, indecent, and sexually suggestive content" generated by Grok without the consent of those in such pictures.
The public prosecutor's office in Paris, meanwhile, expanded an investigation into X to include new accusations that Grok was being used to generate and disseminate child pornography.
The initial investigation against X was opened in July following reports that the social network's algorithm was being manipulated for the purpose of foreign interference.
Grok has drawn criticism in recent months for a string of controversial statements, including remarks on the war in Gaza and the India-Pakistan conflict, antisemitic comments, and misinformation about a deadly shooting in Australia.