- cross-posted to:
- [email protected]
The AI tool Grok is estimated to have generated approximately 3 million sexualized images, including 23,000 that appear to depict children, since the launch of a Grok-powered image editing feature on X, according to a new analysis of a sample of images.[1]
The image-generating feature exploded in popularity on December 29th, shortly after Elon Musk announced a feature enabling X users to use Grok to edit images posted to the platform with one click.[2] The feature was restricted to paid users on January 9th in response to widespread condemnation of its use for generating sexualized images, and further technical restrictions blocking edits that undress people were added on January 14th.[3]
Elon is a white child raper or sympathizer. He’s probably jacking his non dick to a 6 mo old being diaper changed
Grok has no agency. Elon played a heavy role in the design of the tool, promotes and profits off its use, and has failed to stop users from producing this material with what can only be considered a feature of his software.
You're probably giving the guy too much credit. I think they use a version of FLUX. Replacing stuff is one of its core features.
Although he might've ordered them to add NSFW-related concepts, because that's definitely what he uses it for almost exclusively.
I’m not hearing where you disagree. You want me to add more nuance? For what, exactly? To absolve the world’s richest man of his CSAM machine?
Why aren’t I seeing “Grok floods X with naked, sexualised images of Trump and Musk” yet?
Why aren’t we seeing the person in charge in jail yet?
Right? It seems like they usually show up when there are naked children around.
I’m so out of the loop after deleting my Twitter account a while ago. Is it now just a porn generator instead of a website?
X.com is a porn website as the name suggests. I know, shocking.