Importantly, this took deepfake undressing from a tiny niche to a mass phenomenon:

This means it’s no longer a niche or genuinely exceptional thing; harassment of women with this method is now pervasive.
I thought the real problem was that it’s generating *illegal* porn.
Well, the CSAM stuff is unforgivable, but I seriously doubt even the soulless demon that is Elon Musk wants his AI tool generating that. I’m sure they’re working on it (it’s actually a hard computer science problem: the tool is supposed to generate whatever the user asks for, and there will always be an infinite number of ways to trick it, since LLMs aren’t actually intelligent).
Porn itself is not illegal.
He has 100% control over the ability to alter or pull this product. If he leaves it up while it’s generating illegal pornography, that’s on him.
And no s***, I’m concerned about the illegal stuff.