Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.
Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.
Step 2: This is the hardest step, but still totally feasible on a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that’s really good at turning blurry faces into that particular person’s face.
Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows, often with shockingly realistic results.
Cheers for the explanation, had no idea that’s how it works.
So it’s even worse than @[email protected] thinks: the person creating the deepfake would have to have access to CP in the first place if they want to deepfake it!
AI can generate images of things that don’t even exist. If it knows what porn looks like and what a child looks like, it can combine those concepts.
There are adults with bodies that resemble underage people that could be used to train models. Kitty Yung has a body that would qualify. You don’t necessarily need to train on illegal material to get illegal output.
You can probably do it with adult material and replace the faces. It will most likely work with a model specifically trained on the person you selected.
People have also put dots on people’s clothing to trick the brain into thinking they’re naked; you could probably fill those dots in with the correct body parts if you have a good enough model.