

Theoretically you could include the original signed, unprocessed image (or make it available NFT-style) and let the viewer decide whether the difference between it and the post-processed version is reasonable.
It would, however, make it impossible to partially censor an image without giving up the non-AI proof, unless you had a Trusted Third Party™ verify the original and re-sign the censored version.
A ‘view cryptographically signed original’ button next to every Instagram post would be complete LOL, though.
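
As a rough sketch of what that would take mechanically (hypothetical keys and the Python `cryptography` library here, not any camera vendor's actual scheme): the camera signs the raw bitmap at capture time, and any viewer holding the vendor's public key can check that the bytes are untouched.

```python
# Hypothetical sketch of the 'signed in camera' model, not any vendor's real
# scheme. In an actual camera the device key would live in a TPM/secure
# element and never leave it; here it is just an in-memory key.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

device_key = ec.generate_private_key(ec.SECP256R1())  # stand-in for the secure element
vendor_pubkey = device_key.public_key()               # what verifiers would ship with

def sign_capture(raw_bitmap: bytes) -> bytes:
    """Camera side: sign the raw sensor data at capture time."""
    return device_key.sign(raw_bitmap, ec.ECDSA(hashes.SHA256()))

def verify_capture(raw_bitmap: bytes, signature: bytes) -> bool:
    """Viewer side: unsigned, stripped, or edited images all fail this check."""
    try:
        vendor_pubkey.verify(signature, raw_bitmap, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

original = b"raw sensor bitmap bytes"
sig = sign_capture(original)
assert verify_capture(original, sig)                # untouched original verifies
assert not verify_capture(original + b"edit", sig)  # any post-processing breaks the proof
```

Note that this is exactly why the censorship problem above bites: the signature covers the original bitmap, so any edit at all invalidates it.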
The point is that any unsigned image is assumed to be AI-generated. You can absolutely strip the metadata or convert it to some other format (there’s always the analog hole, and it has to become a bitmap to be displayed), but then you’ve lost the proof that you took it.
You’d still need secure key-storage hardware and trust roots in the camera, like TPMs, but every phone already has that…
(This is referring to the ‘signed in camera’ model.)
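
For the trust-root part, here's a hedged sketch of the attestation pattern (illustrative keys only; a real chain would use X.509 device certificates, the way TPM attestation does): the viewer trusts a vendor root, the root certifies the device key at manufacture time, and the device key signs the capture.

```python
# Illustrative sketch of the trust-root idea, not a real attestation API.
# A real chain would use X.509 certificates; signing raw public-key bytes
# keeps the sketch short.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.exceptions import InvalidSignature

vendor_root = ec.generate_private_key(ec.SECP256R1())  # held by the vendor
device_key = ec.generate_private_key(ec.SECP256R1())   # held in the phone's TPM

# Vendor endorses the device key once, at manufacture time.
device_pub_bytes = device_key.public_key().public_bytes(
    serialization.Encoding.X962, serialization.PublicFormat.UncompressedPoint)
device_cert_sig = vendor_root.sign(device_pub_bytes, ec.ECDSA(hashes.SHA256()))

def viewer_verify(image: bytes, image_sig: bytes,
                  dev_pub_bytes: bytes, cert_sig: bytes) -> bool:
    """Accept only images signed by a device key the vendor root vouches for."""
    try:
        # 1. Does the claimed device key chain to the trusted vendor root?
        vendor_root.public_key().verify(
            cert_sig, dev_pub_bytes, ec.ECDSA(hashes.SHA256()))
        # 2. Did that device key sign this exact bitmap?
        device_pub = ec.EllipticCurvePublicKey.from_encoded_point(
            ec.SECP256R1(), dev_pub_bytes)
        device_pub.verify(image_sig, image, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

image = b"raw sensor bitmap bytes"
image_sig = device_key.sign(image, ec.ECDSA(hashes.SHA256()))
assert viewer_verify(image, image_sig, device_pub_bytes, device_cert_sig)
```

The verifier only ever needs the vendor root baked in, which is the same trust model phones already ship with for secure boot and key attestation.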