• katy ✨@piefed.blahaj.zone · 3 hours ago

    They obviously did if they banned him for it; and if they’re training on CSAM and refuse to do anything about it, then yeah, they have a connection to it.

    • Devial@discuss.online · 3 hours ago (edited)

      Also, the data set wasn’t hosted, created, or explicitly used by Google in any way.

      It was a common data set used in various academic papers on training nudity detectors.

      Did you seriously just read the headline, guess what happened, and are now arguing based on that guess that I, who actually read the article, am wrong about its content? Because that’s sure what it feels like reading your comments…

    • Devial@discuss.online · 3 hours ago (edited)

      So you didn’t read my comment then, did you?

      He got banned because Google’s automated monitoring system, entirely correctly, detected that the content he unzipped contained CSAM. It wasn’t even a manual decision to ban him.

      His ban had literally nothing whatsoever to do with the fact that the CSAM was part of an AI training data set.