• JcbAzPx@lemmy.world · 18 hours ago

    I think it would help to define the causal relationship you’re referring to. What exactly about skin color changes the outcome of the test?

    • quick_snail@feddit.nl · 18 hours ago

      It doesn’t. The test was designed for white people. It’s the same reason that facial recognition has more false positives for people with more melanin in their skin.

      The problem is the test, not the skin. That’s my point.

        • usrtrv@sh.itjust.works · 16 hours ago

          It’s more about cultural and socioeconomic bias. So you might have questions that are biased towards middle-class suburban kids versus poor inner-city kids.

          You can find examples online, but some can be quite subtle: “Banana is to yellow as ruby is to ______.” Someone who grew up around jewelry would be more likely to get this right. That goes against the principle of an IQ test, and older tests were notorious for it.

      • CeeBee_Eh@lemmy.world · 14 hours ago

        It’s the same reason that facial recognition has more false positives for people with more melanin in their skin.

        This is not the real reason. It’s because camera tech from more than 10 years ago was worse than today’s and had trouble with anything less than ideal lighting conditions. Darker surfaces reflect less light, so the darker someone’s skin, the less detail a camera can capture.

        However, we’re still talking about roughly a 0.001 FMR (false match rate) for white men versus a 0.002 FMR for black men. That’s “2x more false matches”, but the absolute difference is only 0.001.
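
        To put the relative and absolute framings side by side, here’s a minimal arithmetic sketch (a hypothetical snippet, using the FMR values quoted above):

        ```python
        # Relative vs. absolute difference between the quoted false match rates.
        fmr_white = 0.001  # FMR for white men, as quoted above
        fmr_black = 0.002  # FMR for black men, as quoted above

        relative_ratio = fmr_black / fmr_white  # 2.0 -> "2x more false matches"
        absolute_diff = fmr_black - fmr_white   # 0.001 -> small in absolute terms

        print(f"relative: {relative_ratio:.1f}x, absolute: {absolute_diff:.3f}")
        ```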

        With modern cameras and recent facial recognition tech, the difference in accuracy across skin colours is virtually non-existent. Yes, I know about the news stories of false arrests in recent years, but no tech is perfect, and we’re talking about a few instances out of billions.

        No, I’m not defending the use of the tech, just pointing out facts.