• wuffah@lemmy.world · edited 10 hours ago

    I’m of the strong opinion that we control the media we are exposed to, and that the remedy for problematic or undesirable media is to simply turn it off.

    However: advertising, LLMs, social media, and the Internet have forced me to concede that certain forms of media constitute a legitimate memetic hazard, capable of fueling addiction, misinformation, and general misery at large enough scales. I hate this conclusion because, while I still heavily err on the side of media liberty and self-control, I cannot square that value with the reality of poisonous, hostile mass media.

    We should not be subjected to predatory practices in order to enjoy the products and services we depend on, or the entertainment that is part of our shared experience and culture. Loot boxes, advertising, and financial scams are becoming nearly universal in popular games, and in software generally. To me, these practices eventually become effectively unavoidable, which amounts to monopolistic behavior that must be regulated.

    • RightHandOfIkaros@lemmy.world · 9 hours ago

      To be fair, much of the memetic hazard posed by various technologies is not actually the fault of the technologies, but of the person having no self-control, taking no accountability for their own actions, or having some form of undiagnosed medical issue they are unaware of.

      It’s like saying video games cause school shootings: the problem isn’t the video games, it’s the person. The video games are an excuse to shift blame and accountability away from the person.

      • ngdev@lemmy.zip · 3 hours ago

        What you say has merit, but it’s akin to saying that the memetic hazard posed by heroin isn’t the fault of heroin. Sure, heroin is just a substance. Certain software is similar, but it’s deliberately made a certain way (dark patterns in gaming, etc.) and should be regulated for harm reduction just like addictive substances.

        • RightHandOfIkaros@lemmy.world · 2 hours ago

          Okay, but if we took care of the problem people have, legal regulation would not be necessary. We wouldn’t have to have a trillion laws stipulating all the minutiae of what we should or shouldn’t do based on how harmful it is or isn’t; people would be able to figure this out on their own. Fewer laws in general is better, when the population is intelligent enough to understand that you don’t drink bleach just because a computer screen showed you those words in that order.

          Opiates wouldn’t need to be illegal because people would be intelligent enough to know how harmful they are and thus wouldn’t use them. A law wouldn’t need to be created listing every known or unknown opiate derivative that is banned, or for what use. People would just be smart enough to know.

          Basically, too many people aren’t using their own brains. AI is definitely a helpful tool, but not if you’re an idiot who believes it has any actual intelligence. It’s not there to replace your doctor or teacher; it’s there to help you with word processing, pattern recognition, and other such language-based tasks. AI used as a tool is queried for things like “check this passage for overly repetitive terms and suggest improvements that keep the same meaning.” AI used by an idiot is queried for things like “what do my lab results say about my health?”

          I suppose this is too far advanced for humanity at this point. Laws are important, but needing too many of them speaks to a general decline in intelligence.