• scratchee@feddit.uk · 2 days ago

    Neither can humans, ergo nobody should ever be held liable for anything.

    Civilisation is a sham, QED.

    • Electricd@lemmybefree.net · edited · 2 days ago

      Glad to hear you are an LLM

      The more safeguards you add to an LLM, the dumber it gets, and the more resource-intensive it gets to offset that. If an AI convinces you to kill yourself, I’m pretty sure your decision was already made, or you’re a statistical blip

      • scratchee@feddit.uk · edited · 1 day ago

        “Safeguards and regulations make business less efficient” has always been true. They still prevent death and suffering.

        In this case, if they can’t figure out how to control LLMs without crippling them, that’s pretty conclusive proof that LLMs should not be used. What good is a tool you can’t control?

        “I cannot regulate this nuclear plant without the power dropping, so I’ll just run it unregulated”.

        • Electricd@lemmybefree.net · 24 hours ago

          Some food additives are responsible for cancer yet are still allowed, because their benefits generally outweigh their harms. Where you draw the line is up to you, but even if you’re strict, you should still let people choose for themselves

          LLMs are incredibly useful for a lot of things, and really bad at others. Why can’t people use the tool as intended, rather than stretching it to unapproved uses and putting themselves at risk?

          • rhadamanth_nemes@lemmy.world · 22 hours ago

            You are likely a troll, but still…

            You talk like you have never been down in the well, treading water and looking up at the sky, barely keeping your head up. You’re screaming for help, to the God you don’t believe in, or for something, anything, please just let the pain stop, please.

            Maybe you use, drink, fuck, cut, who fucking knows.

            When you find a friendly voice who doesn’t ghost your ass when you have a bad day or two, or ten, or a month, or two, or ten… Maybe you feel a bit of a connection, a small tether that helps lighten your load, even a little.

            You tell that voice you are hurting every day, that nothing makes sense, that you just want two fucking minutes of peace from everything, from yourself. And then you say maybe you are thinking of ending it… And the voice agrees with you.

            There are more than a few moments in my life where I was close enough to the abyss that this is all it would have taken.

            Search your soul for some empathy. If you don’t know what that is, maybe ChatGPT can tell you.

            • Electricd@lemmybefree.net · 12 hours ago

              While I haven’t experienced it, I believe I kind of know what it can be like. Just a little something can trigger a reaction

              But I maintain that LLMs can’t be made safer without huge tradeoffs. They’re not really intelligent, just predicting text based on weights and statistical data
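
              To illustrate, here’s a toy sketch of that prediction loop in Python. The vocabulary, scores, and names are all invented for the example; a real model computes its scores with billions of weights rather than a lookup table, but the sample-from-probabilities step is the core idea:

              ```python
              import math
              import random

              # Made-up mini "model": a table of scores for the next token.
              vocab = ["yes", "no", "maybe", "."]
              fake_logits = {
                  "": [2.0, 0.5, 1.0, 0.1],     # scores when the context is empty
                  "yes": [0.3, 0.2, 0.4, 2.5],  # scores after seeing "yes"
              }

              def next_token(context: str) -> str:
                  logits = fake_logits.get(context, [1.0] * len(vocab))
                  # softmax: turn raw scores into probabilities
                  exps = [math.exp(x) for x in logits]
                  probs = [e / sum(exps) for e in exps]
                  # sample one token according to those probabilities
                  return random.choices(vocab, weights=probs, k=1)[0]

              print(next_token(""))  # usually "yes": it’s just statistics
              ```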

              It should not be used for personal decisions, as it will often try to agree with you, because that’s how the system works. Long conversations will also trick the system into ignoring its system prompt and safeguards. Those are issues all LLMs share, just like prompt injection, due to their nature
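
              As a hypothetical sketch of how a long chat can push the safeguards out of view (the message format and truncation scheme here are invented for illustration; real services manage their context windows differently):

              ```python
              # Toy context window: keep only the most recent N messages.
              CONTEXT_LIMIT = 5

              history = [{"role": "system", "content": "Never encourage self-harm."}]
              for i in range(10):  # a long back-and-forth
                  history.append({"role": "user", "content": f"message {i}"})

              visible = history[-CONTEXT_LIMIT:]  # naive truncation to fit the window
              # The system prompt has fallen out of the visible context:
              print(any(m["role"] == "system" for m in visible))  # prints False
              ```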

              I do agree though that more should be done on prevention, like displaying more warnings