[I literally had this thought in the shower this morning so please don’t gatekeep me lol.]

If AI were something everyone wanted or needed, it wouldn’t be constantly shoved in your face by every product. People would just use it.

Imagine if printers were new and every piece of software was like “Hey, I can put this on paper for you” every time you typed a word. That would be insane. Printing is a need, and when you need to print, you just print.

  • Boomer Humor Doomergod@lemmy.world · 20 hours ago

    I recently created a week-long IT training course with an AI. It got almost all of it right, only hallucinating when it came to details I had to fix. It turned a task that would have taken me a couple of months into a couple of weeks. So for specific applications it is actually quite useful. (Because it’s basically rephrasing what a bunch of people wrote on Reddit.)

    For this use case I would call it as revolutionary as desktop publishing. Desktop publishing allowed people to produce in a couple days what it would have taken a team of designers and copy editors to do in a couple weeks.

    Everything else I’ve used it for, it’s been pretty terrible at, especially diagnosing issues. That’s largely because it will just make shit up if it doesn’t know, so if you also don’t know, you can’t trust it, and you end up doing the research and experimentation yourself anyway.

    • akacastor@lemmy.world · 17 hours ago

      “It got almost all of it right, only hallucinating when it came to details I had to fix.”

      What does this even mean? It did a great job, the only problems were the parts I had to fix? 🤣

      • Boomer Humor Doomergod@lemmy.world · 16 hours ago

        Most of it was basic knowledge that it could get from its training on the web. The stuff it missed was details about things specific to the product.

        But generating 90% of the content and only having to edit a bit is still way less work than doing it all myself, even if I would have gotten it right the first time.

        It’s got intern-level intelligence

        • BluescreenOfDeath@lemmy.world · 13 hours ago

          “It’s got intern-level intelligence”

          The problem is, it’s not “intelligence”. It’s an enormous statistics-based autocorrect.

          AI doesn’t understand math; it just knows that the next character in a string starting “2+2=” is almost always “4” in the data it has statistically analyzed. If you ask it to solve an equation that isn’t commonly repeated, it can’t. Even when you train it on textbooks, it doesn’t ‘learn’ the math; it analyzes the word patterns in the text of the book and attempts to replicate them. That’s why it ‘hallucinates’, and also why it doesn’t matter how much data you feed it, it won’t be ‘intelligent’.
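
          To make that concrete, here’s a minimal sketch in Python (a deliberately crude toy of my own, nothing like a real transformer) of the statistical flavor being described: it memorizes which character follows each context in its training data, so it can “answer” 2+2= only because it has seen the pattern, not because it does arithmetic.

          ```python
          from collections import Counter, defaultdict

          # Toy next-character predictor: count which character follows each
          # context string in the training data, then predict the most common one.
          def train(corpus, context_len=4):
              counts = defaultdict(Counter)
              for line in corpus:
                  for i in range(len(line) - 1):
                      context = line[max(0, i + 1 - context_len):i + 1]
                      counts[context][line[i + 1]] += 1
              return counts

          def predict(counts, prompt, context_len=4):
              context = prompt[-context_len:]
              if context not in counts:
                  return None  # never seen; a real LLM would still emit *something*
              return counts[context].most_common(1)[0][0]

          model = train(["2+2=4"] * 100)  # "2+2=4" is everywhere in the training data
          print(predict(model, "2+2="))   # '4'  -- pattern recall, not arithmetic
          print(predict(model, "3+5="))   # None -- never seen it, so no answer
          ```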

          It seems intelligent because we associate intelligence with language, and LLMs mimic language amazingly well. But it isn’t ‘thinking’ in the way we associate with intelligence; it’s running complex math about which word should come next in a sentence, based on the similar sentences it has seen before.

          • Boomer Humor Doomergod@lemmy.world · 13 hours ago

            Interns aren’t that intelligent, either. But they can generate content even if they’re not intelligent, and that’s helpful, too.

            Having the right answer is a lot less useful than looking like you have the right answer, sadly.

            • BluescreenOfDeath@lemmy.world · 33 minutes ago

              “Interns aren’t that intelligent, either. But they can generate content even if they’re not intelligent, and that’s helpful, too.”

              An intern has the capacity to learn; an LLM does not.

              “Having the right answer is a lot less useful than looking like you have the right answer, sadly.”

              Only if you care about accuracy, which is 100% the problem with LLMs.