• pahlimur@lemmy.world · 6 hours ago

    I’ll take a step back. These LLMs are interesting, and they’re being trained in interesting new ways. They’re becoming more ‘accurate’, I guess, though ‘accuracy’ is very subjective and can be manipulated.

    Machine learning is still the same though.

    LLMs still will never generalize beyond their training data.

    My point is it’s not early anymore. We are near or past the peak of LLM development. The extreme amount of resources being thrown at it is a sign that we are near the end.

    That sub shouldn’t be used to justify anything, just like any other subreddit at any point in time.

    • NotMyOldRedditName@lemmy.world · 6 hours ago

      > My point is it’s not early anymore. We are near or past the peak of LLM development.

      I think we’re just going to have to agree to disagree on this part.

      I’ll agree though that IF what you’re saying is true, then they won’t succeed.

      • pahlimur@lemmy.world · 6 hours ago

        Fair enough. I’d be fine being wrong.

        Improved efficiency would reduce the catastrophic energy demands LLMs will otherwise have in the future. If your scenario comes true, it would at least reduce their environmental impact.

        We’ll see. This isn’t the first “it’s the future” technology I’ve seen, and I’m barely 40.

        • NotMyOldRedditName@lemmy.world · edited · 5 hours ago

          I just wanted to add one other thing on the hardware side.

          These H200s are power hogs, no doubt about it. But the next generation, the H300 or whatever it is, will be more efficient as the process node (or whatever it’s called) gets smaller and the hardware gets optimized to run things faster. I could still see NVIDIA coming out and charging more $/FLOP, or whatever the right comparison is, even if it’s more power-efficient.

          But that could mean the electricity cost of running these models starts to drop if they truly have plateaued. We might not be following Moore’s law on this anymore (I don’t actually know), but we’re not completely stagnant either.

          So IF we are plateaued on this one aspect, then costs should start coming down in future years.

          Edit: but they are locking in a lot of overhead costs at today’s prices, which could ruin them.
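
          To put the “costs come down if we’ve plateaued” idea in rough numbers, here’s a quick back-of-envelope sketch. Every figure in it (workload size, FLOP/s, wattage, electricity price) is a made-up placeholder, not a real H200 or next-gen spec:

          ```python
          # Back-of-envelope: electricity cost to serve a FIXED (plateaued)
          # yearly workload on two GPU generations. Every number here is an
          # illustrative assumption, not a real H200 (or "H300") spec.

          WORKLOAD_FLOPS = 1e24   # assumed total compute needed per year
          PRICE_PER_KWH = 0.10    # assumed electricity price, $/kWh

          generations = {
              # name: (assumed sustained FLOP/s per GPU, assumed power in watts)
              "gen n":     (1.0e15, 700),  # stand-in for an H200-class card
              "gen n + 1": (2.0e15, 900),  # faster successor, higher power draw
          }

          for name, (flops_per_s, watts) in generations.items():
              gpu_seconds = WORKLOAD_FLOPS / flops_per_s  # GPU-time needed
              joules = gpu_seconds * watts                # energy consumed
              kwh = joules / 3.6e6                        # 1 kWh = 3.6e6 J
              print(f"{name}: {kwh:,.0f} kWh -> ${kwh * PRICE_PER_KWH:,.0f}")
          ```

          Under those made-up numbers, a card that’s twice as fast but draws ~30% more power still cuts the energy bill by about a third for the same fixed workload, which is the plateau argument in miniature.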