It doesn’t take that much energy to run an LLM or an image generator, and sure, it would take a lot with so many users connecting across so many servers… but there’s just no way they’re not mining bitcoin. My math might not be mathing, but it seems like AI doesn’t justify the power use, and it seems like everybody’s lying.

Someone who knows more about this, please inform me of your opinions.
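
For what it’s worth, here’s a back-of-envelope sanity check on whether the math maths. Every figure in it is an assumed placeholder, not a measurement:

```python
# Back-of-envelope sanity check: does serving an LLM to many users
# plausibly add up to data-center-scale power? All figures below are
# assumed placeholders, not measurements.

WH_PER_QUERY = 3.0      # assumed energy per chat query, in watt-hours
QUERIES_PER_DAY = 1e9   # assumed daily queries across all users
PUE = 1.3               # assumed overhead factor for cooling etc.

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY * PUE / 1000
avg_power_mw = daily_kwh / 24 / 1000   # average continuous draw, MW

print(f"~{daily_kwh:,.0f} kWh/day, ~{avg_power_mw:,.0f} MW average draw")
# With these assumptions: ~3,900,000 kWh/day, roughly 162 MW of
# continuous draw -- data-center scale from inference alone.
```

Under those (made-up) assumptions, inference alone already reaches data-center scale without any hidden mining.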

  • JakenVeina@lemmy.world · 13 points · 2 hours ago

    The big power use is in building/training models, not running them. Not that running them is insignificant, either.
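
    A rough sketch of that split, with every number an assumed placeholder rather than a figure for any real model:

    ```python
    # Rough comparison of one-time training cost vs ongoing inference.
    # Every number is an illustrative guess, not a real model's figure.

    GPU_COUNT = 10_000     # assumed GPUs in the training cluster
    GPU_POWER_KW = 0.7     # assumed draw per GPU, in kW
    TRAINING_DAYS = 90     # assumed wall-clock training time

    training_mwh = GPU_COUNT * GPU_POWER_KW * TRAINING_DAYS * 24 / 1000
    print(f"Training: ~{training_mwh:,.0f} MWh, one-time")  # ~15,120 MWh

    WH_PER_QUERY = 3.0     # assumed inference energy per query, in Wh
    queries_to_match = training_mwh * 1e6 / WH_PER_QUERY
    print(f"Matched after ~{queries_to_match:,.0f} queries")  # ~5 billion
    # Training dominates up front, but heavy usage catches up over time,
    # which is why running them isn't insignificant either.
    ```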

  • slazer2au@lemmy.world · 2 points · 44 minutes ago

    They are both a waste of power. Just because one is less of a waste doesn’t make it a good choice.

  • sbird@sopuli.xyz · 3 points · edited · 1 hour ago

    Running LLMs does take a lot of power. Trying to run even a very small model on my old laptop makes the fans go crazy, and the power draw must be high given how hot it gets. Then when you get to models with billions upon billions of parameters, you can see how it all adds up (rough numbers in the sketch below). Then you multiply these humongous models by millions of users, and add all the energy required to cool all those servers (fans, pumps, etc.).

    The biggest power draw in the AI industry is probably training the models, given that it quite literally means going through the entire internet for every crumb of data and mixing it all together into something that can produce coherent text and images. Then you see how power grids can be taken down by these power-hungry data centers.
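
    To put rough numbers on the scaling point above: the ~2 FLOPs-per-parameter-per-token estimate for a forward pass is a common rule of thumb, but the efficiency and usage figures here are pure assumptions:

    ```python
    # How parameters x tokens x users multiply out, using the common
    # ~2 FLOPs-per-parameter-per-token estimate for inference.
    # Efficiency and usage figures are assumptions, not measurements.

    PARAMS = 70e9            # assumed model size: 70B parameters
    TOKENS_PER_REPLY = 500   # assumed tokens generated per reply
    FLOPS_PER_JOULE = 1e11   # assumed effective GPU efficiency

    joules_per_reply = 2 * PARAMS * TOKENS_PER_REPLY / FLOPS_PER_JOULE
    wh_per_reply = joules_per_reply / 3600   # joules -> watt-hours

    USERS = 10e6             # assumed daily active users
    REPLIES_PER_USER = 20    # assumed replies per user per day
    daily_kwh = wh_per_reply * USERS * REPLIES_PER_USER / 1000

    print(f"{wh_per_reply:.2f} Wh/reply, ~{daily_kwh:,.0f} kWh/day total")
    # ~0.19 Wh per reply is tiny; ~39,000 kWh/day fleet-wide shows the
    # multiplication doing the work -- before cooling or training costs.
    ```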

    • sbird@sopuli.xyz · 2 points · 1 hour ago

      If you’ve seen any videos of people visiting these data centers, you’ll have heard the incredibly loud fans that make it difficult to hear the presenter speaking.