For one month beginning on October 5, I ran an experiment: Every day, I asked ChatGPT 5 (more precisely, its “Extended Thinking” version) to find an error in “Today’s featured article”. In 28 of these 31 featured articles (90%), ChatGPT identified what I considered a valid error, often several. I have so far corrected 35 such errors.

  • chronicledmonocle@lemmy.world · 13 hours ago · +20 / −3

    Congrats. You just burned down 4 trees in the rainforest for every article you had an LLM analyze.

    LLMs can be incredibly useful, but everybody forgets how much of an environmental nightmare this shit is.

    • GooseFinger@sh.itjust.works · 3 hours ago · +1

      Had to look up ChatGPT’s energy usage because you made me curious.

      Seems like OpenAI claims GPT-4o uses about 0.34 Wh per “query.” This is apparently consistent with third-party estimates. The average Google search is about 0.03 Wh, for reference.

      The issue is that “query” isn’t defined, and this figure may cover only the GPUs’ energy consumption, omitting the rest of the picture (conversion losses, cooling, infrastructure, etc.). It’s also unclear whether the figure was measured during model training or during normal use.

      I also briefly saw estimates that GPT-5 uses between 18 and 40 Wh per query, roughly 50-120 times the GPT-4o figure. The OP used GPT-5.

      It sounds like the energy consumption is relatively bad no matter how it’s spun, but it also replaces other forms of compute and reduces workload for people, so the net energy tradeoff may not be that bad. Consider the task from the OP: how much longer, or how many more people, would it take to accomplish the same result that GPT-5 and the lone author did? I bet the net energy difference isn’t that far from zero.
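
      For a rough sense of scale, here is the arithmetic on the numbers quoted in this thread (0.34 Wh per GPT-4o query, 0.03 Wh per Google search, 18-40 Wh per GPT-5 query, 31 articles in the OP's experiment). A minimal Python sketch, using only those claimed/estimated figures rather than any measured values:

      ```python
      # Back-of-the-envelope comparison using the figures quoted in this thread;
      # all inputs are claimed/estimated numbers, not measurements.

      GPT4O_WH = 0.34            # OpenAI's claimed energy per GPT-4o query
      GOOGLE_SEARCH_WH = 0.03    # Google-search figure quoted above
      GPT5_WH_RANGE = (18, 40)   # rough per-query estimate quoted above for GPT-5
      ARTICLES = 31              # featured articles in the OP's month-long experiment

      # Per-query ratios versus GPT-4o and a Google search.
      for wh in GPT5_WH_RANGE:
          print(f"GPT-5 at {wh} Wh/query: "
                f"{wh / GPT4O_WH:.0f}x GPT-4o, "
                f"{wh / GOOGLE_SEARCH_WH:.0f}x a Google search")

      # Total energy if all 31 queries ran at GPT-5 rates.
      low, high = (wh * ARTICLES for wh in GPT5_WH_RANGE)
      print(f"{ARTICLES} queries: {low:.0f}-{high:.0f} Wh "
            f"({low / 1000:.2f}-{high / 1000:.2f} kWh)")
      ```

      Even at the high end that is on the order of one kWh for the whole month, which is the scale the “net energy difference” comparison would have to beat.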

      Here’s the article I found: https://towardsdatascience.com/lets-analyze-openais-claims-about-chatgpt-energy-use/

    • Pika@rekabu.ru · 12 hours ago · +1 / −4

      Not much when you use an already trained model, actually.

      • SoftestSapphic@lemmy.world · 12 hours ago (edited) · +3

        Unfortunately, unless you’re hosting your own model, or using something like DeepSeek, which had a cutoff on its training data, it’s a perpetually training model.

        Every time you ask ChatGPT something, it’s horrible for the world. It digs us a little deeper into an unsalvageable situation that will probably make us go extinct.