• very_well_lost@lemmy.world · 8 hours ago

    people will stop using it for all the things they’re currently using it for

    They will when AI companies can no longer afford to eat their own costs and start charging users a non-subsidized price. How many people would keep using AI if it cost $1 per query? $5? $20?

    OpenAI lost $5 billion last year. Billion, with a B. Even their premium customers lose them money on every query, and eventually the faucet of VC cash propping this whole thing up is gonna run dry when investors inevitably realize that there’s no profitable business model to justify this technology. At that point, AI firms will have no choice but to pass their costs on to the customer, and there’s no way the customer is going to stick around when they realize how expensive this technology actually is in practice.

    • thaklor@lemmy.world · 2 hours ago

      I remember this happening with Uber too. All that VC money dried up, their prices skyrocketed, people stopped using them, and they went bankrupt. A tale as old as time.

      • AFaithfulNihilist@lemmy.world · 1 hour ago

        A lot of those things have a business model that relies on putting the competition out of business so you can jack up the price.

        Uber broke taxis in a lot of places, completely upending that industry by simply ignoring the laws. But Uber had a thing it could actually sell that people would buy.

        It took years before it started making money, in an industry that already made money.

        LLMs don’t even have a path to profitability unless they can either functionally replace a human job or at least reliably perform a useful task without human intervention.

        They’ve burned all these billions and still don’t have something that functions as well as the search engines that preceded them, no matter how much they want to force you to use it.

    • Womble@piefed.world · 8 hours ago

      There are free open models you can go and download right now that are better than the SOTA of 12–18 months ago, and that cost less to run on a gaming PC than playing COD does. Even if OpenAI, Anthropic, et al. disappeared without a trace tomorrow, AI wouldn’t go away.
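
      As a very rough illustration of what "running an open model on a gaming PC" involves (the model file name, settings, and prompt below are placeholders, not recommendations), a minimal sketch with llama-cpp-python might look like this:

```python
# Minimal local-inference sketch using llama-cpp-python (pip install llama-cpp-python).
# The GGUF path is a placeholder: substitute whatever open-weights model you actually
# downloaded (e.g. a quantized ~7B model that fits on a gaming-class GPU).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/open-model-7b-q4_k_m.gguf",  # hypothetical file name
    n_ctx=4096,       # context window sized for a typical consumer GPU
    n_gpu_layers=-1,  # offload all layers to the GPU if VRAM allows
)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Briefly explain why local models keep working even if the big vendors disappear."}],
    max_tokens=256,
)
print(reply["choices"][0]["message"]["content"])
```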

      • baggachipz@sh.itjust.works · 3 hours ago

        And those are useful tools, which will always be around. The current “AI” industry bubble is predicated on total world domination by an AGI, which is not technically possible given the underpinnings of the LLM methodology. Sooner or later, the people with the money will realize this. They’re stupid, so it may take a while.

    • FaceDeer@fedia.io · 8 hours ago

      I run local LLMs and they cost me $0 per query. I don’t plan to charge myself more than that at any point, even if the AI bubble bursts.

      • Nephalis@discuss.tchncs.de · 8 hours ago

        Really? I get what you’re saying, but at the very least the power consumption of the machine the model runs on is yours to pay for. Depending on your energy price, it’s not $0 per query.
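
        To put rough numbers on that (every figure below is an assumed example, not a measurement), a back-of-the-envelope cost per query might work out like this:

```python
# Back-of-the-envelope electricity cost of one local LLM query.
# All three inputs are assumptions; plug in your own hardware draw and tariff.
gpu_draw_kw = 0.35        # assumed GPU + system power draw while generating, in kW
seconds_per_query = 20    # assumed generation time for a typical answer
price_per_kwh = 0.30      # assumed electricity price in $/kWh

energy_kwh = gpu_draw_kw * seconds_per_query / 3600
cost_per_query = energy_kwh * price_per_kwh
print(f"{energy_kwh:.5f} kWh -> ${cost_per_query:.5f} per query")
# ~0.00194 kWh -> ~$0.00058: not literally zero, but a small fraction of a cent.
```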

        • FaceDeer@fedia.io · 7 hours ago

          It’s so near zero it makes no difference. It is not a noticeable factor in my decision on whether to use it or not for any given task.

          The training of a brand new model is expensive, but once the model has been created it’s cheap to run. If OpenAI went bankrupt tomorrow and shut down, the models it had trained would just be sold off to other companies and they’d run them instead, free from the debt burden that OpenAI accrued from the research and training costs that went into producing them. That’s actually a fairly common pattern for first movers: they spend a lot of money blazing the trail, and then other companies follow along afterwards and eat their lunch.

          • wewbull@feddit.uk · 2 hours ago

            It’s cheap to run for one person. Any service running it isn’t cheap when it has a good number of users.

      • very_well_lost@lemmy.world · edited · 8 hours ago

        That’s great if they actually work. But my experience with the big, corporate-funded models has been pretty freaking abysmal after more than a year of trying to adopt them into my daily workflow. I can’t imagine local models perform better when they’re trained on much, much smaller datasets and run with much, much less computing power.

        I’m happy to be proven wrong, of course, but I just don’t see how it’s possible for local models to compete with the Big Boys in terms of quality… and the quality of the largest models is only middling at best.

        • FaceDeer@fedia.io · 7 hours ago

          You’re free to not use them. Seems like an awful lot of people are using them, though, including myself. They must be getting something out of using them or they’d stop too.

          • expr@programming.dev · 1 hour ago

            Just because a lot of people are using them does not necessarily mean they are actually valuable. Your claim assumes that people are acting rationally regarding them, but that’s an erroneous assumption to make.

            People are falling in “love” with them. Asking them for advice about mental health. Treating them like some kind of all-knowing oracle (or as if they had any intelligence whatsoever), when in reality they know nothing and cannot reason at all.

            Ultimately they are immensely effective at creating a feedback loop that preys on human psychology and reinforces a dependency on it. It’s a bit like addiction in that way.