• adarza@lemmy.ca · 14 days ago

    AGI (artificial general intelligence) will be achieved once OpenAI has developed an AI system that can generate at least $100 billion in profits

    nothing to do with actual capabilities… just the ability to make piles and piles of money.

    • LostXOR@fedia.io · 14 days ago

      Guess we’re never getting AGI then; there’s no way they end up with that much profit before this whole AI bubble collapses and their value plummets.

      • hemmes@lemmy.world · 14 days ago

        AI (LLM software) is not a bubble. It’s been effectively implemented as a utility framework across many platforms. Most of those platforms are using OpenAI’s models. I don’t know when or if that’ll make OpenAI 100 billion dollars, but it’s not a bubble - this is not the .COM situation.

        • lazynooblet@lazysoci.al · 14 days ago

          The vast majority of those implementations are worthless, mostly ignored by their intended users and seen as a useless gimmick.

          LLMs have their uses, but right now companies are pushing them into every area to see what sticks.

          • Benjaben@lemmy.world · 14 days ago

            Not the person you replied to, but I think you’re both “right”. The ridiculous hype bubble (I’ll call it that for sure) put “AI” everywhere, and most of those are useless gimmicks.

            But there are also already uses that offer things I’d call novel and useful enough to have some staying power, which also means they’ll be iterated on and improved to whatever degree there is useful stuff there.

            (And just to be clear, an LLM - no matter the use cases and bells and whistles - seems completely incapable of approaching any reasonable definition of AGI, to me)

            • Auli@lemmy.ca · 13 days ago

              I think people misunderstand what a bubble is. The .com bubble happened, but the internet was useful and stayed around. The AI bubble doesn’t mean AI isn’t useful, just that most of the chaff will disappear.

              • kbal@fedia.io · 13 days ago

                The dotcom bubble was based on technology that had already been around for ten years. The AI bubble is based on technology that doesn’t exist yet.

        • Auli@lemmy.ca · edited 13 days ago

          It’s a bubble. It doesn’t mean the tech does not have its uses. And it is exactly like the .com situation.

          • suy@programming.dev · 13 days ago

            I think that “exactly like” is absurd. Bubbles are never “exactly” like the previous ones.

            I think in this case there is clear economic value in what they produce (from the POV of capitalism, not humanity’s best interests), but the cost is far too high for it to be economically viable; hence, it is a bubble. In the dot com bubble, by contrast, many companies had very dubious value in the first place.

            • skulblaka@sh.itjust.works · 13 days ago

              there is clear economic value in what they produce

              There is clear economic value in chains of bullshit that may or may not ever have a correct answer?

    • Dr. Moose@lemmy.world · edited 14 days ago

      The context here is that OpenAI has a contract with Microsoft until they reach AGI. So it’s not a philosophical term but a business one.

    • Harvey656@lemmy.world · 13 days ago

      For fun, I put this idea to ChatGPT using similar logic:

      Hello, can you explain to me how a strict profit generated number would result in a specific goal, such as if I make 1000 dollars I would suddenly be capable of owning a house. I find the idea strange and backwards.

      You’re correct to find this idea odd—it reflects a misunderstanding of how financial goals, like owning a house, relate to income. Here’s why this thinking is flawed:

      1. Profit vs. Net Worth: Making $1,000 in profit doesn’t directly equate to being able to afford a house because buying a house requires not just income but also savings, creditworthiness, and a down payment. Profit is only a small piece of the puzzle.

      2. Costs and Expenses: Owning a house involves more than just the purchase price. You need to consider closing costs, property taxes, maintenance, and potentially ongoing mortgage payments. $1,000 would likely not be enough to cover even a fraction of these.

      3. Leverage and Financing: Most people don’t buy houses outright; they use loans or mortgages. To secure a mortgage, lenders look at your total financial picture: income, savings, debt, and credit score. Simply generating $1,000 won’t improve these factors significantly.

      4. Specificity of the Goal: “Making $1,000” is a general financial milestone, not a specific strategy tailored to the requirements of owning a home. A more actionable goal would involve budgeting, saving for a down payment, and improving credit.

      5. The Backwards Logic: The idea that reaching an arbitrary profit threshold suddenly makes you eligible for something like owning a house conflates short-term gains with long-term financial stability. A better approach is to reverse-engineer the goal: start by determining the total cost of the house, then work backwards to figure out how much you need to save monthly, how much income you need to earn, and what steps are required to secure financing.

      Example of Goal-Oriented Thinking:

      If you wanted to own a $200,000 house:

      You might need a 10% down payment ($20,000).

      Closing costs could add $5,000–$10,000.

      Monthly income would need to support a mortgage payment (let’s say $1,200/month).

      Steps would include saving regularly, reducing debt, and improving credit, rather than focusing on an isolated profit milestone like $1,000.

      Summary:

      Focusing on a single, arbitrary profit number like $1,000 doesn’t align with the comprehensive planning required for significant financial goals like home ownership. Instead, success depends on a holistic view of your finances and structured goal-setting.
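
      For what it’s worth, the arithmetic in that house example is easy to run yourself. Below is a minimal Python sketch using only the illustrative figures from the reply above (a $200,000 house, 10% down, $5,000–$10,000 closing costs, a $1,200/month payment), plus an assumed ~30%-of-gross-income housing rule of thumb that is my addition, not part of the original reply:

      ```python
      # Back-of-the-envelope version of the house example above.
      # Figures are the illustrative ones from the ChatGPT reply; the 30%
      # housing-cost rule of thumb is an added assumption, not from the reply.

      house_price = 200_000
      down_payment = 0.10 * house_price        # $20,000
      closing_costs = (5_000, 10_000)          # estimated range
      monthly_payment = 1_200

      upfront_low = down_payment + closing_costs[0]
      upfront_high = down_payment + closing_costs[1]

      # Rough rule of thumb: keep housing costs under ~30% of gross income.
      income_needed = monthly_payment / 0.30

      print(f"Upfront cash needed: ${upfront_low:,.0f}-${upfront_high:,.0f}")
      print(f"Gross monthly income to support the payment: about ${income_needed:,.0f}")
      ```

      Which is the reply’s point: hitting an arbitrary $1,000 profit figure says nothing about whether the $25,000–$30,000 upfront and the ongoing income requirement are covered.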