Imagine a game like “The Sims” where you can adjust how autonomous the sims you control are. I could see AI being used to control that.

Or an Elder Scrolls game where you can respond however you want and the NPCs adapt to it.

    • SGforce@lemmy.ca

      LLM-driven NPCs would increase hardware requirements significantly and be generally pretty bad and repetitive. It’s going to take some time before that changes.

      • Pika@sh.itjust.works

        Games already have pretty extensive requirements as it is; one model running for all NPCs wouldn’t be that bad, I don’t think. It would raise RAM requirements by maybe a gig or two and likely slow down initial loading time as the model has to load in. I’m running a pretty decent model on my home server that handles the duties of a personified character, and the CT I’m running Ollama on only has 3 GB allotted to it. And that’s not even using the GPU, which would speed it up tremendously.
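
        Roughly, a minimal sketch of that kind of setup, assuming Ollama’s /api/generate endpoint on its default port; the model name, persona text, and helper function are placeholders, not any particular game’s integration:

        ```python
        # Sketch: one small locally hosted model (served by Ollama) answering in
        # character for any NPC. Model name and persona text are placeholders.
        import json
        import urllib.request

        OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port
        MODEL = "llama3.2:3b"  # assumed small model; a ~3B quant fits in a few GB of RAM

        def npc_reply(persona: str, player_line: str) -> str:
            """Ask the shared model to answer as one specific NPC."""
            prompt = (
                f"You are an NPC in a video game.\n{persona}\n"
                f'The player says: "{player_line}"\n'
                "Reply in character, in one or two sentences."
            )
            payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
            req = urllib.request.Request(OLLAMA_URL, data=payload,
                                         headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read())["response"].strip()

        print(npc_reply("You are Brynja, a grumpy blacksmith who distrusts outsiders.",
                        "Can you repair my sword?"))
        ```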

        I think the bigger problem would be on the testing side; that would be a royal pain in the butt to manage, having to make a profile/backstory for every character you want running on the LLM. You would likely need a boilerplate ruleset and then a few basic rules per character to model it after. But the personality would never be the same player to player, nor would it be accurate; for example, I can definitely see the model trying to give advice that is impossible for the player to actually act on because it isn’t in the game’s code.
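
        As a rough sketch of what that boilerplate ruleset plus per-character profile could look like, with a whitelist so the model can only ask for actions the game’s code actually implements; every name, rule, and action here is invented for illustration:

        ```python
        # Sketch: a shared boilerplate ruleset, a per-NPC profile/backstory, and a
        # guard that rejects any action the game engine doesn't actually support.
        from dataclasses import dataclass

        BOILERPLATE_RULES = (
            "Stay in character at all times.\n"
            "Never mention the real world, mods, or game mechanics by name.\n"
            "If the NPC should act, end the reply with ACTION: <one allowed action>."
        )

        @dataclass
        class NPCProfile:
            name: str
            backstory: str
            allowed_actions: tuple[str, ...]  # only things the engine implements

            def system_prompt(self) -> str:
                return (
                    f"{BOILERPLATE_RULES}\n\n"
                    f"You are {self.name}. {self.backstory}\n"
                    f"Allowed actions: {', '.join(self.allowed_actions)}."
                )

            def validate_action(self, reply: str) -> str | None:
                """Return the requested action if the game supports it, else None."""
                if "ACTION:" not in reply:
                    return None
                action = reply.rsplit("ACTION:", 1)[1].strip().lower()
                return action if action in self.allowed_actions else None

        smith = NPCProfile(
            name="Brynja",
            backstory="A grumpy blacksmith who distrusts outsiders.",
            allowed_actions=("open_shop", "follow_player", "refuse"),
        )
        print(smith.system_prompt())
        print(smith.validate_action("Fine, come in. ACTION: open_shop"))   # "open_shop"
        print(smith.validate_action("Let's fly to the moon. ACTION: fly")) # None
        ```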

        • hayvan@piefed.world

          That would be crazy expensive for the studios. LLM companies are selling their services at a loss at the moment.

          • lmmarsano@lemmynsfw.com

            crazy expensive

            Citation missing, so that’s unconvincing. We’re not talking about a general-purpose LLM here. Are pretrained, domain-specific LLMs or SLMs “crazy expensive” to run?