• 0 Posts
  • 434 Comments
Joined 2 years ago
Cake day: June 30th, 2023




  • For me, this specific issue is more personal. It’s not about waging a war against slopgenerators; it’s about trust. Trust is built gradually, item by item: my trust in a company is the sum of its actions over time. They make a good game - trust goes up. They commit to providing good working conditions for their workers and actually follow through - trust goes up again. Starting to use slopgen reduces that trust. It reduces it enough that I no longer trust what they say. It’s not that I’m declaring war on them or anything, but they’ve lost enough of my trust that when they merely say something, I don’t believe it outright - the same way I treat any other company, because for a company the line going up matters more than anything else, and honouring your word demonstrably doesn’t make the line go up. Before this, my trust in Larian was high enough that I might believe a public declaration from them; as it stands, I don’t anymore, and that’s pretty much the extent of my approach to it.
    All of their previous games I preordered, played in early access, or bought the second the option was available. This one I wouldn’t.





  • My point is pretty simple: they said they only use LLMs “for good”, but the more they gain, the more incentive they have to lie, so the “but they said [bla bla]” argument can’t hold. If they’ve started using it for something, the only things stopping them from using it for everything are their reputation and the desire to make a good game - and the more money is on the line, the less that desire weighs against immediate profits.
    I love everything Larian did before - I’m a huge fan of the Divinity series, and BG3 is still in my top 5 favourite games of all time - but that doesn’t mean it can’t all go to shit. Wouldn’t be the first time.



  • Also, corpos are allowed - and in some cases required - to lie, even the “good ones” like Larian. And now, when they have more money than ever, they’re less trustworthy than ever. This slope is very slippery. Nothing stops them from overextending, and when a lot of money is involved, I can foresee: “well, we need to finish quickly, and we already use LLMs anyway, so let them help with the script; and since it’s already in the script, no reason not to let them generate some art; and well, since it’s already everywhere, why don’t we generate the code with it too”.