☆ Yσɠƚԋσʂ ☆

  • 553 Posts
  • 530 Comments
Joined 6 years ago
Cake day: January 18th, 2020

  • That’s like asking what’s the difference between a chef who has memorized every recipe in the world and a chef who can actually cook. One is a database and the other has understanding.

    The LLM you’re describing is just a highly sophisticated autocomplete. It has read every book, so it can perfectly mimic the syntax of human thought, including the words, the emotional descriptions, and the moral arguments. It can put on a flawless textual facade. But it has no internal experience. It has never burned its hand on a stove, felt betrayal, or tried to build a chair and had it collapse underneath it.

    AGI implies a world model: an internal, causal understanding of how reality works that we build through continuous interaction with it. If we get AGI, then it’s likely going to come from robotics. A robot learns that gravity is real and that “heavy” isn’t an abstract concept but a physical property that changes how you move. It has to interact with its environment and develop a predictive model that allows it to accomplish its tasks effectively.

    This embodiment creates a feedback loop that LLMs completely lack: action -> consequence -> learning -> updated model. An LLM can infer from the past, but an AGI would reason about the future because it operates under the same fundamental rules we do. Your super-LLM is just a library of human ghosts. A real AGI would be another entity in the world.
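    A minimal sketch of what that loop can look like in code, purely as an illustration: the Environment, WorldModel, and learning rule below are made-up toy stand-ins, not any real system.

        import random

        class Environment:
            """Toy 'physics': the real effect of an action, unknown to the agent."""
            TRUE_EFFECT = 2.0

            def step(self, action):
                # Consequence: what actually happens when the agent acts, plus noise.
                return self.TRUE_EFFECT * action + random.gauss(0, 0.05)

        class WorldModel:
            """Toy predictive model: a single belief about how actions change the world."""
            def __init__(self):
                self.effect = 0.0  # current belief about the action's effect

            def predict(self, action):
                return self.effect * action

            def update(self, action, observed, lr=0.1):
                # Learning: nudge the belief toward the observed consequence.
                error = observed - self.predict(action)
                self.effect += lr * error * action

        world, model = Environment(), WorldModel()
        for _ in range(200):
            action = random.uniform(-1, 1)     # act
            consequence = world.step(action)   # consequence
            model.update(action, consequence)  # learn -> updated model

        print(f"learned effect ~ {model.effect:.2f}, true effect = {Environment.TRUE_EFFECT}")

    The point isn’t the toy math; it’s that the model only improves by acting and being corrected by consequences, which is exactly the loop a text-only LLM never closes.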

  • I genuinely don’t know what you’re arguing anymore, because your logic is completely backwards. You’re blaming the GPL for “enshittification” and bloat, which is utterly nonsensical. The license has fuck all to do with how lean or bloated a piece of software is; that’s a result of developer priorities and corporate roadmaps. The GPL’s entire purpose is to enforce freedom, and a key part of that freedom is the right to fork a project and strip out the bloat yourself if the main version goes off the rails.

    You then admit that corporate contributions are valuable, but your proposed solution is to let them keep their work proprietary, which is the very thing that accelerates enshittification. You’re arguing that to stop companies from making software worse, we should give them a free pass to take public labor, build their own walled gardens, and contribute nothing back. That’s just corporate apologia that encourages the exact freeloading the GPL was designed to prevent. Your entire point is a self-contradictory mess.