

  • Even if there are tight time constraints, you won’t sacrifice quality, because that would make you slower.

    Too right. People find this so hard to understand. I think they dramatically underestimate how quickly technical debt catches up with you.

    I am currently working in a startup that has the classic “we’re a startup, quality doesn’t matter” attitude. They think that they might not be around in a year so it’s best to go fast and not give a shit about tech debt.

    In my experience that attitude bites in under 6 months. I’m already wasting entire days sorting out messes that they neglected to deal with.






  • Everyone is talking past each other because there are so many different ways of using AI and so many things you can use it for. It works OK for some and fails miserably for others.

    Lots of people only see one half of that and conclude “it’s shit” or “it’s amazing” based on an incomplete picture.

    The devs you respect probably aren’t working on CRUD apps and landing pages and little hacky Python scripts. They’re probably writing compilers or game engines or whatever. So of course it isn’t as useful for them.

    That doesn’t mean it doesn’t work for people mocking up a website or whatever.











  • Right, I’m not saying it isn’t simpler in terms of syntax. My point was that the syntax is simpler in a way that makes it worse: it’s easier for computers to read but harder for humans.

    it was only later discovered that they can be compiled down to native code.

    That sounds extremely unlikely. I think you’re misinterpreting this quote (which is fair enough; it’s not very clear):

    Steve Russell said, look, why don’t I program this eval … and I said to him, ho, ho, you’re confusing theory with practice, this eval is intended for reading, not for computing. But he went ahead and did it. That is, he compiled the eval in my paper into IBM 704 machine code, fixing bugs, and then advertised this as a Lisp interpreter, which it certainly was. So at that point Lisp had essentially the form that it has today …

    As far as I can tell, Lisp was always intended to be compiled and executed. That quote is about compiling the eval function (which was only meant as a description of how Lisp is evaluated, not as runnable code) into machine code and using that as an interpreter.

    Also, I skimmed the paper that quote is from, and in fact Lisp was intended as a language for AI programs to write (in the same way that we now get AI to write and execute Python to solve problems), which explains a lot. It wasn’t designed for humans to write, so why bother with nice syntax; just have the machine write the AST directly!

    (I expect that was only part of the motivation tbf, but still!)
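
    As a rough illustration of what “compiling eval” means, here’s a toy McCarthy-style eval over s-expressions, sketched in Python. This is my own simplification for illustration, not the code from the 1960 paper:

        import operator

        def lisp_eval(expr, env):
            """Evaluate an s-expression represented as nested Python lists."""
            if isinstance(expr, str):            # a symbol: look it up
                return env[expr]
            if not isinstance(expr, list):       # a number or other atom
                return expr
            op, *args = expr
            if op == "quote":                    # (quote x) -> x, unevaluated
                return args[0]
            if op == "if":                       # (if test then else)
                test, then, alt = args
                return lisp_eval(then if lisp_eval(test, env) else alt, env)
            if op == "lambda":                   # (lambda (params) body) -> closure
                params, body = args
                return lambda *vals: lisp_eval(body, {**env, **dict(zip(params, vals))})
            fn = lisp_eval(op, env)              # otherwise: apply fn to evaluated args
            return fn(*[lisp_eval(a, env) for a in args])

        env = {"+": operator.add, "*": operator.mul}
        # ((lambda (x) (* x x)) (+ 2 3))  =>  25
        print(lisp_eval([["lambda", ["x"], ["*", "x", "x"]], ["+", 2, 3]], env))

    McCarthy wrote something like this in Lisp itself as a definition of the language; Russell hand-compiled it into IBM 704 machine code, and the result was an interpreter you could actually run.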


  • This comment perfectly captures why I don’t like Lisp. Essentially “it’s simple: this easy-to-read code transforms into this AST”. Lisp basically says “we can make parsing way easier if we force programmers to write the AST directly!”, which is really stupid because computers can perfectly well parse syntax that is easy for humans to read and turn it into ASTs automatically.

    It makes it easier to parse for computers at the cost of being much harder to parse for humans, which is really the wrong choice in most cases. (The exception is if you’re DIYing your compiler; if you’re teaching how to write a compiler, for example, then Lisp is a good target.)
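
    To make the “easier to parse for computers” point concrete: a complete reader for s-expressions fits in a few lines. This is my own toy sketch in Python, not any particular Lisp’s reader:

        def tokenize(src: str) -> list[str]:
            # Pad parentheses with spaces, then split on whitespace.
            return src.replace("(", " ( ").replace(")", " ) ").split()

        def read(tokens: list[str]):
            """Turn a token stream into nested lists, i.e. directly into the AST."""
            token = tokens.pop(0)
            if token == "(":
                node = []
                while tokens[0] != ")":
                    node.append(read(tokens))
                tokens.pop(0)  # drop the closing ")"
                return node
            return token  # an atom

        # "(+ 1 (* 2 3))" parses straight into ['+', '1', ['*', '2', '3']]
        print(read(tokenize("(+ 1 (* 2 3))")))

    A parser for infix syntax (precedence, associativity, and so on) is more work, but it’s work the computer does once, in the compiler, instead of every human doing it on every read.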