I’ve been a Java engineer in the web development industry for several years now. I’ve heard countless times that X is good because of SOLID principles or that Y is bad because it breaks them, and I’ve had to memorize the “good” way to do everything before interviews and so on. But the more I dig into the real reasons for doing things a particular way, the harder I find it to keep taking those rules at face value.

One example is creating an interface for every goddamn class I make because of “loose coupling” when in reality none of these classes are ever going to have an alternative implementation.

Also, the more I get into languages like Rust, the more these doubts grow, and the more I’m led to believe that most of it is just dogma that has drifted far beyond its initial motivations and goals and is now a mindless OOP circlejerk.

There are definitely occasions when these principles do make sense, especially in an OOP environment, and they can also make some design patterns really satisfying and easy.

What are your opinions on this?

  • Valmond@lemmy.world · 16 hours ago

    So we shouldn’t have #defines getting in the way, right?

    Like INT32 instead of “int”. I mean, if you don’t know the size, you’re probably not doing network protocols or reading binary data anyway.

    uint64_t is good IMO, a bit long maybe (why the _t?), but it’s not one of the atrocities I’m talking about, where every project had its own defines.
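
    For anyone who hasn’t run into it, here is a minimal sketch of the contrast being drawn; the INT32/UINT32 macros are made-up examples of the per-project style, not from any real codebase:

    ```cpp
    #include <cstdint>  // the standardized fixed-width types (C99 <stdint.h>, C++11 <cstdint>)

    // The per-project style being complained about: every codebase inventing its own names.
    // #define INT32  int
    // #define UINT32 unsigned int

    // The standard aliases; the _t suffix is just the C library's naming
    // convention for typedef'd types.
    std::uint64_t packet_length = 0;  // exactly 64 bits wide
    std::int32_t  file_offset   = 0;  // exactly 32 bits wide
    ```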

    • Feyd@programming.dev · 14 hours ago

      “int” can be different widths on different platforms. If all the compilers you have to build with provide standard definitions for specific widths, then great, use them. That hasn’t always been the case, and when it isn’t you have to roll your own. I’m sure some projects did it where it wasn’t needed, but when you have to do it, you have to do it.
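
      A minimal sketch of that kind of fallback, assuming a pre-C99 toolchain with no <stdint.h>; the platform check and the names are illustrative, not from any particular project:

      ```cpp
      // Roll-your-own fixed-width aliases for a toolchain without <stdint.h>.
      #if defined(_MSC_VER)
      typedef __int32          my_int32;   // MSVC's compiler-specific sized types
      typedef unsigned __int32 my_uint32;
      #else
      typedef int              my_int32;   // assumes plain int is 32 bits on this target
      typedef unsigned int     my_uint32;
      #endif

      // Old-school compile-time check that the assumption holds:
      // a negative array size is a compile error.
      typedef char my_int32_is_4_bytes[sizeof(my_int32) == 4 ? 1 : -1];
      ```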

      • Valmond@lemmy.world · 2 hours ago

        So show me two compatible systems where int has different sizes.

        This is folklore IMO, or the systems are incompatible anyway.

    • Hetare King@piefed.social · 12 hours ago

      The standard type aliases like uint64_t weren’t in the C standard library until C99 and in C++ until C++11, so there are plenty of older code bases that would have had to define their own.

      The use of #define to make type aliases never made sense to me. The earliest versions of C didn’t have typedef, I guess, but that’s like, the 1970s. Anyway, you wouldn’t do it that way in modern C/C++.
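
      For comparison, a quick sketch of the three ways of spelling the same alias (the names are just illustrative):

      ```cpp
      // Preprocessor macro: blind textual substitution, no scoping, and the
      // type system never sees it. This is the style inherited from very early C.
      #define UINT32_MACRO unsigned int

      // typedef: a real type alias handled by the compiler, available in both
      // C and C++ for decades.
      typedef unsigned int uint32_td;

      // C++11 alias declaration: same meaning as the typedef, reads left to
      // right and can also be templated.
      using uint32_alias = unsigned int;
      ```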

    • xthexder@l.sw0.com · 15 hours ago

      I’ve seen several codebases that use a typedef or the using keyword to map uint64_t to uint64 (and likewise for the other widths), but _t seems to be the convention for the built-in std type names.
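
      Something like this, presumably; the suffix-free names are the project’s own convention, not standard ones:

      ```cpp
      #include <cstdint>

      // Project-local aliases that drop the _t suffix from the standard names.
      using uint64 = std::uint64_t;
      using uint32 = std::uint32_t;
      using int64  = std::int64_t;
      using int32  = std::int32_t;

      // Pre-C++11 spelling of the same mapping:
      // typedef std::uint64_t uint64;
      ```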