• morto@piefed.social · 2 hours ago

    some people (…) are asking “can you game on DDR3”? The answer is a shocking yes.

    “shocking”. Really?

    Browsing the internet as a third worlder always gives me these eye-rolling moments. Sigh…

  • TryingSomethingNew@sopuli.xyz · 30 minutes ago

    The question is for companies like Ubisoft and EA, which usually design games for what PCs are going to be when a game comes out. And since the games industry was bigger than the movie industry before it collapsed due to Covid, what’s that going to do to the economy?

  • ChicoSuave@lemmy.world · 4 hours ago

    “Can’t compete with the global super rich? Lower your standards and be happy!”

    • Zos_Kia@lemmynsfw.com · 2 hours ago

      Just because they’ve trained you to believe you need the latest 2nm chips (which is conveniently their highest margin product) doesn’t mean you really need them.

    • zen@lemmy.zip · 23 minutes ago

      If we were talking about stuff like healthcare, food, housing, electricity, clean water, public transit, or access to information, I’d be on the same page.

      But this is a luxury hobby. And with luxury hobbies, there’s usually some flexibility. You don’t need a high-end PC to play games. You can run plenty on a lower-end setup, try different genres, or even step away from PC gaming altogether.

      You could have friends over for a tabletop game, go for a run, hit the gym, or try something like rock climbing. There are lots of ways to spend your time without needing top-tier gear.

  • Nollij@sopuli.xyz · 5 hours ago

    The biggest problem with DDR3 is that the last (consumer) boards/CPUs that could use it are really, REALLY old. 5th-gen Intel or AM3 AMD. Which means you’re looking at a full decade old, at the newest. These boards also probably can’t do more than 32GB.

    Now, I suppose if you only need 32GB RAM and a CPU that’s pathetic by modern standards, then this is a viable path. But that’s going to be a very small group of people.

    • mctoasterson@reddthat.com · 34 minutes ago

      Can confirm, I recently maxed out the RAM on my decade-old rig at 32GB. At least the used DDR3 RAM was cheap. With motherboards that old you are limited to processors like Intel Haswell with 4 cores, pretty anemic by today’s standards.

      It works just fine for me running Linux and doing minimal gaming. 90% of my gaming these days is on the Steam Deck anyway.

      I thought as I got older I would have more money to buy current gen PC parts and build basically whatever I wanted. Turns out priorities just shifted and things got even more expensive.

    • cmnybo@discuss.tchncs.de · 1 hour ago

      The list of vulnerability mitigations for those old CPUs is going to be a mile long. They will probably have their performance cut in half or worse. Even a much newer CPU like Zen 1 takes a big performance hit.

      You can disable mitigations, but then a malicious website could potentially steal sensitive information on that computer.
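      For anyone curious which mitigations their old CPU is actually paying for: on Linux the kernel reports per-vulnerability status under a standard sysfs directory. A minimal sketch (the path is standard, but the set of files varies by kernel version and CPU):

```python
from pathlib import Path

# On Linux, the kernel reports per-vulnerability mitigation status here.
VULN_DIR = "/sys/devices/system/cpu/vulnerabilities"

def read_mitigations(vuln_dir=VULN_DIR):
    """Return {vulnerability: status line} for each file in vuln_dir (empty if absent)."""
    d = Path(vuln_dir)
    if not d.is_dir():
        return {}
    return {f.name: f.read_text().strip() for f in sorted(d.iterdir())}

if __name__ == "__main__":
    for name, status in read_mitigations().items():
        print(f"{name}: {status}")
```

      On an older chip this typically prints a long list of "Mitigation: …" lines; booting with `mitigations=off` turns many of them into "Vulnerable", which is exactly the trade-off described above.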

    • CeeBee_Eh@lemmy.world · 56 minutes ago

      I’ve been doing active development for high processing stuff (computer vision and AI) on a Xeon 1230v5 (Skylake), 32GB of RAM, and a 1080ti up until a few months ago (before RAM prices skyrocketed). It was perfectly usable.

      The only place where it didn’t do well was in compile times and newer AAA games that were CPU bound. But for 99% of games it was fine.

      The only time I ran into RAM issues was when I had a lot of browser tabs open and multiple IDEs running. For gaming and any other non-dev task, 32GB is more than plenty.

    • 14th_cylon@lemmy.zip · 3 hours ago

      These boards also probably can’t do more than 32GB.

      What is the difference between this and having a new board but not being able to afford that 32GB anyway?

    • humanamerican@lemmy.zip · 5 hours ago

      I think this is actually most people. Power users and hardcore gamers are a relatively small portion of the PC market.

      • CheeseNoodle@lemmy.world · 2 hours ago

        As someone with a high-end PC, I can also spend a happy afternoon with my Game Boy Advance, which has less than half a megabyte of RAM, so even in a power-user and gamer context the hardware is what you make of it. There’s so much more out there than just the latest and most pathetically optimized titles.

      • dehyzer@piefed.social · 4 hours ago

        I would be surprised if this is still true, at least for home use. It seems like the non-gamer, non-power user segment of the PC market just switched over to tablets and smartphones instead. PCs and laptops just aren’t really necessary anymore for “normal” people who just want to check their email, watch YouTube, and surf the web.

        • HertzDentalBar@lemmy.blahaj.zone · 3 hours ago

          This is anecdotal, but most of my family has PCs that are getting a bit long in the tooth, and they still use them just fine for all the basic internet shit they do. A lot of folks would rather check their banking or email on a bigger screen. My mom’s computer, for example, is almost 10 years old; if I throw Linux on it she’s good till the thing just up and dies.

          She asked about buying a new PC this year and I just laughed and said “no, you enjoy having a roof over your head right?”

          • village604@adultswim.fan · 3 hours ago

            Yeah, my mom asked me for suggestions on a new computer since hers couldn’t do Win11, so I just threw Mint on it. She had no trouble making the switch.

        • humanamerican@lemmy.zip · 4 hours ago

          I can see that eating into some PC use, but plenty of Millennials I know still prefer laptops or even desktops for casual use.

          • mycodesucks@lemmy.world · 3 hours ago

            I intentionally ignore the vast majority of everything on my phone until I can get to a real computer. Phones and tablets feel like unmitigated torture, and I loathe it every time I have to use one to do something.

        • B-TR3E@feddit.org · 4 hours ago

          Non-gamers only. I recently replaced my mobo with a slightly older (the model; the board itself was brand new) industrial PC board: 32GB DDR3, NVidia Quadro K2200, 2x gigabit ethernet, USB 3.1, five serial ports, three programmable digital IO ports, hardware watchdog, i7-4770 CPU @ 3.40GHz. It’s a Loonix machine and I don’t use it for gaming, but I do a lot of animation, video editing, µcontroller programming and 3D modelling with it. Super reliable, fast enough for most stuff. If I need more raytracing power, I just cluster it with my Lenovo P15.

      • FauxLiving@lemmy.world · 4 hours ago

        Non-power users would have no operating system, no Windows 11 support, and grandma isn’t going to learn Linux.

        • Romkslrqusz@lemmy.zip · 4 hours ago

          Grandma doesn’t need to “learn” Linux

          Most of the older generation compute almost entirely through a web browser. They often struggle with the number of notifications and solicitations that come up in a Windows OS, as they can have trouble discerning between what is real and what is a scam, becoming fundamentally distrustful of everything as a result.

          Through my repair shop, I’ve transitioned plenty of older generation folks to Linux Mint with minimal friction.

          Main area where that can get a bit more complicated is for those who are clinging to an older piece of software they’re unwilling to let go of.

          • FauxLiving@lemmy.world · 4 hours ago

            I exclusively use Linux and have several family members who have Linux laptops.

            I don’t think it is impossible, but they require someone in their life that can handle the issues.

            They’re going to have a much harder time finding support for a Linux machine than a Windows machine.

            • Corkyskog@sh.itjust.works · 3 hours ago

              Some enterprising teenager should offer to upgrade people’s PCs to Linux, especially as Windows 11 is pushed harder. They could even offer a tech-support option for a yearly fee.

        • humanamerican@lemmy.zip · 4 hours ago

          That’s what the hardware requirement bypass and a techie friend are for.

          I manage a whole computer lab full of 3rd to 5th gen Intels with 8GB of RAM that run Windows 11 just fine.
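          For reference, the in-place-upgrade bypass mentioned here is usually the well-known registry flag below (a sketch from memory; check Microsoft’s current guidance before relying on it, and note that clean installs use different LabConfig keys):

```
Windows Registry Editor Version 5.00

; Allow upgrading to Windows 11 on CPUs/TPMs Microsoft doesn't officially support
[HKEY_LOCAL_MACHINE\SYSTEM\Setup\MoSetup]
"AllowUpgradesWithUnsupportedTPMOrCPU"=dword:00000001
```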

    • Dran@lemmy.world · 5 hours ago (edited)

      There are server chips like the E7-8891 v3 which lived in a weird middle ground of supporting both DDR3 and DDR4. On paper, it’s about on par with a Ryzen 5 5500, and they’re about $20 on US eBay. I’ve been toying with the idea of buying an aftermarket/used server board to see if it holds up the way it appears to on paper: $20 for a CPU (could even slot 2), $80 for a board, $40 for 32GB of DDR3 in quad channel. ~$160 for a set of core components doesn’t seem that bad in modern times, especially if you can use quad/oct channel to offset the bandwidth difference between DDR3 and DDR4.

      I think finding a cooler and a case would be the hardest part.
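      The quad-channel point checks out on paper. A quick sketch of the peak-bandwidth arithmetic (theoretical numbers only, assuming DDR3-1600 vs DDR4-3200 and the usual 64-bit, i.e. 8-byte, channel):

```python
def peak_bandwidth_gbs(mega_transfers, channels, bytes_per_transfer=8):
    """Theoretical peak in GB/s: MT/s x bytes per transfer x channel count / 1000."""
    return mega_transfers * bytes_per_transfer * channels / 1000

# Quad-channel DDR3-1600 matches dual-channel DDR4-3200 on paper.
print(peak_bandwidth_gbs(1600, channels=4))  # 51.2
print(peak_bandwidth_gbs(3200, channels=2))  # 51.2
```

      Real-world throughput is lower (latency, rank interleaving, controller efficiency), but it shows why extra channels can paper over an older memory generation.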

      • B-TR3E@feddit.org · 3 hours ago (edited)

        These server boards are usually the same as scientific and engineering workstation boards. They’re pretty good if you put the right CPU in: a Xeon or an i7-4770 and you’ll get a quite usable workstation out of them.

  • Darkcoffee@sh.itjust.works · 6 hours ago

    DDR3 was kind of the point where the technology stopped making large generational jumps.

    Not saying DDR3 is as good as DDR4 or 5, but I used DDR3 until 2021 with no issues.

  • fyrilsol@kbin.melroy.org · 6 hours ago

    I’m fine on DDR4. DDR5 feels like something I’ll get into 5–10 years from now. This is from someone who has sat on DDR2 and DDR3 machines for extended periods of time. If they’re still doing the job I want them to, no complaints.

  • qyron@sopuli.xyz · 6 hours ago

    I’m already considering building a maxed-out AMD-based machine with DDR3.

    The last machine I had with that technology lasted me 12 years. I can vouch for it.

    • kadu@scribe.disroot.org · 5 hours ago

      I trust DDR3 to last decades.

      DDR5? I’ve had three different sticks, from different brands, on different boards, die on me because of this stupid idea of putting the power delivery circuit on the RAM stick itself. RAM manufacturers cheap out or don’t pay enough attention and your sticks die; meanwhile, motherboard manufacturers have been dealing with multiple sensitive voltage rails for decades and have more than enough experience keeping them working.

      • Prove_your_argument@piefed.social · 2 hours ago

        How very strange. I manage a deployment of hundreds of DDR5-based systems and have had no issues with failing RAM. Not a single one.

        I have seen multiple consumer AM5 motherboards with buggy BIOSes that fail to recognize RAM, and we’ve definitely seen stories of atypical processor failure rates on a handful of AM5 boards from a couple of manufacturers. All of these things point to declining investment in motherboard design and testing by a couple of consumer motherboard brands rather than issues with modern silicon.

        • kadu@scribe.disroot.org · 47 minutes ago

          The dead power management issue was more prevalent on the first generation of DDR5 sticks leaving the factories, sometimes with certain motherboard vendors (like Gigabyte) making the issue worse by using very aggressive “auto tuning” during memory training that was never quite within spec.

      • frongt@lemmy.zip · 3 hours ago

        Really? I’ve been managing a fleet of PCs at work with DDR5 for a few years now and haven’t noticed any memory issues.

        A couple motherboard and PSU replacements, but no memory failures.

      • qyron@sopuli.xyz · 4 hours ago

        I moved to a DDR4/AM4 platform when I assembled my current machine because the AM3 platform was being labelled end-of-cycle and the FM segment seemed too niche.

        The scales tipped when I discovered many AM4 CPUs carried on-chip graphics, and, being in need of a graphics card, it was more affordable for me to just buy an APU than to buy a CPU and add a GPU on top.

        Not being a gamer, and being a Linux user, throwing money at a graphics card, which by then was heavily price-inflated, made little sense, so I opted for the AM4 platform.

        Currently, I’m considering building a machine capable of running Wasteland 2, because that game has been under my eye for years.

        I’m finding graphics cards with 4GB of memory on the market at very interesting prices. Used CPUs are cheap, unless I aim for the top-tier models with 6 or more cores. I still have the memory chips from the machine I retired (8GB), and getting an additional 8 is nothing out of reach. I just need to find a motherboard that can take 16GB or more of memory.

        If I can assemble a machine capable of running that game, I’m fairly confident the system itself will be more than enough to comply with my daily computing needs and then some.

        • rollin@piefed.social · 2 hours ago

          a machine capable of running Wasteland 2

          Is there even such a machine on God’s good earth? It’s definitely a good game, but absolutely blighted by instability & CTDs last time I tried it a few years ago.

          • qyron@sopuli.xyz · 2 hours ago

            Don’t know. But thank you for the warning.

            GOG sent an email the other day saying the game was on discount, and after taking another look at the hardware requirements it felt like a good benchmark for the technology of the time.

            It was heavy back then.

    • Know_not_Scotty_does@lemmy.world · 5 hours ago

      I had an 8350 machine with 32GB of RAM back when it was in season, and while it never really left me short of power, the Intel 4770K and 4790K were better performers. That may not be the case anymore with stuff being more multi-core optimized, but at the time Intel’s single-core performance was so much better than the 8350’s, which made a big difference in gaming.

      My old rig was an 8350 overclocked to 4.5GHz on liquid, crossfired 3GB HD 7950s, and 32GB of matched Corsair Dominator DDR3, all in a Corsair 230T chassis with the bright orange paint and LED fans.

  • yeehaw@lemmy.ca · 6 hours ago

    There are so many good games made per year now that it’s impossible to play them all, so buckle up and start playing some older titles. I got into The Witcher 3 six or seven years after release and was blown away by how I’d slept on it.

  • absquatulate@lemmy.world · 4 hours ago (edited)

    I mean, DDR3 is provably fine. I ran a 16GB DDR3 machine with a goddamn 2500K up until several years ago, and pre-2020 games usually ran at playable framerates (I did have Win7; not sure how Win10 fares). Question is: who is this article for? Most tech enthusiasts have probably moved on by now, and even those are a small subset of PC users. “Normies”? Those moved on to phones and tablets — it’s why Windows has lost 400 million machines in 3 years. So who are all these people so left behind that DDR3 is an upgrade, but who are still currently itching to buy RAM? I don’t get it.

  • chilicheeselies@lemmy.world · 6 hours ago

    When I looked for DDR3 mobos they were expensive af. Is it possible to use DDR3 in a DDR4 or DDR5 mobo? Is there an adapter or something?

  • einsteinntuli@sh.itjust.works · 4 hours ago
    4 hours ago

    Just dusted off my old desktop and set it up as a server.

    Glad I still have it. I might buy more DDR3 if I need it. I’m sorry for those who don’t already have a CPU/motherboard that supports it.