some people (…) are asking "can you game on DDR3?" The answer is a shocking yes.
“shocking”. Really?
Browsing the internet as a third worlder always gives me these eye-rolling moments. Sigh…
Everything's shocking, underrated, and/or being blasted these days.
You really slammed em with that one
It’s all just one big ass blast.
The question is for companies like Ubisoft and EA, which usually design games around what PCs will be like when the game comes out. And since the games industry was bigger than the movie industry even before the latter collapsed during Covid, what's that going to do to the economy?
“Can’t compete with the global super rich? Lower your standards and be happy!”
Just because they've trained you to believe you need the latest 2nm chips (which are conveniently their highest-margin product) doesn't mean you really need them.
If we were talking about stuff like healthcare, food, housing, electricity, clean water, public transit, or access to information, I’d be on the same page.
But this is a luxury hobby. And with luxury hobbies, there’s usually some flexibility. You don’t need a high-end PC to play games. You can run plenty on a lower-end setup, try different genres, or even step away from PC gaming altogether.
You could have friends over for a tabletop game, go for a run, hit the gym, or try something like rock climbing. There are lots of ways to spend your time without needing top-tier gear.
This is how existence works, yes. Being happy means adjusting your wants to what you have.
“I’m just saying they don’t need to have 30 dolls. They can have three. They don’t need to have 250 pencils. They can have five.”
Being happy means adjusting your wants to what you have.
Oh I guess I should be happy that ICE only raided my neighbors and not me. Amirite?
Well… yes? Are you not happy you didn't get raided? Would you rather have been raided too?
💩 take
Fuck Buddhism, amirite?
If you’re upset with my feedback, adjust your expectations.
The biggest problem with DDR3 is that the last (consumer) boards/CPUs that could use it are really, REALLY old. 5th-gen Intel or AM3 AMD. Which means you’re looking at a full decade old, at the newest. These boards also probably can’t do more than 32GB.
Now, I suppose if you only need 32GB RAM and a CPU that’s pathetic by modern standards, then this is a viable path. But that’s going to be a very small group of people.
Can confirm, I recently maxed out the RAM on my decade-old rig at 32GB. At least the used DDR3 RAM was cheap. With motherboards that old you are limited to processors like Intel Haswell with 4 cores, pretty anemic by today’s standards.
It works just fine for me running Linux and doing minimal gaming. 90% of my gaming these days is on the SteamDeck anyway.
I thought as I got older I would have more money to buy current gen PC parts and build basically whatever I wanted. Turns out priorities just shifted and things got even more expensive.
The list of vulnerability mitigations for those old CPUs is going to be a mile long. They will probably have their performance cut in half or worse. Even a much newer CPU like Zen 1 takes a big performance hit.
You can disable mitigations, but then a malicious website could potentially steal sensitive information on that computer.
I’ve been doing active development for high processing stuff (computer vision and AI) on a Xeon 1230v5 (Skylake), 32GB of RAM, and a 1080ti up until a few months ago (before RAM prices skyrocketed). It was perfectly usable.
The only place where it didn’t do well was in compile times and newer AAA games that were CPU bound. But for 99% of games it was fine.
The only time I ran into RAM issues was when I had a lot of browser tabs open and multiple IDEs running. For gaming and any other non-dev task, 32GB is more than plenty.
These boards also probably can’t do more than 32GB.
What's the difference between this and having a new board but not being able to afford that 32GB anyway?
I think this is actually most people. Power users and hardcore gamers are a relatively small portion of the PC market.
As someone with a high end PC I can also spend a happy afternoon with my gameboy advance that has less than half a megabyte of RAM, so even in a power user and gamer context the hardware is what you make of it. There’s so much more out there than just the latest and most pathetically optimized titles.
I would be surprised if this is still true, at least for home use. It seems like the non-gamer, non-power user segment of the PC market just switched over to tablets and smartphones instead. PCs and laptops just aren’t really necessary anymore for “normal” people who just want to check their email, watch YouTube, and surf the web.
This is anecdotal, but most of my family has PCs that are getting a bit long in the tooth, and they still use them just fine for all the basic internet shit they do. A lot of folks would rather check their banking or email on a bigger screen. My mom's computer, for example, is almost 10 years old; if I throw Linux on it she's good till the thing just up and dies.
She asked about buying a new PC this year and I just laughed and said “no, you enjoy having a roof over your head right?”
Yeah, my mom asked me for suggestions on a new computer since hers couldn't do Win11, so I just threw Mint on it. She had no trouble making the switch.
I can see that eating into some PC use, but plenty of Millennials I know still prefer laptops or even desktops for casual use.
I intentionally ignore the vast majority of everything on my phone until I can get to a real computer. Phones and tablets feel like unmitigated torture, and I loathe it every time I have to use one to do something.
Non-gamers only. I recently replaced my mobo with a slightly older (the model; the board itself was brand new) industrial PC board. 32GB DDR3, Nvidia Quadro K2200, 2 x gigabit ethernet, USB 3.1, five serial ports, three programmable digital IO ports, hardware watchdog, i7-4770 CPU @ 3.40GHz. It's a Loonix machine and I don't use it for gaming, but I do a lot of animation, video editing, µcontroller programming and 3D modelling with it. Super reliable, fast enough for most stuff. If I need more raytracing power, I just cluster it with my Lenovo P15.
Non-power users would have no operating system, no Windows 11 support, and grandma isn't going to learn Linux.
Grandma doesn’t need to “learn” Linux
Most of the older generation compute almost entirely through a web browser. They often struggle with the amount of notifications and solicitations that come up in a Windows OS, as they can have trouble discerning between what is real and what is a scam - becoming fundamentally distrustful of everything as a result.
Through my repair shop, I’ve transitioned plenty of older generation folks to Linux Mint with minimal friction.
The main area where that can get a bit more complicated is for those clinging to an older piece of software they're unwilling to let go of.
I exclusively use Linux and have several family members who have Linux laptops.
I don't think it's impossible, but they need someone in their life who can handle the issues.
They’re going to have a much harder time finding support for a Linux machine than a Windows machine.
Some enterprising teenager should offer to upgrade people's PCs to Linux, especially as Windows 11 is pushed harder. They could even offer a tech support option for a yearly fee.
That’s what the hardware requirement bypass and a techie friend are for.
I manage a whole computer lab full of 3rd to 5th gen Intels with 8GB of RAM that run Windows 11 just fine.
There are server chips like the E7-8891 v3 which lived in a weird middle ground of supporting both DDR3 and DDR4. On paper, it's about on par with a Ryzen 5 5500, and they're about $20 on US eBay. I've been toying with the idea of buying an aftermarket/used server board to see if it holds up the way it appears to on paper. $20 for a CPU (could even slot 2), $80 for a board, $40 for 32GB of DDR3 in quad channel. ~$160 for a set of core components doesn't seem that bad in modern times, especially if you can use quad/oct channel to offset the bandwidth difference between DDR3 and DDR4.
I think finding a cooler and a case would be the hardest part.
These server boards are usually the same as scientific and engineering workstation boards. They're pretty good if you put the right CPU in. A Xeon or i7-4770 and you'll get quite a usable workstation out of them.
It's been good for my homelab.
DDR3 was kind of the point where the technology stopped making big jumps with each generation.
Not saying DDR3 is as good as DDR4 or 5, but I used DDR3 until 2021 with no issues.
Same but 2024. I missed all of DDR4. Jumped straight from 3 to 5.
I've noticed that RAM speed matters much less than the amount of RAM for quite some time now.
SSDs were game changers.
I'm fine on DDR4. DDR5 feels like something I'll get into 5-10 years from now. This is from someone who has sat on DDR2 and DDR3 machines for extended periods of time. If they're still doing the job I want them to, no complaints.
I’m about to go dumpster diving for ram or some shit, holy fuck the prices are fucked
I’m already considering building a maxed out AMD based machine, with DDR3.
The last machine I had with that technology lasted me 12 years. I can vouch for it.
I trust DDR3 to last decades.
DDR5? I've had three different sticks, from different brands, on different boards, die on me because of this stupid idea of putting the power delivery circuit on the RAM stick itself. RAM manufacturers cheap out or don't pay enough attention and your stick dies; meanwhile, motherboard manufacturers have been dealing with multiple sensitive voltage rails for decades and have more than enough experience keeping them working.
How very strange. I manage a deployment of hundreds of ddr5 based systems and have had no issues with failing ram. Not a single one.
I have seen multiple consumer AM5 motherboards with poor BIOSes that fail to recognize RAM, and we've definitely seen stories of atypical processor failure rates in a handful of AM5 boards from a couple of manufacturers. All of these things point to declining investment in motherboard design and testing by a couple of consumer motherboard brands, rather than issues with modern silicon.
The dead power management issue was more prevalent on the first generation of DDR5 sticks leaving the factories, sometimes with certain motherboard vendors (like Gigabyte) making the issue worse by using very aggressive "auto tuning" during memory training that was never quite within spec.
Really? I’ve been managing a fleet of PCs at work with DDR5 for a few years now and haven’t noticed any memory issues.
A couple motherboard and PSU replacements, but no memory failures.
I moved to a DDR4/AM4 platform when I assembled my current machine because the AM3 platform was being labelled end-of-cycle and the FM segment seemed too niche.
The scales tipped when I discovered many AM4 CPUs carry on-chip graphics; since I needed a graphics card anyway, it was more affordable to just buy an APU than to buy a CPU and add a GPU on top.
Not being a gamer, and being a Linux user, throwing money at a graphics card (heavily price-inflated by then) made little sense, so I opted for the AM4 platform.
Currently, I’m considering building a machine capable of running Wasteland 2, because that game has been under my eye for years.
I'm finding graphics cards with 4GB of memory on the market at very interesting prices. Used CPUs are cheap, unless I aim for the top-tier models with 6 or more cores. I still have the memory chips from the machine I retired (8GB), and getting an additional 8 is nothing out of reach. I just need to find a motherboard that can take 16GB or more of memory.
If I can assemble a machine capable of running that game, I’m fairly confident the system itself will be more than enough to comply with my daily computing needs and then some.
a machine capable of running Wasteland 2
Is there even such a machine on God’s good earth? It’s definitely a good game, but absolutely blighted by instability & CTDs last time I tried it a few years ago.
Don’t know. But thank you for the warning.
GOG sent an email the other day saying the game was on discount, and after taking another look at the hardware requirements it felt like a good benchmark for the technology of the time.
It was heavy back then.
I had an 8350 machine with 32GB of RAM when it was in season, and while it never really left me short of power, the Intel 4770K and 4790K were better performers. That may not be the case anymore with stuff being more multi-core optimized, but at the time Intel's single-core performance was so much better than the 8350's, which made a big difference in gaming.
My old rig was an 8350 overclocked to 4.5 on liquid, crossfired 3GB HD 7950s, and 32GB of matched Corsair Dominator DDR3, all in a Corsair 230T chassis with the bright orange paint and LED fans.
I never left DDR3. Still never upgraded from an FX-8350.
There are so many good games made per year now that it's impossible to play them all, so buckle up and start playing some older titles. I got into The Witcher 3 six or seven years after release and was blown away by how I'd slept on it.
I mean, DDR3 is provably fine. I ran a 16GB DDR3 machine with a goddamn 2500K up until several years ago, and pre-2020 games usually ran at playable framerates (I did have Win7, not sure how Win10 fares). Question is: who is this article for? Most tech enthusiasts have probably moved on by now, and even those are a small subset of PC users. "Normies"? Those moved on to phones and tablets - it's why MS Windows has lost 400 million machines in 3 years. So who are all these people so left behind that DDR3 is an upgrade, but who are still currently itching to buy RAM? I don't get it.
When I looked for DDR3 mobos they were expensive af. Is it possible to use DDR3 in a DDR4 or 5 mobo? Is there an adapter or something?
No
Are you looking at new? Look at used, off eBay or whatever is in your country.
But everything is more expensive right now anyway.
Just dusted off my old desktop and set it up as a server.
Glad I still have it. I might buy more DDR3 if I need it. I’m sorry for those who don’t have a CPU/motherboard already to support it.










