Intel’s last couple of processor generations were a failure. AMD, on the other hand, has been consistent. Look at all these tiny AMD APUs that can run C2077 on a 35W computer that fits in the palm of a hand. Valve is about to drop a nuclear bomb on nvidia, intel and microslop with the Gabecube.
So the editor asked AI to come up with an image for the title “Gamers desert Intel in droves” and so we get a half-baked pic of a CPU in the desert.
Am I close?
Looks like bad photoshop more than AI
Could be worse.
Could have been “gamers dessert Intel in droves”
Now I want to see that one. But, I refuse to use online generative AI.
So happy I chose to go with an AM4 board years ago. Was able to go from a Zen+ CPU to an X3D CPU.
I remember people saying back then that people usually don’t upgrade their CPU, so it’s not much of a selling point. But people didn’t upgrade because they couldn’t, thanks to constant socket changes on the Intel side.
My fps numbers were very happy after the CPU upgrade, and I didn’t have to get a new board and a new set of RAM.
Yep. Intel sat on their asses for a decade pushing quad cores you had to pay extra to even overclock.
Then AMD implements chiplets, comes out with affordable 6-, 8-, 12-, and 16-core desktop processors with unlocked multipliers, hyperthreading built into almost every model, and strong performance. All of this while also not sucking down power like Intel’s chips still do.
Intel cached in their lead by not investing in themselves and instead pushing the same tired crap year after year onto consumers.
cached in their lead
There are so many dimensions to this
Don’t forget the awfully fast socket changes
And all of the failures that plagued the 13th and 14th gens. That was the main reason I switched to AMD. My 13th gen CPU was borked and had to be kept underclocked.
In the 486 era (the 90s) there was an unofficial story about the way Intel speed-marked its CPUs: instead of starting slow and accelerating until failure, start as fast as you can and slow down until it doesn’t fail.
what was the issue?
It would cause system instability (programs/games crashing) when running normally. I had to underclock it through Intel’s XTU to make things stable again.
This was after all the BIOS updates from ASUS and with all BIOS settings set to the safe options.
When I originally got it I did notice that it was getting insanely high scores in benchmarks, then the story broke of how Intel and motherboard manufacturers were letting the CPUs clock as high as possible until they hit the thermal limit. Then mine started to fail, I think about a year after I got it.
Or the 1200 different versions of CPUs. We just got some new Dell machines for our DR site last year, and the number of CPU options was overwhelming. Is it really necessary to have that many different CPUs?
Tbf AMD is also guilty of that, in the laptop/mobile segment specifically. And the whole AI naming thing is just dumb, though there aren’t that many of those.

Well this scheme seems much more reasonable and logical to me.
I just read the other day that at least one motherboard manufacturer is bringing back AM4 since DDR4 is getting cheaper than DDR5, even with the “this isn’t even manufactured anymore” price markup. That’s only even possible because of how much long-term support AMD gave that socket.
Even within the same socket family (looking at you, LGA1151) you can run into compatibility problems.
I think AMD also did a smart thing by branding their sockets. AM4, AM5, what do you think is going to be next? I bet it’s AM6. What came after the Intel LGA1151? It wasn’t LGA1152.
Yea, for the customer it really doesn’t matter how many pins a certain socket has, only whether it’s compatible or not.
remember Socket 7?
Holy shit, cross-compatibility between manufacturers? We came this close to the almighty above and still ended up where we are today 🤦‍♂️
I remember Slot 2
AMD tried the Intel thing too, though, by dropping support for past-generation CPUs on later AM4 boards. Only after public outcry did they scrap that. Wouldn’t put it past them to try it again on AM5.
Are there a lot of people wanting to plug Zen 1 chips into B550 motherboards? Usually it’s the other way around, upgrading the chip in an old motherboard.
It can happen if the old motherboard failed, which was more likely than the CPU failing.
There was talk of not providing firmware updates for old chipsets to support new-gen CPUs as well, which is relevant to the cases you mentioned.
As a person that generally buys either mid-tier stuff or the flagship products from a couple years ago, it got pretty fucking ridiculous to have to figure out which socket made sense for any given intel chip. The apparently arbitrary naming convention didn’t help.
It wasn’t arbitrary, they named them after the number of pins. Which is fine but kinda confusing for your average consumer
Which is a pretty arbitrary naming convention, since the number of pins in a socket doesn’t really tell you anything, especially when that naming convention does NOT get applied to the processors that plug into them.
deleted by creator
They really segmented that market in the worst possible way: 2 cores or 4 cores only, gating whether you could use VMs or overclock, and so on. Add windoze eating up an extra 5% every year.
I remember buying the 2600 (maybe the X) and it was so fast.
The 2600k was exceptionally good and was relevant well past the normal upgrade timeframes.
Really it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and just being a 2nd Generation i7.
Coincidentally, that’s the exact cpu I use in my server! And it runs pretty damn well.
At this point the only “issue” with it is power usage versus processing capability. Newer chips can do the same with less power.
Yeahhh, iirc it uses slightly less power than my main cpu for significantly less performance
Really it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and just being a 2nd Generation i7.
Past me made the accidentally more financially prudent move of opting for the i7-4790k over the i5-4690k which ultimately lasted me nearly a decade. At the time the advice was of course “4 cores is all you need, don’t waste the money on an i7” but those 4 extra threads made all the difference in the longevity of that PC
All of the exploits against Intel processors didn’t help either. Not only was it a bad look, but the fixes reduced the speed of those processors, making them a noticeably worse deal for the money after all.
Meltdown and Spectre? Those also applied to AMD CPUs as well, just to a lesser degree (or rather, they had their own flavor of similar vulnerabilities). I think they even recently found a similar one for ARM chips…
Only one affected AMD, I forget which. But Intel knew about the vulnerabilities and chose not to fix the hardware ahead of their release.
Yea that definitely sounds like Intel… Though it’s still worth pointing out that one of them was a novel way to spy on program memory that affects many CPU types and not really indicative of a dropped ball. (outside of shipping with known vulnerabilities, anyways)
… The power stuff from the 12/13th gens or whatever, though… ouch, massive dropped ball.
Even the 6-core Phenom IIs from 2010 were great value.
But to be fair, Sandy Bridge ended up aging a lot better than those Phenom IIs or Bulldozer/Piledriver.
Intel until they realized that other companies made CPUs, too

They also bring a “dying transistor problem we don’t feel like fixing” to the party, too
And a constantly changing socket so you have to get a new motherboard every time.
Honestly, not a big deal if you build PCs to last 6-7 years, since you will be targeting a new RAM generation every time.
Upgraded from a 1600 to a 5600, same mobo
If your CPU becomes the limiting factor at some point, you can simply upgrade to a CPU a few generations newer without having to swap out your motherboard. You can’t really do that with Intel (AFAIK they switch platforms every 2 CPU generations, so depending on your CPU you may not be able to upgrade at all). It can happen with AMD too, but not as frequently.
With no multithreading
I mean the i7s had SMT. You had to pay extra for SMT, whereas AMD started giving it to you on every SKU except a few low-end ones.
Is it true that all of them had SMT but Intel just locked it away on the lower-tier processors, and some people managed to activate it despite Intel’s efforts?
I have to lower my 12th gen CPU multiplier to stop constant crashing when playing UE games, because everything is overclocked at the factory so they could keep up with AMD performance. Fuck Intel.
Worse product and worse consumer practices (changing sockets every 2 generations) made it an easy choice to go with AMD.
DDR4 compatibility held on for a while, though, after AM5 went DDR5-only.
The only real issue they had which led to the current dire straits is the 13th/14th gen gradual failures from power/heat, which they initially tried to claim didn’t exist. If that hadn’t happened AMD would still have next to no market share.
You still find people swearing up and down that intel is the only way to go, even despite the true stagnation of progress on the processor side for a long, long time. A couple of cherry-picked benchmarks where they lead by a minuscule amount is all they care about, scheduling/parking issues be damned.
Oh hell naw, the issues with Intel came up much sooner.
Ever since Ryzen came out, Intel just stagnated.
I don’t disagree that intel has been shit for a long time, but they were still the go-to recommendation all the way through the 14th gen. It wasn’t until the 5800x3d came along that people started really looking at AMD for gaming… and if you’re not doing a prebuilt, odds are you wanted the fastest processor, not the one that is most efficient.
I had a 5800x because I didn’t want yet another intel rig after a 4790k. Then I went on to the 5800x3d, and now the 9800x3d. The 5800x was behind intel, and for me it was just a stopgap anyway because a 5950x was not purchasable when I was building. It was just good enough.
As someone who lived through the fTPM firmware issue on AM4… I can confidently state that the TPM freezes were a dealbreaker. If you didn’t use fTPM and had the module disabled, or you updated your firmware after the fix was released, you were fine - but the fTPM bug went unsolved for many, MANY years. It persisted for multiple generations. You could randomly freeze for a few seconds in any game (or any software) at any time… sometimes only once every few hours, sometimes multiple times in the span of a few minutes. That’s not usable by any stretch for gaming or anything important.
and if you’re not doing a prebuilt odds are you wanted the fastest processor, not the one that is most efficient.
Strongly disagree. Prebuilts are mostly overpriced and/or have cheap components, and in the worst case, proprietary connectors.
I build for the best bang for the buck, and at least in my bubble, so do others.
Somehow I think you misunderstood my meaning.
Prebuilts have all kinds of hardware and unfortunately many users go with those. I offered to do a 265k 5070ti build for my brother’s girlfriend, but he instead spent the same amount on a 265k 5070 32GB 5200MHz prebuilt. He does some dev work and she does a tiny amount of creative work, and honestly I think he wanted to make sure her system was inferior to his. 1 year warranty and you have to pay to ship the whole system in if there are any issues. He wouldn’t even consider AMD or going with a custom build like I do for myself and others (just finished another intel build over the weekend for a coworker, diehard intel even after the issues…)
In the custom build world I think you find more gamers and people who want the fastest gear they can afford, which is why we see gamers picking up AMD x3d chips today. They aren’t beaten and aren’t just the most expensive option.
AM5 as a platform still has issues with memory training, though it’s largely set-it-and-forget-it unless you reboot after a month of uptime or don’t have memory context restore enabled in the BIOS.
I’m less familiar with the intel side nowadays despite literally just doing a build. They seem to win on boot times unless you accept the instability of AMD’s fast boot memory check bypass stuff. Getting a government bailout, though, is enough to make me want to avoid them indefinitely for my own gear, so I doubt I’ll get much hands-on with the current or next gen.
I’ve had AMDs since forever; my first own build was with a Phenom II.
They were always good, but Ryzens were just the best.
Never used TPM, so can’t comment on that. And most people never used it.
But yes, so many hardcore Intel diehards; it would almost be funny if it weren’t so sad. Like Intel’s legacy of adding wattage to get nothing in return.
This might be true for the top of the line builds, but for any build from budget to just below that Ryzen has been a good and commonly recommended choice for a long time
Just upgraded from an i7-6600k to a Ryzen 7 7800X3D. Obviously a big upgrade no matter if I went AMD or Intel, but I’m loving this new CPU. I had an AMD Athlon XP in the early 2000s that was excellent, so I’ve always had a positive feeling towards AMD.
AMD has had a history of some pretty stellar chips, imo. The FX series just absolutely sucked and tarnished their reputation for a long time. My Phenom II x6, though? Whew, that thing kicked ass.
Oh yeah I had one of those before my 4790k
The Intel Pentium D era sucked compared to the Athlon 64 X2, from what I remember. I had an Athlon 64 3000+ just before the dual-core era. The Athlon 64 era was great.
I played through Mass Effect 3 when it was new on a discount AMD laptop with an iGPU. Granted, it was definitely not on max settings, but it wasn’t with everything turned all the way down either.
I’ve been buying AMD since the K6-2, because AMD almost always had the better price/performance ratio (as opposed to outright top performance) and, almost as importantly, because I liked supporting the underdog.
That means it was folks like me who helped keep AMD in business long enough to catch up with and then pass Intel. You’re welcome.
It also means I recently bought my first Intel product in decades, an Arc GPU. Weird that it’s the underdog now, LOL.
AMD almost always had the better price/performance
Except anything Bulldozer-derived, heh. Those were more expensive and less performant than the Phenom II CPUs and Llano APUs.
To be fair, I upgraded my main desktop directly from a Phenom II X4 840(?) to a Ryzen 1700x without owning any Bulldozer stuff in between.
(I did later buy a couple of used Opteron 6272s, but that’s different for multiple reasons.)
I’ve got an FX 8350; sure, AMD fell behind during that time, but it was by no means a bad CPU imo. My main PC’s got a 7800X3D now, but my FX system is still working just fine to this day, especially since upgrading to an SSD and 16GB RAM some years ago. It can technically even run Cyberpunk 2077 with console-like frame rates on high settings.
I mean… It functioned as a CPU.
But a Phenom II X6 outperformed it sometimes, single-threaded and multithreaded. That’s crazy given Piledriver’s two-generation jump and huge process/transistor count advantage. Power consumption was awful in any form factor.
Look. I am an AMD simp. I will praise my 7800X3D all day. But there were a whole bunch of internet apologists for Bulldozer back then, so I don’t want to mince words:
It was bad.
Objectively bad, a few software niches aside. Between cheaper Phenoms and the reasonably priced 2500K/4670K, it made zero financial sense 99% of the time.
Bulldozer was AMD’s Pentium 4.
I’ve been buying since the Phenom II days with the X3 720. One could easily unlock the 4th core for an easy performance boost. Most of the time it’d work without a hassle.
Wish I knew about that trick back then! I shelled out for an X4…
My first AMD was a 386-40. Had several of their CPUs since. But there were a few years there that it was real tough to pick AMD.
Ohh yeah, I remember now :D that was pretty cool, I felt like a wizard!
I’ve had the same approach as you. The one time I bought Intel, I had this feeling that it just didn’t perform well enough to justify the price. Never regretted AMD, especially the last one, which basically made me abandon discrete GPUs altogether lol.
Love my 3DNow! K6-2, also my starter.
Oh man, I’d forgotten all about 3dnow!
I decide which one to go with at every upgrade. Try not to stay dedicated to one.
Basically - Buy Intel cause it’s the best last I checked… Oh, that was two years ago, now AMD should have been the right one.
Next upgrade, won’t make that mistake - buy AMD. Shit… AMD is garbage this gen, shoulda gotten Intel. Ok, I’ll know better next upgrade.
Repeat forever.
TBF, AMD has been pretty rock-solid for CPUs for the last 5-6 years. Intel… not so much.
My last two computers have been AMD, the last time I built an Intel system was ~2016
I switched to AMD because of Intel’s chip stability issues. No problems since
When I updated my wife’s computer for Windows 11 I went AMD for that reason as well. They released 2 generations in a row with now well-documented hardware bugs that slowly kill the processors. 13th and 14th gen CPUs will simply have zero resale value if they last long enough to hit the second-hand market. I briefly worked at an MSP at the beginning of last year, and the number of gaming computers that came in via noncommercial walk-in customers for stability issues that ultimately turned out to be the Intel CPU bugs was incredible.
I initially got the 13 series, got a 14 series as the warranty replacement. Even with updated BIOS firmware, the 14 series also began suffering from the same instability issues, sooner than the 13 did. Switched after that, because two chips in a row, in different generations, is not a small mistake. Just happy I didn’t have to pay for that 14 series only to see it have the same problem.
That is insane. I know that’s what’s been happening because I’ve seen it both in the news and in the real world through work, yet I still struggle to comprehend that this is what’s actually happening with these processors.
Yeah it was highly disappointing. I’d always used Intel CPUs just because I picked one chipset and stuck with it. I even put up with the instability issues from the 13th gen for a while. At first I figured something else in the PC was dying on me. It wasn’t until it reached a point where I literally couldn’t run certain applications because they would always crash that the news articles started coming out about the chip issues.
The United States government owns 10% of Intel now.
I know we shouldn’t have brand loyalty, but after the near decade of quad-core-only CPUs from Intel, I can’t help but feel absolute hate towards them as a company.
I had a 3770k until AMD released their Ryzen 1000 series and I immediately jumped over, and within the next generation Intel started releasing 8-core desktop CPUs with zero issues.
I haven’t bought anything Intel since my 3770k and I don’t think I ever will going forward.
The 3770k was legendary. I used it for so long. I upgraded to a 7600k almost a decade ago and now just ordered my first AMD chip (Ryzen 9700X). The Intel chips were solid and I went so long with them; I hope this AMD system will last as long.
Yep, I kept the 3770k until I bought a 7800x3d. It lasted that long, and I gave my son the 3770k system and it was still overkill to play the games he wanted. Rocket League, Minecraft, fortnite etc…
I still have my 3770k but it’s in storage.
I bought a 1700X and was using that until upgrading to a 3700X, which I’m still using today in my main gaming desktop.
I think you’ll be fine!
7700k here; I will upgrade (likely to AMD) one day. But there’s still almost zero reason to.
The last Intel I bought new was the Pentium 4 630. 3.0 GHz, with hyperthreading. That thing was a fucking space heater. And I loved it. But everything new since then has been AMD.
I remember, it was a huge issue for programs. Developers were just not supporting other chipsets because Intel was faster than the competition and mostly cheaper. Then they got more expensive, did some shitty business with MINIX, and stayed the same speed-wise.
So now we see what actual competition does.
I do want them to stay alive and sort themselves out though. Otherwise in a few years it will be AMD who will start outputting overpriced crap and this time there will be no alternative on the market.
They’re already not interested in seriously putting competitive pressure on NVidia’s historically high GPU prices.
I’m personally hoping more 3rd parties start making affordable RISC-V. But yeah, I agree, having Intel stick around would be good for people, as you said.
Not only that, but (as an American) I do want the US to have some fab capability. A strong Intel is in our national security interest.
Yeah, if Taiwan is ever invaded, having US-based fabs will be crucial for world supply. Absolutely want to see Intel survive and TSMC continue to build factories here.
Nothing would say ‘get fucked’ like Intel going belly up and Taiwan exploding. The supply of any new computer parts would be a dumpster fire for years.
Texas instruments computers lol
One can only dream about people fleeing x86-64 and going ARM or, even better, RISC-V.
But no, it’s just changing the dog’s collar; the dog stays the same.
Why though? X Elite lags x86 on battery life, performance and compatibility (and you can’t really run Linux on X Elite).
I am not a fan of Intel, AMD, Nvidia, but what’s the point of moving to ARM for the sake of moving?
Unlike most, I’ve actually been running ARM on my home server for almost a decade. For that use case it makes sense because it’s cheap and well supported.
It would be better to switch to RISC-V because it has no problems with patents and anyone can build a RISC-V CPU, not just two companies.
I would be happy to, but it’s currently not an option for desktop/laptop.
Would be great for an SBC where the OS and apps are open source and performance is less of an issue.
ARM has all the same drawbacks as x86, and it’s not a deus ex machina that gives high performance at low power consumption by magic.
Imagine Europe pushing RISC-V and sharing upgrades with China¹. Flagship performance would reach ARM or even x86-64 levels within a few years.
¹ China is already using RISC-V as much as they can.
I would support that, but it would require European unity and a strategic decision to make a permanent break with the US.