

So it makes me wonder how much Valve is paying them for support, since the upcoming Steam Frame uses a Qualcomm Snapdragon 8 CPU and also runs SteamOS, which is a fork of Arch.


Agentic AI is just a buzzword for letting AI do things without human supervision. It’s absolutely a recipe for disaster. You should never let AI do anything you can’t easily undo, as it’s guaranteed to screw it up at least part of the time. When all it’s screwing up is telling you that glue would make an excellent topping for pizza, that’s one thing, but when it’s emailing your boss that he’s a piece of crap, that’s an entirely different scenario.


Thinking about investing in new AI IPOs?
Not even remotely.


Windows will be the default until suddenly it isn’t. Valve is doing an amazing job of undermining the core of Microsoft’s support. This story would be different if this were a decade ago, but these days most average people do their computing on phones and tablets. The ones sticking to traditional PCs are mostly gamers, and now more than ever Linux is a viable alternative to Windows. Vanishingly few games can’t be played perfectly fine on Linux. Once enough gamers are using Linux it will become the default choice, and once it’s the default choice for gamers it will become the default choice for most people, at least the ones not on phones and tablets.


It’s very popular, to the point where multiple other distros are starting to offer its patched kernel. It’s very focused on gaming performance, particularly around Steam and Proton.


Yeah, that should be completely fine then. Try dual booting; if you don’t have any issues you can always go 100% Linux at some point in the future, and in the meantime the old Windows partition can provide some amount of reassurance if something does go wrong.


Is it a newer Nvidia GPU? If so I believe it pretty much works the same these days. It was mostly the older Nvidia GPUs that seemed to have a lot of problems.


CachyOS seems like the general recommendation. Haven’t used it myself, but I’ve used its kernel, so I guess that counts for something.


They don’t even know how to use a gamepad, never mind a keyboard and mouse. If it doesn’t have a touchscreen and big shiny icons it’s too complicated for them. One step closer to Idiocracy.


Yes, but it’s also 20 years old now and has been discontinued for almost a decade. Likewise the Wii had a PowerPC CPU in it. None of the current consoles use PowerPC; they’re either x86 (Xbox Series and PlayStation 5) or ARM (Switch 2).


Amiga is owned by another company.
Kind of. Apparently the rights are a mess, split across three different companies, one of which is Commodore, although it seems like the current version of AmigaOS is owned by a different company.
The most recent version of AmigaOS is 4.1 which was released in 2014, and requires a PowerPC CPU. It’s kind of hard to argue that’s a modern OS, although apparently a 4.2 release is in the works. The dependency on PowerPC is kind of a problem at this point as their CPUs have stagnated and it’s hard to find any modern ones that aren’t custom CPUs for game consoles (and even then mostly old game consoles).
Additionally there’s the problem of software availability. The new Commodore OS is just a tweaked Linux install so it gets all the Linux software essentially for free. AmigaOS on the other hand is legitimately its own OS and therefore only runs Amiga software.


Really, I had two issues with the interview.
First, about half of it is spent talking about AI garbage that’s irrelevant to pretty much everything. Their argument is essentially “the current off-the-shelf AI setups are built with ARM chips as the general-purpose compute tying together the specialized accelerators doing the actual work,” which might be true but doesn’t explain why that should continue to be the case. Sort of a correlation-does-not-equal-causation type thing.
Secondly, for like 99% of the companies out there doing cloud deployments this is all utterly irrelevant. Most businesses aren’t hyper-focused on shaving clock cycles to the point where they’re obsessing over microarchitecture decisions impacting performance. The reality is that for 99% of services, I/O is going to be your bottleneck, and no amount of twiddling with the CPU architecture is going to improve that in a meaningful fashion. For the overwhelming majority of customers it doesn’t matter in the slightest. Sure, your Amazons and Googles and maybe the fintech sector might care, but for your Walmarts and Bass Pro Shops it’s utterly irrelevant except maybe to shave some cost off a slightly cheaper AWS deployment.
As for the consumer market, this is even more irrelevant. If you’re not currently in the market for an EPYC server, none of this matters to you, which is a shame, because Apple’s success with their ARM CPUs provides an opportunity for a potentially interesting discussion about the relative technical merits of x86 vs. ARM and maybe even RISC-V. Technical merits this interview doesn’t really touch on either; it’s almost entirely a market-focused piece with very little in terms of concrete “ARM beats x86 in this way” outside of a vague hand-wavy “it has a more consistent microarchitecture.”


Seems like a lot of the early LTT crew were chafing a bit under the LTT contract for a variety of reasons and opted to leave and start their own channels. I hope most of them succeed, because honestly I always liked the other hosts on LTT far more than Linus, who usually came off as more comic relief than actual tech news. As the old technical crew left I’ve found myself watching Linus Drop Tips very rarely.


Fully expecting chip makers to just manufacture a bunch of cheap garbage here that eventually ends up in a landfill in order to avoid the tariffs on their expensive chips. With a 100% tariff and per-chip prices in the $300+ range, if they can make a chip for less than $300, then even if they immediately chuck it in the garbage they’re saving money. Imagine a whole tray of cheap 500nm chips that are full of defects because they were made with wafers that failed QA. They manufacture them, document that they exist, dump them in a bin in a warehouse, then just throw the bin away in a year as unsold inventory and write it off as a business loss.
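The back-of-the-envelope math is easy to sketch. This is a purely illustrative model: it assumes (hypothetically) that each cheap throwaway chip produced offsets the 100% tariff on one $300 imported chip, and the $50 throwaway cost is made up, not a real figure.

```python
# Break-even sketch for the scheme above. Assumption (not a real rule):
# each cheap throwaway chip produced exempts one expensive imported chip
# from the 100% tariff. All figures are illustrative.

TARIFF_RATE = 1             # 100% tariff, expressed as a multiplier
EXPENSIVE_CHIP_PRICE = 300  # dollars, per the comment above
CHEAP_CHIP_COST = 50        # assumed cost of a deliberately worthless chip

tariff_per_chip = TARIFF_RATE * EXPENSIVE_CHIP_PRICE   # $300 owed per imported chip
savings_per_chip = tariff_per_chip - CHEAP_CHIP_COST   # net savings per throwaway chip

print(savings_per_chip)  # 250
```

As long as the throwaway chip costs less than the tariff it offsets, the scheme pays for itself, which is the whole point.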


Reported view counts are also important for sponsorships, as sponsored video payouts are often tied to hitting specific view counts, and both landing sponsorships in the first place and the rates they pay are typically conditional on view counts as well. So yes, even though it doesn’t directly impact ad revenue, it still directly impacts total channel revenue for anyone who accepts sponsorships.
All that said, Google caused this entire mess by bundling their view counting in with their telemetry. If they just reported the raw download stats for the streams instead of trying to determine every last detail of who is watching the video (for all that juicy advertising data) this problem wouldn’t have happened in the first place.


Basically, instead of counting views via the actual requests for the videos, YouTube uses a separate call that essentially says “hey, someone watched this video.” Rather than using a hard-coded list of URLs to block, which would quickly go stale, ad blockers use one of a couple of different third-party filter lists, the most popular of which is EasyList. EasyList decided to block the URL that YouTube uses to register views on the principle that it was a privacy violation, because it not only registers “hey, someone watched this” but also captures exactly who watched it, which allows Google to track your viewing habits.
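The mechanics are simple enough to model: a filter list is just a set of URL patterns that the blocker checks every outgoing request against. Here’s a minimal sketch; the `youtube.com/api/stats/` pattern is a made-up stand-in for whatever endpoint EasyList actually targeted, not the real rule.

```python
# Minimal model of filter-list request blocking. The blocked pattern is a
# hypothetical stand-in, NOT the actual EasyList rule.

BLOCKED_PATTERNS = ["youtube.com/api/stats/"]  # assumed tracking endpoint

def should_block(url: str) -> bool:
    """Return True if the request URL matches any blocked pattern."""
    return any(pattern in url for pattern in BLOCKED_PATTERNS)

# The video stream itself comes from a different URL, so playback is
# untouched...
print(should_block("https://rr1.example.googlevideo.com/videoplayback?id=abc"))  # False
# ...but the separate "someone watched this" beacon gets dropped, so the
# view is never registered.
print(should_block("https://www.youtube.com/api/stats/playback?docid=abc"))  # True
```

Because the blocker only sees the URL, anything that looks like telemetry gets dropped right along with the ads, which is exactly how view reporting became collateral damage.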


Literally the only reason old-school gamers play on CRTs is that old games were designed for the blurry, low-resolution displays of the era and so look kind of bad on modern crisp displays. You could just smear Vaseline on a modern LCD and get roughly the same effect, but using a CRT is less messy.


It’s also utter garbage. We abandoned CRTs because they sucked. They’re heavy, waste tons of space, guzzle power, and have terrible resolution. Even the best CRT ever made is absolutely destroyed by the worst of modern LCDs. The only advantage you could possibly come up with is that in an emergency you could beat someone to death with a CRT. Well, that and the resolution was so garbage they had a natural form of antialiasing, but that’s a really optimistic way of saying they were blurry as shit.


This is such a strange concept. Like, fundamentally, a subscription is just a mechanism to let a viewer easily keep track of new content on a channel. By viewing the channel’s content you’re engaging in 100% of the interaction you should be expected to have with a subscribed channel. If Google really wanted to address the problem of old subscriptions people are ignoring, they should just prompt people to unsubscribe from channels they haven’t watched any videos from in a long time. Instead they’re fucking with view counts, because that saves them money. The whole thing is fishy, but Google has always treated being inscrutable and capricious as if those were virtues.
So the way the statement about Qualcomm supporting Linux was phrased made it seem like a blanket statement rather than one referring specifically to the X1 Elite. The fact that Qualcomm’s Linux support seems to vary wildly based on the specific CPU is interesting, and suggests it’s less about the CPU or Linux and more about the visibility and importance of the companies using that CPU. The X1 Elite got first-class Windows support (although it sounds like only some specific laptops did) because certain large manufacturers were using it. Likewise, the 8 Elite Gen 5 is getting first-class Linux support because Valve is using it in a high-visibility project.
If there’s a silver lining, it’s that Valve sounds like it’s doing right by the FOSS community and is paying a company to contribute bug fixes and improvements to the Vulkan drivers and the FEX project, both for ARM in general and for this specific CPU. That, combined with Qualcomm themselves wanting to look good and provide support, should mean at least this CPU works very well in Linux, and maybe that will also make it a little easier to support other Qualcomm CPUs as well. It’s just a shame that that level of Linux support by Qualcomm doesn’t extend to all their products.