• 16 Posts
  • 928 Comments
Joined 2 years ago
Cake day: October 20th, 2023

  • I’ll also add on that there are a LOT of blog posts and YouTube Shorts about “Game X is 20% faster on Linux than Windows!!!” that everyone loves to regurgitate. And the reality is that it was a single outlier, or it all boils down to Steam distributing “good enough” precompiled shaders to Linux but not Windows (and let’s not get into the weeds of why).

    Whereas GN, especially since “All New Data” a few years back, has focused very heavily on reproducible, “good” data. That is why Steve basically apologized for not having error bars and for what looks like messy data on a few of those runs. And they’ve done entire videos on their testing methodology, which often involves MANY runs to average out the noise.

    So, without being able to explain exactly why? I doubt they will EVER put Windows and Linux data on the same page of their website. But… someone who cares will be able to see trends.


  • And you can similarly do most/all of your dev work in a container that you spin up with a podman alias (fuck hashicorp with a rusty metal pole but damn if Vagrant wasn’t awesome). Hell, there are a lot of arguments that you should.

    It inherently becomes a question of what your primary use case for a machine is and how often you spend fighting it to accomplish that. And, personally, I run Linux so I DON’T have to fight my OS. Which… is really weird when you think about it but holy crap Windows and Mac are annoying.

    Immutable OSes are amazing for corporate environments and HTPC/Gaming computers are another solid use case. But if your primary focus is whether you can be a developer (as indicated by the doomemacs ask)… you are gonna be cranky.


  • I’m of a few minds on this.

    First and foremost: I am a huge Gamers Nexus fan and think Steve et al are a great complement to Wendell when it comes to the decade of the Year Of The Linux Desktop. And I love that they actually addressed the elephant in the room: quite often you actively don’t want to use the Linux-native binaries for a game.

    But I do think that having Linux as the second-class benchmark is inherently going to cause problems. Assuming they stick to doing a batch every couple of months, that… okay, ain’t nobody actually buying hardware unless they have to. But still. And this was apparently collected during one of the months where Nvidia was a complete shitshow. But Dragon’s Dogma 2 completely breaking is the kind of thing where… look, I became WAY too aware of exactly how Denuvo registers a machine while I was debugging that. I was able to get DD2 to run beautifully on my PC but… as a HUGE DD1 fan, even I think I wasted my life doing that (would do it again though).

    But I keep thinking of how many influencers have done a variant of “tech isn’t fun anymore”. And… it kind of isn’t. But from the editorializing from Steve et al over the past year or so, it is clear they are excited that things are actually changing, sometimes week to week, and that so many of these problems are ACTUALLY solvable by users. Sometimes the fix is trivial (check ProtonDB for which settings to use) and sometimes you find yourself going down a rabbit hole of just how bad the Nioh 2 PC port actually was.

    I suspect this ends with the vast majority of outlets embracing “XBOX For PC” in a year or two… and Steve looking even more like the crazy old man of PC reviews. But I do think this will go a long way towards helping the fence sitters get away from MS.



  • I use Bazzite for my HTPC (AMD NUC).

    For a “set it and forget it” gaming console experience? It is awesome. It feels like I already have a GabeCube under my TV (that I bought for probably half the price…). And when I have to do more complicated things than “run the update once a month”, I just ssh in from either my desktop or laptop.

    But… it is an immutable/atomic distro. So if the packages you want to add are Flatpaks or AppImages? You are probably fine. Otherwise? You get into a mess where you are layering packages onto the base image (rpm-ostree, if I remember right) and kinda feel like you are playing with fire. I did that to get iperf3 installed to test some networking upgrades, and it was mostly painless, but it was also a really bad experience versus a plain sudo dnf install iperf3. And… even on machines where I spend 90% of my time ssh’ing into servers, I still tend to want to install a good amount of local packages as a developer.

    So my suggestion would be to stick to Bazzite for gaming first platforms and continue to use whatever distro you like (Fedora for the win!) for “real” computers.


    Also, if you aren’t as annoyed by atomic distros as I am, I would still be wary of Bazzite. They have a lot of different SKUs and I don’t care enough to try to parse what each one does. But the common use case is to basically treat a machine like a Steam Deck… which means you boot into Big Picture with essentially no login screen or a REALLY insecure PIN code. And then you switch to desktop mode with a single click.

    There are ways to harden that (and there is very much an argument about whether you need to harden a machine in your home). And Linux, generally, has very good protections by actually requiring auth for sudo. But I already feel sketchy that I am logged into Steam/GOG on a box with almost no protections. Then again, I also live in an environment where I don’t have to worry about someone buying 10k in Fortnite bucks on my TV.


  • A lot of people don’t understand how AI training and AI inference work, they are two completely separate processes.

    Yes, they are. Not sure why you are bringing that up.

    For those wondering what the actual difference is (possibly because they don’t seem to know):

    At a high level, training is when you ingest data to create a model based on characteristics of that data. Inference is when you then apply a model to (preferably new) data. So think of training as “teaching” a model what a cat is, and inference as having that model scan through images for cats.
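
A toy sketch of that split, with made-up data and a nearest-centroid stand-in for a real model (not any particular library's API):

```python
# Toy illustration of training vs. inference: "cat" vs "not cat" on
# hypothetical 2D feature vectors. Everything here is made up for clarity.

def train(labeled_data):
    """Training: ingest labeled examples and build a model (per-class centroids)."""
    sums, counts = {}, {}
    for features, label in labeled_data:
        sums.setdefault(label, [0.0] * len(features))
        counts[label] = counts.get(label, 0) + 1
        sums[label] = [s + f for s, f in zip(sums[label], features)]
    return {label: [s / counts[label] for s in sums[label]] for label in sums}

def infer(model, features):
    """Inference: apply the already-trained model to new, unseen data."""
    def dist(label):
        return sum((c - f) ** 2 for c, f in zip(model[label], features))
    return min(model, key=dist)

# "Teach" the model what a cat is...
model = train([([0.9, 0.8], "cat"), ([0.8, 0.9], "cat"),
               ([0.1, 0.2], "not cat"), ([0.2, 0.1], "not cat")])
# ...then scan new inputs for cats.
print(infer(model, [0.85, 0.9]))  # → cat
```

The two phases never have to run together: training is the expensive, data-hungry step; inference is just applying the frozen result.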

    And a huge part of making a good model is providing good data. That is, generally speaking, done by labeling things ahead of time. Back in the day it was paying people on Amazon Mechanical Turk to answer “hot dog or not hot dog”. These days… it is “anti-bot” technology that gets that labeling for free (think about WHY every single website cares which squares contain a fire hydrant or a bicycle…)

    But labels are ALSO just simple metrics like “did the user use what we suggested”. Instead of “not hot dog” it is “good reply” or “no reply” or “still read email” or “ignored email” and so forth.
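
As a hypothetical sketch of that pipeline (the event names and label strings below are all invented), user actions map straight into training labels:

```python
# Hypothetical sketch: instead of paying people to label "hot dog / not hot dog",
# user actions on an AI suggestion become the labels. All names are made up.

ACTION_TO_LABEL = {
    "clicked_suggested_reply": "good reply",
    "typed_own_reply": "no reply",        # suggestion rejected, email still answered
    "opened_only": "still read email",
    "never_opened": "ignored email",
}

def label_events(events):
    """Turn raw (user_id, suggestion_id, action) events into training examples."""
    return [
        {"suggestion": sid, "label": ACTION_TO_LABEL[action]}
        for _uid, sid, action in events
        if action in ACTION_TO_LABEL
    ]

examples = label_events([
    ("u1", "s42", "clicked_suggested_reply"),
    ("u2", "s42", "never_opened"),
])
print(examples)
```

No survey, no payment: the product telemetry IS the labeling step.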

    And once you know what your pain points are with TOTALLY anonymized user data, you can then “reproduce” said user data to add to your training set. Which is the kind of bullshit facebook, allegedly, has done for years where they’ll GLADLY delete your data if you request it… but not that picture of you at the McDonald’s down the street because that belongs to Ronjon Buck who worked there one summer. But they’ll gladly anonymize your user data so the picture of you actually just corresponds to “User 25156161616” that happens to be the sibling of your sister and so forth…

    in fact a lot of research is being done right now trying to make it possible to do both because it would be really handy to be able to do them together and it can’t really be done like that yet.

    That is literally just a feedback loop and is core to pretty much any “agentic” network/graph.

    Go ahead and do so; they will have separate sections specifically about the use of data for training. Data privacy is regulated by a lot of laws, even in the United States, and corporate users are extremely picky about that sort of stuff.

    There also tend to be laws about opting in and forced EULA agreements. It is almost like the megacorps have acknowledged that they’ll just do whatever and MAYBE pay a fee after they have made so much more money already.


  • Understand that basically ANYTHING that “uses AI” is using you for training data.

    At its simplest, it is the old fashioned A/B testing where you are used as part of a reinforcement/labeling pipeline. Sometimes it gets considerably more bullshit as your very queries and what would make you make them are used to “give you a better experience” and so forth.

    And if you read any of the EULAs (for the stuff that google opted users into…) you’ll see verbiage along those lines.

    Of course, the reality is that google is going to train off our data regardless. But that is why it is a good idea to decouple your life from google as much as possible. It takes a long ass time but… no better time than today.


  • As it stands? Cloudflare is still incredibly effective at protecting customers from those DDOS attacks. Which, depending on your hosting solution, can mean very noticeable monetary savings because YOUR hardware/connection didn’t spike. And, regardless, can mean noticeable monetary savings as your engineers didn’t need to recover a crashed system because your setup was just sitting there idle.

    That said: If you truly need high availability? You need to do what downdetector did and have alternatives ready in the event that Cloudflare falls over. Same as with your ISP… which should be ISPs plural.
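
A toy sketch of that “have alternatives ready” idea (the provider names and the is_up() health check below are hypothetical stand-ins for real health checks or DNS failover):

```python
# Hypothetical failover sketch: try each provider in order, fall back when one
# is down. Provider names and endpoints are invented for illustration.

PROVIDERS = ["cloudflare", "backup-cdn", "origin-direct"]

def fetch(path, is_up):
    """Return (provider, url) for the first provider whose health check passes."""
    for provider in PROVIDERS:
        if is_up(provider):
            return provider, f"https://{provider}.example.net{path}"
    raise RuntimeError("all providers down -- page the humans")

# Simulate Cloudflare falling over:
provider, url = fetch("/status", is_up=lambda p: p != "cloudflare")
print(provider)  # → backup-cdn
```

The hard part isn’t the loop, it’s actually keeping the backup path configured, tested, and warm before the outage.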








  • Agentic AI is just a buzzword for letting AI do things without human supervision

    No, it isn’t.

    As per IBM https://www.ibm.com/think/topics/agentic-ai

    Agentic AI is an artificial intelligence system that can accomplish a specific goal with limited supervision. It consists of AI agents—machine learning models that mimic human decision-making to solve problems in real time. In a multiagent system, each agent performs a specific subtask required to reach the goal and their efforts are coordinated through AI orchestration.

    The key part being the last sentence.

    It’s the idea of moving away from a monolithic (for simplicity’s sake) LLM toward a system where each “AI” serves a specific purpose. So imagine a case where you have one “AI” to parse your input text and two or three other “AIs” to run different models based upon which use case your request falls into. The result is MUCH smaller models (that can often be colocated on the same physical GPU or even CPU) that are specialized, rather than an Everything model that can search the internet, fail at doing math, and tell you you look super sexy in that Minecraft hat.

    And… anyone who has ever done any software development (web or otherwise) can tell you: That is just (micro)services. Especially when so many of the “agents” aren’t actually LLMs and are just bare metal code or databases or what have you. Just like how any Senior engineer worth their salt can point out that isn’t fundamentally different than calling a package/library instead of rolling your own solution for every component.
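
A rough sketch of that orchestration pattern — note the “agents” here are plain functions, which is exactly the (micro)services point: nothing requires them to be LLMs. All names are made up:

```python
# Toy "agentic" orchestration: a router classifies the request and hands it
# to a small specialized agent. Everything here is an invented stand-in.

def math_agent(task):
    # Toy arithmetic "specialist" (eval is fine for a sandboxed toy, not production).
    return str(eval(task, {"__builtins__": {}}))

def search_agent(task):
    # Stand-in for a retrieval service -- could be a database, not a model at all.
    return f"searching for: {task}"

def route(request):
    """The orchestrator: pick which specialized agent handles this request."""
    if any(op in request for op in "+-*/"):
        return math_agent(request)
    return search_agent(request)

print(route("2 + 3"))           # dispatched to the math specialist → 5
print(route("minecraft hats"))  # dispatched to the search specialist
```

Swap the functions for network calls and you have a service mesh with extra marketing.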

    The idea of supervision remains the same. Some orgs care about it. Others don’t. Just like some orgs care about making maintainable code and others don’t. And one of the bigger buzz words these days is “human in the loop” to specifically provide supervision/training data.

    But yes, it is very much a buzzword.




  • From everything we have heard… I would be shocked if it wasn’t pretty damned close.

    Gamers Nexus touched on the pricing info they were given. Go watch the video to confirm but off the top of my head:

    • The Steam Machine will be priced competitively with an entry level computer
    • The Steam Frame will be below the price of an Index

    So what that translates to is

    • The Steam Machine will likely be in the 800-1500 USD range
    • The Steam Frame will be up to 1000 USD

    Which… sounds about right. The Steam Frame is going to use a comparatively cheap Snapdragon processor but it still needs all the HMD tech. The Facebook Quest 3 is around 500 USD and considering economy of scale… that is probably the price floor for the Steam Frame.

    And the Steam Machine? That is rocking a proper Zen 4 with 16 gigs of DDR5 and 8 gigs of GDDR6 VRAM. Considering how expensive RAM already is and how that probably ain’t going down until late 2026 at the earliest… And it is worth noting that people lost their shit over the ROG XBOX ALLY X S 45 WHATEVER being 1k but… spec-wise that lines up with similar laptops. The display is a decent chunk of that cost, which the Steam Machine won’t have, but… yeah.

    Computers is expensive. Especially in a Post Liberation Day world. It will be a miracle if the base console price (because you can bet the PS6 is gonna do the same stupid bullshit MS did with the Series S…) is below 900 USD with the “real” price being well over 1k. And the Steam Machine is going to be priced along those lines because Valve (presumably) doesn’t have a bunch of warehouses full of parts from five years ago.


    The good news is that if you already have a gaming PC, and don’t need the Valve branding, you can get a pretty solid AMD NUC for 300-600 USD that will run Bazzite perfectly and play a lot of your games locally with the rest streaming over Moonlight or Steam Link. GMKtec pretty much have this market on lock and I personally love my K11 (overkill but also really nice to not have to walk upstairs to wake my desktop for every single game).

    You’ll have the same HDMI 2.1 nonsense on AMD as the Steam Machine will (so no VRR over HDMI) but there are workarounds for that (basically you flash a DisplayPort dongle, which is REAL sketchy). And you’ll be able to take advantage of most of the software improvements Valve is pushing for SteamVR, SteamOS, and Steam Link that are going to be coming rapidly for the launch. MUCH less oomph but… people who are expecting proper 4k experiences out of a Steam Machine are lying to themselves.