

Would be interesting to see the stats for revenue by game, price weighted by volume. If someone charges $300 for a game that no one bought, then it shouldn’t count, hypothetically.
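For illustration, a rough sketch in Python (all titles, prices, and unit counts below are made up): weight price by units sold, and the $300 game with zero sales drops out of both the revenue and the average on its own.

```python
# Hypothetical catalog data; revenue = price * units sold.
games = [
    {"title": "A", "price": 300.00, "units": 0},
    {"title": "B", "price": 19.99, "units": 50_000},
    {"title": "C", "price": 59.99, "units": 2_000},
]

# Drop games nobody bought before computing price stats.
sold = [g for g in games if g["units"] > 0]

total_revenue = sum(g["price"] * g["units"] for g in sold)
# Volume-weighted average price: each price counts once per unit sold.
weighted_avg_price = total_revenue / sum(g["units"] for g in sold)

print(f"total revenue: ${total_revenue:,.2f}")
print(f"volume-weighted average price: ${weighted_avg_price:.2f}")
```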


While true, the issue is that Debian’s release cadence means they will always be “behind” kernel- and Wine-wise.
They are also more purist and less likely to facilitate proprietary bits. Last time I tried Wine there, a lot of apps didn’t work because the work to enable non-free fonts hadn’t been done. So they may have the same general packaging strategy, but the vintage of content and the scope are distinctly different from more aggressive distributions.


What tax prep software do you use? I thought they had all gone to just webapps at this point …


This all presumes that OpenAI can get there and, further, that they are exclusively positioned to get there.
Most experts I’ve seen don’t see a logical connection between LLMs and AGI, and OpenAI has all their eggs in that basket.
To the extent LLMs are useful, OpenAI arguably isn’t even the best at it. Anthropic tends to make them more useful than OpenAI does, and now Google is outperforming them on the relatively pointless benchmarks that were OpenAI’s bragging point. They aren’t the best, the most useful, or the cheapest. They were first, but that first-mover advantage hardly matters once you get passed.
Maybe it would be different if they were demonstrating advanced robotics control, but other companies are mostly showing that OpenAI remains “just a chatbot”, with the more useful usage of their services going through third parties that tend to be LLM-agnostic, and increasingly I see people select non-OpenAI models as their preference.


Problem is that AI doesn’t present as a “genre”, and you get AI slop across the gamut.
I started a video because the title seemed like something I was interested in and the thumbnail seemed fine. Then, within the first few seconds, it was obviously lazy AI slop.
Short of limiting yourself to known acceptable channels, you can’t really stave off the AI slop. Some categories get hit less often, but it’s all over the place.


Yeah, but in relatively small volumes and mostly as a ‘gimmick’.
The Cell processors were ‘neat’ but enough of a PITA as to largely not be worth it, combined with an overall package that wasn’t really intended to be managed headless in a datacenter and sub-par networking that sufficed for internet gaming but not as a cluster interconnect.
IBM did have higher-end Cell processors, at predictably IBM-level pricing, in more appropriate packaging with proper management, but it was pretty much a commercial flop since, again, the Cell processor just wasn’t worth the trouble to program for.


Unlikely.
Businesses generally aren’t that stoked about anything other than laptops or servers.
To the extent they have desktop grade equipment, it’s either:
On the server side, the Steam Machine isn’t that attractive, since it’s not designed either to be slapped in a closet and ignored or to be slotted into a datacenter.
Putting all this aside, businesses love simplicity in their procurement. They aren’t big on adding a vendor for a specific niche when they can use an existing vendor, even if in theory they could shave a few dollars in cost. The logistical burden of adding the Steam Machine would likely offset any imagined savings, especially if they had to own re-imaging and licensing when they are accustomed to product keys embedded in the firmware via vendor preloads today.
Maybe you could worry a bit more about the consumer market, where people micro-manage costs and are more willing to invest their own time, but even then the market of non-laptop home users who don’t think they need nVidia yet still need something better than an integrated GPU is so small that it shouldn’t be a worry either.


Consoles are sold at a loss, and they recover it with games because the platform is closed.
Sometimes, but evidently not currently. Sources seem to indicate that only Microsoft says they are selling at a loss, which seems odd since their bill of materials looks like it should be pretty comparable to the PS5…
I’ll agree with the guess of around $800, but like you say, with the supply pressure on RAM and storage as well as the tariff situation all over the place, it’s hard to say.


I think it’s a response to the sentiment that Sony somehow got bit by selling the PS3 at a loss because it triggered huge supercomputing purchases of the systems that Sony wouldn’t have liked, and that if Valve got too close to that, suddenly a lot of businesses would tank it by buying too many units and never buying any games.
Sony loved the exposure and used it as marketing fodder that their game consoles were “supercomputer” class. Just like they talked up folding@home on them…


But the reason for the expense is largely the weight.
Yes, we can support massive weights at great expense. But even in skyscrapers, you aren’t expecting to cram every floor with equipment that weighs over a ton and is supported by less than a square meter of floor.
It’s not just armchair engineering; I work in the industry, and commonly racks are preferentially placed on the ground floor, with weight restrictions as you go up and even marked paths the racks need to stay on when on upper floors, due to limitations of the reinforcements.
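To make the arithmetic concrete (every number here is a rough assumption, not a spec from any particular facility):

```python
# A heavily loaded 42U rack can exceed a metric ton over a small footprint.
rack_weight_kg = 1200      # assumed fully loaded rack
footprint_m2 = 0.6 * 1.2   # assumed standard-ish rack footprint, ~0.72 m^2

print(f"{rack_weight_kg / footprint_m2:.0f} kg/m^2")  # ~1670 kg/m^2

# Typical office floor live-load ratings are on the order of a few hundred
# kg/m^2, which is why dense racks end up on ground floors or reinforced paths.
```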
Skyscrapers are largely impractical structures built for the sake of showing off, with whatever value they have coming from keeping people close to each other. No one builds a skyscraper by itself miles from anything else, and that is exactly where datacenters get built, because they don’t need proximity.


I don’t see my response as “blame”; it’s simply a statement that not every facet of your experience is necessarily unrelatable to others. This one is pretty innocuous: forgetting useful stuff in favor of stuff that is emotionally impactful. That isn’t a bad thing; it’s just something everyone experiences.
I’m not a huge fan of letting people believe certain facets of their lives are differences when in fact they’re a place for common ground. I dislike that any time humans organize ourselves into groups, we fixate on how we can minimize what we have in common with people outside the group and fail to recognize commonality.
But how can you be sure? You are comparing your internal perception of the phenomenon that no one else but you can perceive to the internal perception of others that you cannot perceive.
Seems like it’s a matter of qualia: utterly subjective experience that is unshareable and thus incomparable between people.
Not every facet of one’s existence must somehow differ between the neurotypical and the neurodivergent.


Yes, just some people figuring out that Grok was steered toward ass-kissing Musk no matter what, and exploiting that for funny output. So the takeaways are:


help explain the relationships in a complicated codebase succinctly
It will offer an explanation, one that sounds consistent, but it’s a crap shoot whether or not it accurately described the code, and there’s no easy way of knowing if the description is good or bad without reviewing the code yourself.
I do try to use the code review feature, though that can often declare bugs based on bad assumptions as well. It’s been wrong more times than it’s caught something for me.


No, just complete. Whatever the dude does may have nothing to do with what you needed it to do, but it will be “done”.


Note that this outage by itself, based on their chart, was kicking out errors over a span of about 8 hours. This one outage alone would have almost entirely blown their downtime allowance under a 99.9% availability criterion.
If one big provider actually delivered 99.9999%, that would be about 30 seconds of total outage over a typical year. Not even long enough for users to generally be sure there was an ‘outage’. That wouldn’t be bad at all.
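Back-of-the-envelope, the downtime budgets work out like this (a quick sketch, nothing provider-specific):

```python
# Yearly downtime allowed at a given availability level.
HOURS_PER_YEAR = 365 * 24  # 8760, ignoring leap years

def downtime_budget_hours(availability: float) -> float:
    return HOURS_PER_YEAR * (1 - availability)

# ~8.76 hours/year at "three nines": one 8-hour outage nearly spends it all.
print(f"99.9%    -> {downtime_budget_hours(0.999):.2f} hours/year")
# ~31.5 seconds/year at "six nines".
print(f"99.9999% -> {downtime_budget_hours(0.999999) * 3600:.1f} seconds/year")
```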


One, this is all talk to convince investors to throw money at them, so they’ll cherry-pick their interpretation.
In this case I think they’re referring to already having the real estate, buildings, power, and cooling. So “all” they have to do is rip out their rigs and dump a bunch of nVidia gear in. All they need is a few hundred million from some lucky investors and they’ll be off…


Same way a lot of the “AI” companies make money: investors that have no idea but want to get in on the ground floor of the next nVidia or OpenAI.


I think that one was also significantly a publicity thing: they made videos and announced it as a story about the Air Force doing something “neat”, connecting a relatable gaming platform to supercomputing. I’m sure some work was actually done, but I think they wouldn’t have bothered if the device were not so “cool”.
There were a handful of such efforts that pushed a few thousand units. Given PS3 volumes were over 80 million, I doubt Sony lost any sleep over those. If anything, I recall Sony using those as marketing collateral to say how awesome their platform was, with the losses from those efforts being well worth the marketing value.


It’s pretty much a vibe coding issue. What you describe I can recall being advocated forever: the project manager’s dream that if you model and spec things out enough and perfectly model the world in your test cases, then you are golden. Except the world has never been so convenient, and you bank on the programming being reasonably workable by people to compensate.
The problem is people who think they can replace understanding with vibe coding. If you can only vibe code, you will end up with problems you cannot fix and the LLM can’t either. If you can fix the problems, then you are not inclined to toss in overly long chunks of LLM output, because they generate ugly, hard-to-maintain code that tends to violate all sorts of programming best practices.