• 0 Posts
  • 447 Comments
Joined 2 years ago
Cake day: June 16th, 2023

  • It’s pretty much a vibe coding issue. What you describe I can recall being advocated forever: the project manager’s dream that if you model and spec things out thoroughly enough and perfectly model the world in your test cases, then you are golden. Except the world has never been so convenient, and you bank on the programming being reasonably workable by people to compensate.

    Problem is people who think they can replace understanding with vibe coding. If you can only vibe code, you will end up with problems you cannot fix and the LLM can’t either. If you can fix the problems, then you are not inclined to toss in overly long chunks of LLM output, because LLMs generate ugly, hard-to-maintain code that tends to violate all sorts of programming best practices.





  • This all presumes that OpenAI can get there, and further that it is exclusively in a position to get there.

    Most experts I’ve seen don’t see a logical connection between LLMs and AGI, yet OpenAI has all their eggs in that basket.

    To the extent LLMs are useful, OpenAI arguably isn’t even the best at it. Anthropic tends to make them more useful than OpenAI does, and now Google’s models are outperforming theirs on the relatively pointless benchmarks that were OpenAI’s bragging point. They aren’t the best, the most useful, or the cheapest. They were first, but that first-mover advantage hardly matters once you get passed.

    Maybe if they were demonstrating advanced robotics control, but other companies are mostly the ones showing that off, while OpenAI remains “just a chatbot”, with the more useful usage of their services going through third parties that tend to be LLM agnostic, and increasingly I see people select non-OpenAI models as their preference.



  • Yeah, but in relatively small volumes and mostly as a ‘gimmick’.

    The Cell processors were ‘neat’, but enough of a PITA to largely not be worth it, combined with an overall package that wasn’t really intended to be managed headless in a datacenter and sub-par networking that sufficed for internet gaming but not as a cluster interconnect.

    IBM did have higher-end Cell processors, at predictable IBM-level pricing, in more appropriate packaging with proper management, but it was pretty much a commercial flop since, again, the Cell processor just wasn’t worth the trouble to program for.


  • Unlikely.

    Businesses generally aren’t that stoked about anything other than laptops or servers.

    To the extent they have desktop grade equipment, it’s either:

    • Some kiosk grade stuff already cheaper than a game console
    • Workstation grade stuff for which they will demand nVidia, or otherwise just won’t bother

    On servers, the Steam Machine isn’t that attractive since it’s not designed either to be slapped in a closet and ignored or to be slotted into a datacenter.

    Putting all this aside, businesses love simplicity in their procurement. They aren’t big on adding a vendor for a specific niche when they can use an existing vendor, even if in theory they could shave a few dollars in cost. The logistical burden of adding Steam Machine would likely offset any imagined savings. Especially if they had to own re-imaging and licensing when they are accustomed to product keys embedded in the firmware when they do vendor preloads today.

    Maybe you could worry a bit more about the consumer market, where you have people micro-managing costs and will be more willing to invest their own time, but even then the market for non-laptop home systems that don’t think they need nVidia but still need something better than integrated GPUs is so small that it shouldn’t be a worry either.




  • But the reason for the expense is largely the weight.

    Yes, we can at great expense support massive weights. But even in skyscrapers, you aren’t expecting to just cram every floor with equipment that weighs over a ton and is supported by less than a square meter of floor.

    It’s not just armchair engineering; I work in the industry, and commonly you have racks preferring the ground floor, weight restrictions going up, and even marked paths that the racks need to stay on when on upper floors due to limitations of the reinforcements.

    Skyscrapers are largely impractical structures done for the sake of showing off, with any value based on keeping people close to each other. No one builds a skyscraper by itself miles from anything else, yet that is exactly where datacenters get built, because they don’t need proximity.


  • I don’t see my response as “blame”, it’s simply a statement that not every facet of your experience is necessarily unrelatable to others. This one is pretty innocuous, forgetting useful stuff for stuff that is emotionally impactful. This isn’t a bad thing, it’s just something that everyone experiences.

    I’m not a huge fan of the concept of just letting people believe certain facets of their lives are differences when in fact it’s a place for common ground. I dislike that any time humans organize ourselves into groups, we fixate on how we can minimize what we have in common with people outside the group, and fail to recognize commonality.


  • jj4211@lemmy.world to Autism@lemmy.world · Do you agree?
    6 days ago

    But how can you be sure? You are comparing your internal perception of the phenomenon that no one else but you can perceive to the internal perception of others that you cannot perceive.

    Seems like it’s a matter of qualia, utterly subjective experience that is unshareable and thus incomparable between others.

    Not every facet of one’s existence must somehow be different between neurotypical and neurodivergent.





  • Note that this outage by itself, based on their chart, was kicking out errors over a span of about 8 hours. This one outage alone would have almost entirely blown their downtime allowance under a 99.9% availability criterion.

    If one big provider actually delivered 99.9999%, that would be about 30 seconds of total outages over a typical year. Not even long enough for users to generally be sure there was an ‘outage’. That wouldn’t be bad at all.
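    A quick sketch of the arithmetic behind those figures; the availability targets are the only inputs, everything else is just unit conversion:

    ```python
    # Downtime allowance per year implied by an availability target.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~31.56 million seconds

    def downtime_allowance_seconds(availability: float) -> float:
        """Seconds of permitted downtime per year at a given availability."""
        return SECONDS_PER_YEAR * (1.0 - availability)

    # 99.9% ("three nines"): roughly 8.8 hours per year, so one 8-hour
    # incident nearly exhausts the whole annual budget.
    print(downtime_allowance_seconds(0.999) / 3600)   # ~8.77 hours

    # 99.9999% ("six nines"): roughly 31.6 seconds per year.
    print(downtime_allowance_seconds(0.999999))       # ~31.56 seconds
    ```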




  • I think that one was also significantly a publicity thing; they made videos and announced it as a neat story about the Air Force doing something “neat” and connecting a relatable gaming platform to supercomputing. I’m sure some work was actually done, but I think they wouldn’t have bothered if the same sort of device were not so “cool”.

    There were a handful of such efforts that pushed a few thousand units. Given PS3 volumes were over 80 million, I doubt Sony lost any sleep over those. If anything, I recall Sony using those as marketing collateral to say how awesome their platform was, with any losses from those efforts well covered by the marketing value.