

Did not expect to see a Voosh video in my [email protected]; hopefully folks here won't be put off by his strong (but accurate) words describing the situation.
Yo whatup
More complex forms of reasoning in the context of "reasoning systems" show up in video game NPC AI. It takes the current game state and "reasons" about what action to take now or in the near future. Really good video game AI will use your velocity to pre-aim projectiles at where you'll be in the future instead of where you are currently. The NPC analogy is one of the very things being described by the term.
If you truly believe that, you either fundamentally misunderstand the definition of that word or are being purposely disingenuous, as you AI brown-nose folk tend to be. To pretend for a second you genuinely just don't understand how to read: LLMs, the most advanced "AI" they are trying to sell everybody, are as capable of reasoning as any compression algorithm, JPG, PNG, WebP, ZIP, tar, whatever you want. They cannot reason. They take some input and generate an output deterministically. The reason the output changes slightly is that they deliberately inject randomness, for complicated but important reasons.
Again, to recap: LLMs and similar neural-network "AI" are as capable of reasoning as any other computer program you interact with, knowingly or unknowingly, that being not at all. Your silly Wikipedia page is about the very specific term "reasoning system," which would include things like standard video game NPC AI, such as the zombies in Minecraft. I hope you aren't stupid enough to say those are capable of reasoning.
They can't reason. LLMs, which all the latest and greatest models like GPT-5 or whatever still are, generate output by taking every previous token (simplified) and using them to predict the most likely next token. Thanks to their training, this results in pretty good human-looking language, among other things like somewhat effective code output (thanks to sites like Stack Overflow being included in the training data).
Generating images works essentially the same way but is more easily described as reverse JPG compression. You think I'm joking? No, really: they start out with static and then transform the static using a bunch of wave functions they came up with during training. LLMs and the image-generation stuff are equally able to reason, that being not at all whatsoever.
No, actually! Musk's entire involvement with PayPal was being fired by the company he founded, which later down the road was bought out by PayPal after the people who fired him for incompetence turned it around and made it valuable enough that PayPal wanted it.
I presume you don't live in the US, because lol. Living is more expensive than ever before in history, but shit like phones and TVs is cheap, yay?
For now. They are dirt cheap for now, and extremely unprofitable. We're looking at more than a doubling of the subscription price before they even approach profitability.
What’s wrong with liking cuties?
Sometimes you don’t feel like eating a large orange.
I know who it is, gimme a link to the art you doofus
This pizza has NO GODDAMN SAUCE
Most dubs are still objectively worse than the sub in VA quality. That's just how it is. It's certainly gotten better, and some dubs, such as the Konosuba dub, are fantastic; so good it's worth watching both.
Well, they're an ex-chairman, so hopefully that's a good sign?
LLMs making you code faster means you're slow, not that LLMs are fast.
It's not an assumption, it's just a matter of practical reality. If we're at best a decade off from that point, why pretend it could suddenly, unexpectedly improve to the point it's unrecognizable from its current state? LLMs are neat, and scientists should keep working on them. If it weren't for all the nonsense "AI" hype we have currently, I'd expect to see them used rarely but quite successfully, since they'd be getting used on merit, not hype.
I did not always use an ad blocker. Frankly, my stance is they've made their bed; now they get to lie in it. If it weren't for YouTube's ad quality being so poor, I would never have bothered taking the 30 seconds it takes to install an ad blocker those 10 or so years ago.
Okay, so imagine for a second that somebody just invented voice-to-text, and everyone trying to sell it to you lies about it, claiming it can read your thoughts and that nobody will ever type things manually ever again.
The people trying to sell us LLMs lie about how they work and what they actually do. They generate text that looks like a human wrote it. That's all they do. There are some interesting attributes of this behavior, namely that when prompted with text that's a question, the LLM will usually end up generating text that ends up being an answer. The LLM doesn't understand any part of this process any better than your phone's autocorrect; it's just really good at generating text that looks like the stuff it's seen in training.

Depending on what exactly you want this thing to do, it can be extremely useful or a complete scam. Take, for example, code generation. By and large they can generate code mostly okay; I'd say they tend to be slightly worse than a competent human. Genuinely really impressive for what it is, but it's not revolutionary. Basically the only actual use case for this tech so far has been glorified autocomplete. It's kind of like NFTs or crypto at large: there is actual utility there, but nobody who's trying to sell the idea to you is actually involved in or cares about that part; they just want to trick you into becoming their new money printer.
Dunno off the top of my head. To take a wild guess, they might just wrap a file handle and give it a nice API? If that's what they do, then moving from the file zeroes out the handle for basically the same reason smart pointers set their internal pointer to nullptr: so they don't delete it (or close the file, in this case) underneath the new object.
What happens when objects are moved from depends on the object. Some objects are put into a valid moved-from state (usually depending on whether doing so is free or required anyway; for example, to satisfy the invariant of the moved-to unique_ptr, the moved-from pointer needs to be set to nullptr in order to prevent the moved-to unique_ptr's object being deleted out from underneath it).
I caught up, chapter 50. Um, so frankly put, this is just porn. I'm genuinely kind of invested now though, so fuck it, this is going on my trackers.
Ironically, you might have Restricted Mode enabled. Check your settings in YouTube and disable it if it is.