

funny how everyone who wants to write a new browser (except the ladybird guys) always skimps on writing the actual browser part
in yes/no type questions, a 50% success rate is the absolute worst one can do. Any worse, and you’re just giving the inverted correct answer more than half the time
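a quick sanity check of that flipping argument, with toy numbers (a predictor that's right only 30% of the time):

```python
import random

random.seed(0)
truth = [random.choice([True, False]) for _ in range(10_000)]

# a predictor that answers correctly only ~30% of the time
preds = [t if random.random() < 0.3 else not t for t in truth]
acc = sum(p == t for p, t in zip(preds, truth)) / len(truth)

# invert every answer: for a binary question, accuracy becomes exactly 1 - acc
flipped_acc = sum((not p) == t for p, t in zip(preds, truth)) / len(truth)

assert acc < 0.5 < flipped_acc
assert abs(acc + flipped_acc - 1) < 1e-9
```

so anything below 50% is an above-50% predictor wearing a disguise, which is why 50% is the true floor.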
they are improving at an exponential rate. It’s just that the exponent is less than one.
got a pc with a good deal. First thing I did was electrically cut off all unnecessary leds
because it’s supposed to be usb. Which it’s not, intentionally
so? It was never advertised as intelligent and capable of solving any task other than that one.
Meanwhile slop generators are marketed as capable of doing a lot of things, reasoning included.
One claims to be good at chess. The other claims to be good at everything.
when I need to type a dangerous command, i prepend it with #, so it’s just a comment.
Only when I’m really sure do i go back to the start of the line and remove the #
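a minimal sketch of the trick, with a harmless echo standing in for the dangerous command:

```shell
# draft the command with a leading '#'; the shell treats it as a comment and runs nothing
#echo "rm -rf build"

# only once you're sure, go back to the start of the line and delete the '#'
echo "rm -rf build"
```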
“i don’t care about that. It was working and now it’s not” - the users
i bought an original cartridge and played it on the vcs i inherited from dad
i still enjoyed the crap out of it. Sometimes zoning out and just running around collecting stuff is just what I need.
localhost is “this device”.
connecting to localhost means connecting to something running on the same machine.
Browsers generally block connections to other domains (e.g. if you’re on google.com, the browser won’t simply let the site contact amazon.com willy-nilly).
But localhost is your own machine, so it is usually “trusted”. Facebook exploited this fact to exfiltrate data from the browser to the other apps running on your own phone, which would, in turn, be free to do with it as they please, because they’re not the browser
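a minimal sketch of why localhost is special, using nothing beyond the standard library. A stand-in "native app" listens on 127.0.0.1, and anything else on the same machine (a browser included) can hand it data; the identifier string is purely hypothetical:

```python
import socket
import threading

received = []

# stand-in for a native app listening on localhost, waiting for data
srv = socket.socket()
srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def accept_once():
    conn, _ = srv.accept()
    received.append(conn.recv(1024).decode())
    conn.close()
    srv.close()

t = threading.Thread(target=accept_once)
t.start()

# stand-in for the browser side: it's the same machine, so the
# connection to localhost is allowed
cli = socket.socket()
cli.connect(("127.0.0.1", port))
cli.sendall(b"browsing-identifier")  # hypothetical data being exfiltrated
cli.close()
t.join()

# the "app" now holds data that originated inside the "browser"
print(received)
```

once the data is in the native app's hands, none of the browser's cross-site protections apply to it anymore.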
he was forced to release it quickly to coincide with the film’s release. For comparison, it used to take a team of devs a couple of months to make a game. He had 6 weeks.
Also, if you read the manual, this essentially never happened to you. It was easy to avoid.
You also needed to read the manual. The game did stuff that other games at the time didn’t, for example, a contextual button. You couldn’t know what would happen unless you read the manual to learn what the icons meant. A lot of people never did and so decided that the game was bad.
when climbing out of the pit, it was very easy to immediately fall back down (due to the pixel-perfect collision detection).
And here is an excerpt from the manual: “Even experienced extraterrestrials sometimes have difficulty levitating out of wells. Start to levitate E.T. by first pressing the controller button and then pushing your Joystick forward. E.T.'s neck will stretch as he rises to the top of the well (see E.T. levitating in Figure 1). Just when he reaches the top of the well and the scene changes to the planet surface (see Figure 2), STOP! Do not try to keep moving up. Instead, move your Joystick right, left, or to the bottom. Do not try to move up, or E.T. might fall back into the well.”
it was actually way ahead of its time, for a game. One small bug (the workaround for which was in the manual) ruined its reputation. But I genuinely think it was a good game.
Also written in 6 weeks by one guy. Freaking impressive
I was 14 years old, and I got the 128meg stick for free. Beggars can’t be choosers haha
i started using linux on a single core pentium 4 with 384M of ram
I’m partial to this one
because the over 70 different binaries of systemd are “not modular”, apparently, because they are designed to work together. What makes a monolith, it seems, is the name of the overarching project, not it being a single binary (which, again, it’s not)
you wouldn’t be “freezing” anything. Each possible combination of input tokens maps to one output probability distribution. Those values are fixed and they are what they are whether you compute them or not, or when, or how many times.
Now you can either precompute the whole table (theory), or somehow compute each cell value every time you need it (practice). In either case, the resulting function (table lookup vs matrix multiplications) takes in only the context, and produces a probability distribution. And the mapping they generate is the same for all possible inputs. So they are the same function. A function can be implemented in multiple ways, but the implementation is not the function itself. The only difference between the two in this case is the implementation, or more specifically, whether you precompute a table or not. But the function itself is the same.
You are saying that your choice of implementation for that function will somehow change the function itself. Which means that, according to you, precomputing individual mappings (or caching them; full precomputation is just an infinite cache) magically makes some deep insight appear. It does not. We have already established that it is the same function.
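the table-vs-compute point in code form, with a toy deterministic "model" standing in for the real matrix multiplications:

```python
# a toy deterministic mapping from a context string to a "distribution"
def compute(context):
    # stand-in for the matrix multiplications; any fixed math would do
    h = sum(ord(c) for c in context) % 10
    return (h / 10, 1 - h / 10)

contexts = ["the cat", "the dog", "the cat sat"]

# implementation 1 (practice): recompute the value every time it's needed
def model_on_demand(ctx):
    return compute(ctx)

# implementation 2 (theory): precompute the whole table, then just look up
table = {ctx: compute(ctx) for ctx in contexts}
def model_lookup(ctx):
    return table[ctx]

# same input -> same output for every context: same function,
# two implementations; neither run gains any extra "insight"
for ctx in contexts:
    assert model_on_demand(ctx) == model_lookup(ctx)
```

swapping the lookup table for live computation changes cost and memory, never the mapping itself.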
imagine that to type one letter, you need to manually read all unicode code points several thousand times. When you’re done, you select one letter to type.
Then you start rereading all the unicode code points thousands of times over, for the next letter.
That’s how LLMs work. When they say 175 billion parameters, it means at least that many calculations for every token it generates