Thanks to @[email protected] for the links!
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
We don’t think in “bits” at all because our brain functions nothing like a computer. This entire premise is stupid.
Bit in this context refers to the Shannon from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits would be equivalent to the amount of information learned from observing an event with about a 0.1% chance of occurring. So 10 bits in this context is actually not that small of a number.
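As a quick illustration of that definition (my own sketch, not from the paper), the information content of an event with probability p is -log2(p):

```python
import math

def information_bits(p: float) -> float:
    """Shannon information content, in bits, of observing an event with probability p."""
    return -math.log2(p)

print(information_bits(0.5))     # 1.0 bit  -- a 50/50 event
print(information_bits(1/1024))  # 10.0 bits -- an event with roughly a 0.1% chance
```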
The paper gives specific numbers for specific contexts, too. It’s a helpful illustration for these concepts:
A 3x3 Rubik’s cube has about 4.3 × 10^19 (roughly 2^65) possible permutations, so the configuration of a Rubik’s cube carries about 65 bits of information. In the world record for blind solving, where the solver examines the cube, puts on a blindfold, and then solves it, the solver examined the cube for 5.5 seconds, so the 65 bits were acquired at a rate of about 11.8 bits/s.
Another memory contest has people memorizing strings of binary digits for 5 minutes and then trying to recall them. The world record is 1467 digits, which is exactly 1467 bits; dividing by 5 minutes (300 seconds) gives a rate of about 4.9 bits/s.
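For anyone who wants to check those figures, here is the rough arithmetic (the cube's state count is my own number, not quoted from the paper):

```python
import math

# Rubik's cube: ~4.3e19 legal configurations, examined for 5.5 s in the blindfolded record
cube_states = 43_252_003_274_489_856_000
cube_bits = math.log2(cube_states)      # ~65.2 bits to specify one configuration
print(cube_bits / 5.5)                  # ~11.9 bits/s, close to the 11.8 bits/s quoted

# Binary-digit memorization: 1467 digits = 1467 bits memorized over 5 minutes
print(1467 / 300)                       # ~4.9 bits/s
```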
The paper doesn’t talk about how the human brain is more optimized for some tasks than others, and I definitely believe that the brain’s capacity for visual processing, probably assisted by the preprocessing that happens subconsciously (the direct perception of visual information), is much more efficient and capable than plain memorization. So I’m still skeptical of a blanket 10 bits/s rate for all types of thinking, but I can see how they got the number.
10 shannons, that is, 10 bits, each corresponding to an event with 50% probability, would be equivalent to the amount of information gained from observing an event with a 1/1024 chance of occurring, not 1/10. That’s because the underlying probabilities combine multiplicatively, so the bits add. The Wikipedia article mentions that if there are 8 possible events with equal probability, the information content is 3 shannons.
Right, 1/1024 is 0.0009765625 or about 0.1%.
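A quick way to see how the bits add while the probabilities multiply (my own illustration):

```python
import math

# Ten independent 50/50 events: probabilities multiply down to 1/1024, bits add up to 10.
print(-math.log2(0.5 ** 10))  # 10.0 bits
# One of 8 equally likely outcomes: 3 bits, as the Wikipedia example says.
print(-math.log2(1 / 8))      # 3.0 bits
```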
Also, supposing it did, I’m quite sure that everyone’s brain would function at a different rate. And how do you even measure people who don’t have an internal monologue? Seems like there is a lot missing here.
It’s an average. The difference between two humans will be much less than the difference between humans and machines.
ITT: A bunch of people who have never heard of information theory suddenly have very strong feelings about it.
Bullshit. Just reading this and comprehending it, which is thought, far exceeds 10 bits per second.
Speaking, which is conveying thought, also far exceeds 10 bits per second. This piece is garbage.
Speaking, which is conveying thought, also far exceeds 10 bits per second.
There was a 2019 study that analyzed 17 different spoken languages and found that languages with a lower complexity rate (bits of information per syllable) tend to be spoken faster, such that the information rate is roughly the same across spoken languages, at roughly 39 bits per second.
Of course, it could be that the actual ideas and information in that speech are inefficiently encoded, so that the actual bits of entropy are being communicated at less than 39 per second. I’m curious what the linked Caltech paper says about language processing, since the press release describes deriving the 10 bits from studies of how people read and write (as well as studies of people playing video games or solving Rubik’s cubes). Are they including the additional overhead of processing that information into new knowledge or insights? Are they defining the entropy of human language with a higher implied compression ratio?
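The study’s basic accounting is (bits per syllable) × (syllables per second). Here’s an illustrative sketch; the per-language figures are mine, chosen only to show how the product can land near 39 bits/s, and are not the study’s actual numbers:

```python
# Illustrative only: denser languages are spoken more slowly, sparser ones faster,
# so the product (bits/syllable * syllables/s) lands in the same neighborhood.
languages = {
    "denser, slower-spoken": {"bits_per_syllable": 7.1, "syllables_per_sec": 5.5},
    "sparser, faster-spoken": {"bits_per_syllable": 5.0, "syllables_per_sec": 7.8},
}
for name, lang in languages.items():
    rate = lang["bits_per_syllable"] * lang["syllables_per_sec"]
    print(f"{name}: ~{rate:.0f} bits/s")  # both come out near ~39 bits/s
```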
EDIT: I read the preprint (linked above). It purports to measure the externally measurable output of human behavior. That’s an important limitation: it’s not trying to measure the internal richness of unobserved thought.
So it analyzes people performing external tasks, including typing and speech with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.
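The word-rate conversion is straightforward; here’s a quick sketch assuming the paper’s roughly 5 bits of entropy per English word:

```python
def bits_per_second(words_per_minute: float, bits_per_word: float = 5.0) -> float:
    """Convert a words-per-minute rate to bits/s, given an assumed entropy per word."""
    return words_per_minute * bits_per_word / 60

print(bits_per_second(120))  # 10.0 bits/s for 120 wpm typing
print(bits_per_second(160))  # ~13.3 bits/s for 160 wpm speech
```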
The calculated bits of information are especially interesting for the other tasks (blindfolded Rubik’s cube solving, memory contests).
It also explicitly cites the 39 bits/s study that I linked as being within the general range, because the actual meat of the paper is analyzing how the human brain brings ~10^9 bits/s of sensory input down to ~10 bits/s, a reduction of about 8 orders of magnitude. If it turns out to be 7.5 orders of magnitude, that doesn’t really change the result.
There’s also a whole section addressing criticisms of the 10 bits/s number. It argues that claims of photographic memory tend to break down into longer periods of study (e.g., a 45-minute flyover of Rome to recognize and recreate 1000 buildings in 1000 architectural styles translates to about 4 bits/s of memorization). It also argues that the human brain tends to trick itself into perceiving much higher complexity than it is actually processing (known as “subjective inflation”), implicitly arguing that a lot of that is lossy compression which fills in fake details consistent with the portions actually perceived, and that the observed bitrate from other experiments might not properly account for the bits of entropy involved in the less accurate shortcuts the brain takes.
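For the Rome example, here’s a back-of-the-envelope reconstruction of how a ~4 bits/s figure could arise. The accounting (one choice out of ~1000 styles per building) is my reading of the description above, not a quote of the paper’s exact calculation:

```python
import math

# ~1000 buildings, each drawn from ~1000 recognizable styles, memorized over 45 minutes.
buildings = 1000
styles = 1000
seconds = 45 * 60
total_bits = buildings * math.log2(styles)  # ~9,966 bits of memorized detail
print(total_bits / seconds)                 # ~3.7 bits/s, roughly the 4 bits/s cited
```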
I still think visual processing seems to be faster than 10, but I’m now persuaded that it’s within an order of magnitude.
How are you measuring it?
That doesn’t really matter, because 1 bit is merely distinguishing between 1 and 0, or some other two-way choice.
Just reading a single word, you pick that word out from among the roughly 30,000 words you know. That’s about 15 bits of information comprehended (rough arithmetic below).
Don’t tell me you take more than 1.5 seconds to read and comprehend one word. Without having it as text, free thought is CLEARLY much faster, and the complexity of abstract thinking would move the number way up.
One thought is not 1 bit; it can be thousands of bits. BTW, the mind has insane levels of compression. For instance, if you think “bicycle,” it’s a concept that covers many parts. You don’t have to think about every part: you know it has a handlebar, frame, pedals, and wheels. You also know its purpose, its size, its weight, its range of speed, and many other more or less relevant details. Just thinking “bicycle” is easily way more than 10 bits’ worth of information, but it gets “compressed” to only the parts relevant to the context.
Reading and understanding one word is not just recognizing the word, but also understanding a concept and putting it into context. I’m not sure how to quantify that, but quantifying it as 1 bit is so horrendously wrong that I find it hard to understand how this can in any way be considered scientific.
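For reference, here’s the vocabulary arithmetic from the comment above (my own sketch). The gap between ~15 bits for an isolated, equally likely word and the ~5 bits per word the paper assumes reflects how predictable words are in context:

```python
import math

# Picking one word out of a ~30,000-word vocabulary carries about log2(30000) bits,
# if every word were equally likely; real text is far more predictable than that.
vocabulary_size = 30_000
print(math.log2(vocabulary_size))  # ~14.9 bits per isolated word
```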
You are confusing input with throughput. They agree that the input is much greater. It’s the throughput that is so slow. Here’s the abstract:
This article is about the neural conundrum behind the slowness of human behavior. The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at ∼10^9 bits/s. The stark contrast between these numbers remains unexplained and touches on fundamental aspects of brain function: what neural substrate sets this speed limit on the pace of our existence? Why does the brain need billions of neurons to process 10 bits/s? Why can we only think about one thing at a time? The brain seems to operate in two distinct modes: the “outer” brain handles fast high-dimensional sensory and motor signals, whereas the “inner” brain processes the reduced few bits needed to control behavior. Plausible explanations exist for the large neuron numbers in the outer brain, but not for the inner brain, and we propose new research directions to remedy this.
He’s not.
Executive function has limited capacity, but executive function isn’t your brain (and there’s no reasonable definition that limits it to anything as absurd as 10 bits). Your visual center is processing all those bits that enter the eyes. All the time. You don’t retain all of it, but retaining any of it necessarily requires processing a huge chunk of it.
Literally just understanding the concept of car when you see one is much more than 10 bits of information.
I think we are all speaking without being able to read the paper (and in my case, I know I wouldn’t understand it), so dismissing it outright without knowing how they define or measure things is not really the best course here.
I would suggest that Caltech studies don’t tend to be poorly-done.
There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.
There are hundreds of systems in your brain that are actively processing many, many orders of magnitude more than ten bits of information per second all the time. We can literally watch them do so.
It’s possible the headline is a lie by someone who doesn’t understand the research. It’s not remotely within the realm of plausibility that it resembles reality in any way.
There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.
That is quite the claim from someone who has apparently not even read the abstract of the paper. I pasted it in the thread.
You are confusing input with throughput.
No, I’m not; I read that part. Input is, for instance, hearing a sound wave, which the brain can process at amazing speed, separating a multitude of simultaneous sounds and translating them into meaningful information, be it music, speech, or a noise that shouldn’t be there. It’s true that this part is easier to measure, as we can do something similar, although not nearly as well, on computers. We can determine not only the content of sounds but also extrapolate from them in real time. The sound itself may only be about 2 × 22k bits, but the processing required is way higher. And that’s even more obviously way, way above 10 bits per second.
This is a very complex function that requires loads of processing, and it can distinguish, with microsecond precision, when a sound reaches each ear in order to determine direction.
The same is the case with vision, which, although not at all the resolution we think it is, requires massive processing to interpret into something meaningful. Now the weird thing is: why in the world do they think consciousness, which is even MORE complex, should operate at a lower speed? That idea is outright moronic!!!
Edit:
Changed nanosecond to microsecond.
As I suggested to someone else, without any of us actually reading the paper, and I know I do not have the requisite knowledge to understand it if I did, dismissing it with words like “moronic” is not warranted. And as I also suggested, I don’t think such a word can generally be applied to Caltech studies. They have a pretty solid reputation as far as I know.
I’m not fucking reading a paper with such ridiculous claims. I gave it a chance, but it simply isn’t worth it. And I understand their claims and argumentation perfectly. They simply don’t have a clue about the things they make claims about.
I’ve been investigating and researching these issues for 40 years with an approach grounded in scientific evidence, so please piss off with your claims of me not understanding it.
Without evaluating the data or methodology, I would say that the chance you gave it was not a fair one. Especially since you decided to label it “moronic.” That’s quite a claim.
Right? They do nothing to expand upon what this plainly wrong claim is supposed to actually mean. Goddamn scientists need a marketing department of their own, because the media sphere in general sells their goods however the fuck they feel like packaging them.
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
Because it’s a Techspot article, of course they deliberately confuse you as to what “bit” means to get views. https://en.wikipedia.org/wiki/Entropy_(information_theory) seems like a good introduction to what “bit” actually means.
Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via=ihub
It doesn’t look like these “bits” are binary, but “pieces of information” (which I find a bit misleading):
“Quick, think of a thing… Now I’ll guess that thing by asking you yes/no questions.” The game “Twenty Questions” has been popular for centuries as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about 2^20 ≈ 1 million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less.
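The arithmetic behind that quoted passage, sketched out (taking the “few seconds” as roughly 2 seconds is my assumption, purely for illustration):

```python
# Twenty well-designed yes/no questions distinguish 2**20 possibilities.
questions = 20
distinguishable_items = 2 ** questions
print(distinguishable_items)  # 1,048,576 -- about a million things the thinker could have picked
print(questions / 2)          # 10 bits/s if retrieving the thing took ~2 seconds; longer means less
```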
The authors do draw a distinction between the sensory processing and cognition/decision-making, at least:
To reiterate: human behaviors, including motor function, perception, and cognition, operate at a speed limit of 10 bit/s. At the same time, single neurons can transmit information at that same rate or faster. Furthermore, some portions of our brain, such as the peripheral sensory regions, clearly process information dramatically faster.
But our brains are not digital, so they cannot be measured in binary bits.
All information can be stored in a digital form, and all information can be measured in base 2 units (of bits).
But it isn’t stored that way and it isn’t processed that way. The preprint appears to give an equation (beyond my ability to understand) which explains how they came up with it.
Your initial claim was that they couldn’t be measured that way. You’re right that they aren’t stored as bits, but it’s irrelevant to whether you can measure them using bits as the unit of information size.
Think of it like this: in the 1980s there were breathless articles about CD-ROM technology and how, in the future, “the entire Encyclopaedia Britannica could be stored on one disc.” How was that possible to know? Encyclopedias were not digitally stored! You can’t measure them in bits!
It’s possible because you could define a hypothetical analog to digital encoder, and then quantify how many bits coming off that encoder would be needed to store the entire corpus.
This is the same thing. You can ADC anything, and the spec on your ADC defines the bitrate you need to store the stream coming off it… in bits per second.
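A minimal sketch of that point, using CD-style audio parameters purely as an example (the parameters are mine, for illustration):

```python
# Once you pick a sample rate, bit depth, and channel count, any analog signal
# has a well-defined storage bitrate for the stream coming off the ADC.
sample_rate_hz = 44_100
bits_per_sample = 16
channels = 2
bitrate = sample_rate_hz * bits_per_sample * channels
print(bitrate)  # 1,411,200 bits/s of raw digitized audio
```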
Caltech article: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
The full text of the paper costs $35 to read once.
“Look, I made a really exciting, controversial discovery! It’s really emotional and intriguing! You’re missing out! Only smart rich people can read it! Put your money in the basket please :)” Our education system is dead and the populace is too stupid to care.
The educational system isn’t setting the prices. The publishers are separate private enterprises which are mostly profit-driven.
In the last 20 years, “open access” journals have been created where the author (really the author’s grant money, mostly from the government) pays the charges instead of the readers. That has led to a whole slew of other problems, including predatory and phony journals springing up.