• Australis13@fedia.io · 15 days ago

    Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via=ihub

    It doesn’t look like these “bits” are binary bits, but rather “pieces of information” (which I find a bit misleading):

    “Quick, think of a thing… Now I’ll guess that thing by asking you yes/no questions.” The game “Twenty Questions” has been popular for centuries as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about a million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less.
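    The arithmetic behind that estimate is easy to check — a quick sketch (the two-second window below is an illustrative stand-in for the paper’s “few seconds”):

    ```python
    # Each well-designed yes/no question halves the space of possible
    # answers, so 20 questions distinguish 2**20 items.
    questions = 20
    items = 2 ** questions
    print(items)  # 1048576, i.e. about a million

    # 20 bits resolved over roughly 2 seconds gives the quoted rate.
    seconds = 2
    rate = questions / seconds
    print(rate)  # 10.0 bits/s
    ```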

    The authors do draw a distinction between the sensory processing and cognition/decision-making, at least:

    To reiterate: human behaviors, including motor function, perception, and cognition, operate at a speed limit of 10 bit/s. At the same time, single neurons can transmit information at that same rate or faster. Furthermore, some portions of our brain, such as the peripheral sensory regions, clearly process information dramatically faster.

  • Tramort@programming.dev · 15 days ago

        All information can be stored in digital form, and all information can be measured in base-2 units (bits).

    • Flying Squid@lemmy.world (OP) · 15 days ago

          But it isn’t stored that way and it isn’t processed that way. The preprint appears to give an equation (beyond my ability to understand) that explains how they came up with the figure.

      • Tramort@programming.dev · 15 days ago

            Your initial claim was that they couldn’t be measured that way. You’re right that they aren’t stored as bits, but that’s irrelevant to whether you can measure them using bits as the unit of information.

            Think of it like this: in the 1980s there were breathless articles about CD-ROM technology and how, in the future, “the entire Encyclopaedia Britannica could be stored on one disc”. How could anyone know that? Encyclopedias were not stored digitally! You can’t measure them in bits!

            It’s possible because you could define a hypothetical analog to digital encoder, and then quantify how many bits coming off that encoder would be needed to store the entire corpus.
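            A sketch of that kind of estimate, using the simplest possible “encoder” of one byte per character (the word count and average word length below are illustrative assumptions, not figures from the original articles):

            ```python
            # Back-of-envelope size of an encyclopedia in bits:
            # assumed word count and average word length, 8 bits/char.
            words = 40_000_000     # assumed order of magnitude
            chars_per_word = 6     # assumed average, incl. a space
            bits = words * chars_per_word * 8
            megabytes = bits / 8 / 1_000_000
            print(megabytes)  # 240.0 -- comfortably under a ~650 MB CD-ROM
            ```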

            This is the same thing. You can run anything through an ADC, and the spec of your ADC defines the bitrate you need to store the stream coming off it… in bits (per second).
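            CD audio is the classic example of this: its ADC spec (44.1 kHz sample rate, 16 bits per sample, stereo) fixes the bitrate directly:

            ```python
            # Bitrate = samples/second * bits/sample * channels.
            sample_rate = 44_100   # Hz (CD audio)
            bit_depth = 16         # bits per sample
            channels = 2           # stereo
            bitrate = sample_rate * bit_depth * channels
            print(bitrate)  # 1411200 bits/s
            ```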