Boffins ponder paltry brain data rate of 10 bits per second

Caltech researchers have estimated the speed of human thought to be a mere 10 bits per second, a data rate so leisurely that it underscores the need for further research into brain function and calls into question claims about brain-computer interfaces and artificial intelligence. In a paper [PDF] titled, "The Unbearable …

  1. b0llchit Silver badge
    Meh

    "Because we could engage in any one of the 2^10 possible actions or thoughts in the next second,...

    And therein lies the crux of the story.

    It is not about the speed of sequentially processed data. It is about the next state, which is determined in parallel. You do not need to do many, many fast serial calculations. You simply do a very slow parallel one to get to the same conclusion. It is probably more energy efficient the way it works, maybe not optimal, but probably better than other topologies.

    Though, I guess there are somewhat more than the 1024 (2^10) next states suggested. The brain is also likely not a purely parallel binary-choice mechanism, as there is a temporal component too.

    1. Persona Silver badge

      Yes, it's a hugely parallel process. Human brains can take in a complex visual scene and identify features and faces in an instant. I could believe a frame rate of 10 frames a second maxes this out, but it's processing huge amounts of data in parallel to achieve this.

      1. Pascal Monett Silver badge

        10 frames per second? I seem to recall that films are projected at 25 frames per second, and I haven't heard anyone complain about that.

        1. JamesTGrant Bronze badge

          Ability to absorb complex information from images tops out at 10 images a second (if you are really concentrating hard). Ability to distinguish between separate images and continuous motion is dependent on which part of the eye you are looking with, and how bright the image is (among other things). Film (35mm) is/was 24fps but each frame is shown twice or three times (hence the shutter on a cinema projector) to reduce the ‘flicker’ you’d otherwise notice. (Flicker, not motion judder…)

          Flicker is still noticeable for many (not all) at 50 images per second (even if the images are duplicated). The vast majority of people can not detect flicker at 75 images per second when viewing head-on. All of this assumes a duty cycle where the image is displayed very quickly after the previous image.

          Note - this isn’t about motion judder, it’s about flicker.

          1. anonymous boring coward Silver badge

            Perhaps..

            But the ability to pick the right 10 images per second to process would rely on the actual frame rate being much, much higher.

            Just ask any FPS gamer, or virtual racing driver, or virtual combat flightsimer. In real life, of course, the brain would pick different things from different areas of sight at overlapping times, so no full-frame FPS can accurately be used to find the limits of this processing.

            Just like our hearing is vastly more advanced than can be deduced from simple experimentation.

      2. Helcat Silver badge

        Well... not quite...

        The brain cheats. It parses some elements of the scene and uses pattern recognition as part of the process to guess what it's seeing. Mostly it's looking for friendly and hostile shapes, which is why you might scan a crowd and think you recognise someone, only to then realise they look similar-but-not-really-that-close to who you thought you saw. It's also why we see things that aren't there: we catch enough of a shape to suggest an object, and the brain fills in the blanks as a guess - and gets it wrong.

        It's the reason why drivers miss cyclists and motorbikes at junctions: Vertical objects aren't a threat to our primitive minds: They're trees or lamp-posts, so we don't 'see' them at first. Hence SMIDSY (sorry mate, I didn't see you). The eye did, but the brain didn't identify the approaching object. Hence 'Look once, look twice, look bike': More chance the brain will realise objects have changed and suddenly it's paying attention.

        So no, it's not really processing huge amounts of data: it's processing some of the data and using pre-processed images to fill in the blanks, in the hope of spotting the predator that is creeping up on us before it pounces, or the prey we're hunting that is trying to hide. It's the intuitive leap that helps survival in dangerous situations. Only it does get things wrong. Sometimes disastrously wrong.

        Like I said: it cheats.

      3. TRT

        They did say that was the "inner" brain and not the "outer" sensory processing bit. I was about to get very shouty until I read that bit. They appear to be talking about the speed of consciousness, often thought to be modulated on a 40Hz synchronisation wave. But if consciousness resided in just a few bits of the brain, then yes, you would not need a very high bandwidth interface at all. The problem is that consciousness resides simultaneously in about 500 different brain regions (I'm guessing - it's at least 12 that I know of from my work in a brain lab, and we were only looking at one sensory modality); the "fingers" that Elon is talking about as an interface (coupled with speech, I suppose, and other subliminal means of communication) integrate over all these parallel processing consciousnesses very efficiently.

  2. Jou (Mxyzptlk) Silver badge

    "listening comprehension in English (13 bits/s)" Huh?

    How exactly do they measure that listening to English, in my case a second language, is only 13 bits per second? How much would a typical movie have to be slowed down to speak this slowly?

    The 40 bits/s later mentioned in the article sounds more likely, if syllables can be interpreted as one byte each.

    On the other hand: "binary digit memorization (4.9 bits/s)" huh what? No, I am not that good at memorizing arbitrary numbers and have to write them down if they are important. Or copy-paste them. Only a few really important numbers are actually memorized in my brain; everything else has a TTL of a second or a few at most :D.

    1. katrinab Silver badge
      Meh

      Re: "listening comprehension in English (13 bits/s)" Huh?

      I'm pretty sure there isn't an ADC module in our brain, so talking about bits per second is surely pretty meaningless?

      We do convert the sound waves into some sort of underlying meaning, and while you could represent that in binary, I don't think that is necessarily what actually happens.

      1. Jou (Mxyzptlk) Silver badge

        Re: "listening comprehension in English (13 bits/s)" Huh?

        Which is exactly my point! A completely out-of-whack measurement. It will probably be on Sabine Hossenfelder's next YouTube video: "A study so stupid it makes all 'new quantum physics' suddenly look good! Even string theory is better than that! Even my haircut looks better, which is quite a stretch!"

      2. I am David Jones Silver badge

        Re: "listening comprehension in English (13 bits/s)" Huh?

        In this context a bit is a unit of information content and not literally a digital bit as used in computing. For example, the words “heads” and “tails” contain only one bit of information each when referring to a tossed coin, irrespective of how many computing bits might be used to store those words as strings.
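        A quick way to see the distinction: in information theory the "bit content" of an outcome is -log2 of its probability, regardless of how many storage bits the label for that outcome occupies. A minimal sketch (the probabilities are illustrative):

```python
import math

def information_bits(p: float) -> float:
    """Shannon self-information of an outcome that occurs with probability p."""
    return -math.log2(p)

# A fair coin toss: "heads" carries exactly one bit of information...
fair = information_bits(0.5)

# ...even though storing the string "heads" takes 40 bits of ASCII.
storage = len("heads".encode("ascii")) * 8

print(fair, storage)
```

        The same word can carry a different amount of information in a different context: if "heads" came up only one time in four, each occurrence would carry -log2(0.25) = 2 bits.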

    2. Bebu sa Ware
      Windows

      Re: "listening comprehension in English (13 bits/s)" Huh?

      I would imagine that what is meant is that thirteen bits of information are conveyed every second to the listener.

      You might imagine an experiment where scripts of varying length and information density were read to test subjects, with the subjects subsequently tested on the content for comprehension and retention. Devilishly difficult to control confounding factors, I would think.

      The information density and the usual rate of verbal delivery does vary between languages, I believe. I imagine the maximum rate of listening comprehension for all languages would approach something well under 20 bits/s.

    3. O'Reg Inalsin

      Re: "listening comprehension in English (13 bits/s)" Huh?

      Paraphrasing from an elsewhere-reported version of this story: typists typing words of random letters are extremely slow compared to when they type natural language, the difference resulting from natural language being so predictable. The information is denser in the case of words with random letters, because there is no predictable structure. Another way to look at it is that the time it takes a human typist to type a document depends more upon the size of the zip++ file that could contain it than upon its expanded length. (Zip++ because language complexity is more than the number of unique words.)

      That also explains the characteristically long-winded and flowery language - padded with every possible platitude, while devoid of originality - seen in the output of ChatGPT when talking of weighty but soft human issues.
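      The compression analogy is easy to demonstrate: a general-purpose compressor squeezes predictable natural language far more than random letters of the same length. A rough sketch, with zlib standing in for the hypothetical zip++:

```python
import random
import zlib

# Highly predictable natural-language text...
sentence = b"the quick brown fox jumps over the lazy dog " * 10

# ...versus the same number of random letters and spaces.
random.seed(0)
noise = bytes(random.choice(b"abcdefghijklmnopqrstuvwxyz ")
              for _ in range(len(sentence)))

# The predictable text compresses to a fraction of its size;
# the random letters barely compress at all.
print(len(sentence), len(zlib.compress(sentence)), len(zlib.compress(noise)))
```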

      1. anonymous boring coward Silver badge

        Re: "listening comprehension in English (13 bits/s)" Huh?

        Well, "predictable" may not be the best description. We have learned patterns, so we can quickly type words we have seen before. The last sentence, for example, took me about 5 seconds to write. I didn't predict anything. I used all kinds of memory and some learned reflexes for finger movement. The brain can outsource some repetitive jobs to faster processing centres. The higher brain functions double-check this after the fact (unless overloaded with other tasks, which is one reason we have checklists for things like aircraft piloting).

    4. SnailyFresh

      Re: "listening comprehension in English (13 bits/s)" Huh?

      That's the crux of any measurement: "The text of English at speaking pace is X bits per second," especially if 'compressed' (that sentence has only 18 words, and the 2nd and 7th are the same).

      But the listening relays other information. If you listened to me read the line above, that would tell you about my body, my childhood, and my annoyance at data engineers who take a single measurement to cut a whole house plan.

      The difference between "1 2 3 4 5 6 2 7 4 8 9 10 11 12 13 14 15 16 17 18" and my voice speaking the sentence is a lot more than 40 bits. The speaker is male, American, middle seaboard, annoyed, and eating a peanut butter sandwich. But we can call these categories 2, 1, 4, 36, and 24-18 and the data engineer will call it 3 bytes.

      1. I ain't Spartacus Gold badge

        Re: "listening comprehension in English (13 bits/s)" Huh?

        Plus, at the same time as comprehending the incoming sentence you may also be thinking about how it's getting a bit warm in here, should I remove my jumper, open a window or both - perhaps while asking the speaker if they also have an opinion on the temperature. While also planning what you might say in response to them. Plus maybe thinking about anything you were planning to say to them but haven't got round to (or found the correct moment) yet. Then adding in the background processes like them mentioning the word banana, which reminds you to add bananas to your shopping list. At the same time you are also simultaneously processing a whole load of visual data and making determinations about that - such as you might also be driving a car while holding this conversation.

        The comment on trying to restore vision also suggests that the authors don't seem to get the complexity of what's going on. You can't adequately describe even a simple environment like the office I'm sitting in as fast as my brain can process it visually. OK, that's obvious. But that means my brain is able to process this data at faster than 10b/s - and even though we're accepting that all the ancillary parts of the brain are pre-processing most of this data at much faster rates - still the information stream coming to my central brain thinking machine bit is clearly delivering data at a faster rate than 10b/s, even at the same time that I'm taking in audio cues - typing a comment online and thinking about the job I have to do next - and deciding whether it's time to make a run to the kettle before doing it.

        And that's even ignoring the weird sub-conscious processing thing we can do where you can be solving a cryptic crossword clue (or engineering problem) with literally no plan to do so whatever. The solution just randomly pops into your head, hours (sometimes days) after you last consciously thought about it. I admit I often do this deliberately - but my subconscious doesn't take orders too well - as it will sometimes ship me answers to problems I wasn't even trying to solve a few hours after the fact.

        I'm guessing the outer bits of the brain doing the higher-speed work might also account for things like music? I used to know a pianist who could sight-read the melody line of a piece but would often improvise the harmonies in the left hand - if the written piece wasn't complex enough for his tastes. I have had conversations with him while he was both sight-reading and improvising. He was a vastly better pianist than me - so maybe he'd got sight-reading down to some kind of automatic process.

      2. Francis Boyle

        So what's the data rate of poetry?

        I'm also guessing there might be a bit of a data difference between a "Cat sat on the mat" sort of sentence and say a line from a Shakespearean sonnet.

    5. dvhamme

      Re: "listening comprehension in English (13 bits/s)" Huh?

      The thing is, the brain is like an encoder-decoder. You observe a huge bitrate of information, this is analyzed into patterns and semantics through sort of a processing funnel, which extracts the meaning you need to make decisions. After making a decision, forming an action as it were, this action is then again upsampled or constructed into a much higher bitrate, e.g. the motor control units of all your muscles. But what this research tries to establish, is the throughput of the bottleneck in the middle where decisions are made.

      I still disagree with the paper. We may be unable to find any single task that cannot be summarized into about 10bit/s of semantics, but that only proves that for THIS task you could train a dedicated encoder-decoder that has a 10bit/s bottleneck. Our brain can do a huge number of different tasks, including new and unpracticed ones, so I expect (without being an expert at biology, just a computer scientist) we have something of a foundation model in our brain that extracts kilobits or even megabits per second of feature space, on which we make decisions in a process that selects relevant information from this oversized all-tasks semantic world view. You could see it as having a whole battery of different 10bit/s encoders, with the ability to recruit neurons for entirely new encoders (learning a new task) without losing the old ones. Our actions are formed on a limited set of these encoders at any given time.

      You could probably more fairly say that we have a 10bit/s attention capability. In my opinion that attention shifts constantly and hence there is no single 10bit/s part of the brain that ALL information goes through, and that a brain-machine interface could tap into.

  3. Paul Johnston

    Love the title

    Obviously a fan of Kundera.

  4. Neil Barnes Silver badge
    Coat

    we predict that Musk’s brain will communicate with the computer at about 10 bits/s,

    Seems a generous estimate, given his reported output on social media...

    1. Ropewash

      Re: we predict that Musk’s brain will communicate with the computer at about 10 bits/s,

      We're talking quantity here. Quality is a whole other topic.

      1. blu3b3rry
        Devil

        Re: we predict that Musk’s brain will communicate with the computer at about 10 bits/s,

        It sounds like it would be cheaper to replace Musk with an LLM.

    2. spold Silver badge

      Re: we predict that Musk’s brain will communicate with the computer at about 10 bits/s,

      I've communicated with people (in some fashion) who I believe operate at much less.

    3. Anonymous Coward
      Anonymous Coward

      Re: we predict that Musk’s brain will communicate with the computer at about 10 bits/s,

      Mr Barnes,

      Please prepare to get banned on Twitter. Elongated Muskrat (or Maximus something as he was reported to have started calling himself) does not like any negative crit of him at all.

      You can join the likes of Laura Loomer who have been banned by his excellency recently.

      He can't ban me from his thing because I have never even seen a Twitter login screen.

      Welcome to the world of free speech, as long as said speech does not mention the lack of clothes that Emperor Musk is or is not wearing.

  5. Pete 2 Silver badge

    Thought experiment.

    Count the number of words in this article. Decide how many bits are required to encode each word. Calculate the amount of time it would take to assimilate the article's bit content at 10 bps.

    Compare that number with the time it took to read.

    Consider whether 10bps stands up to an empirical analysis.
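    Running the numbers (the word count and bits-per-word figures are assumptions for illustration):

```python
# Back-of-envelope version of the thought experiment above.
words = 600            # assumed length of a short article
bits_per_word = 15     # assumed information content per word
rate_bps = 10          # the paper's claimed rate of thought

total_bits = words * bits_per_word
seconds = total_bits / rate_bps

# 9000 bits at 10 bits/s is 900 seconds, i.e. 15 minutes -
# versus roughly 3 minutes to actually read 600 words at 200 wpm.
print(total_bits, seconds, seconds / 60)
```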

    1. This post has been deleted by its author

    2. katrinab Silver badge

      Re: Thought experiment.

      It would be about 15 bits per word, but I don't think we always encode words individually, sometimes we might encode a group of words together, and the context the words are used in will affect how we encode them.

      Also, if you just encode the words, you ignore punctuation. Some of the "words" we use work in the same way as punctuation in giving meaning to the sentence.
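      The "about 15 bits per word" figure is what you get if a word is a uniform pick from a typical vocabulary: choosing one item out of N equally likely items costs log2(N) bits. A sketch (vocabulary sizes are illustrative):

```python
import math

# Bits needed to pick one word uniformly from vocabularies of various sizes.
for vocab_size in (10_000, 32_768, 50_000):
    print(vocab_size, math.log2(vocab_size))

# 2**15 = 32768, so a ~30k-word vocabulary lands near 15 bits per word.
# Real text carries less per word, because context makes most words far
# from equally likely - the point above about encoding groups of words.
```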

      1. Will Godfrey Silver badge
        Happy

        Re: Thought experiment.

        We most sentences the have ability to understand also the with of words in the wrong order.

        1. Anonymous Coward
          Anonymous Coward

          Re: Thought experiment.

          We most sentences the have ability to understand also the with of words in the wrong order.

          This exactly. I read(*) the post that you're replying to by looking at three chunks for long enough to grok the meaning of each in context. I certainly didn't scan each word separately with distinct eye movements for each word or shorter cluster. It's amazing the kind of (pre-) processing that the brain does right from the visual cortex up until we consciously sense what it is that we're reading.

          (*probably "speed-read" is more accurate, but that's associated with techniques that are taught/learned... whereas this is just something that naturally happens when we just give our brains enough reading material to work with ... probably due to neuroplasticity)

    3. Jou (Mxyzptlk) Silver badge

      Re: Thought experiment.

      Obviously drunk snail reading speed :D.

    4. E 2

      Re: Thought experiment.

      Merely encoding the words is insufficient. Now quantify processing them individually, then in sentences, then deriving meaning from them.

      The research is idiotic.

      1. cyberdemon Silver badge
        Unhappy

        > The research is idiotic.

        Indeed. It grabbed the headlines, though!

        Academic "rage-baiting"

      2. anonymous boring coward Silver badge

        Re: Thought experiment.

        Perfect for YouTube: "You won't BELIEVE this ONE thing.."

    5. Jason Bloomberg Silver badge

      Re: Thought experiment.

      Count the number of words in this article. Decide how many bits are required to encode each word

      Remembering that words contain huge redundancy, that most vowels can be rmvd nd th mnng f sntncs cn stll b frly qkly dtrmnd.

      Reading that slows us down because we have to expand it first. The brain could be compressing even further into phonemes and what not, maybe a simple shape and colour for want of a better analogy.

      Slow movement of massively compressed parallel info, with some amazing 'magic' on top, seems a reasonable notion to me.

      Most westerners know Bohemian Rhapsody word and note perfect (near enough) - I'm wondering how much collective brain that has filled up?

      1. Neil Barnes Silver badge
        Headmaster

        Re: Thought experiment.

        most vowels can be rmvd nd th mnng f sntncs cn stll b frly qkly dtrmnd.

        Yes, but, didn't Shannon point out that by removing redundancy you increase the rate of non-recoverable data loss? Certainly there are languages which encode alphabetically but don't use the vowels, and arguments over the correct vowel to use in ancient texts continue for centuries... We might also consider the metadata in text, particularly whitespace and punctuation. We might not encode that internally, but it certainly provides context for what we're reading. Your example is probably much harder to read when the spaces are also removed: rmvdndthmnngfsntncscnstllbfrlyqklydtrmnd.

        (Of which, if I recall correctly, spaces in text were a fairly recent invention. LookatsomeLatininscriptions.)
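        Both halves of the example are a couple of lines of code to reproduce - first dropping the vowels, then the whitespace "metadata" as well:

```python
def strip_vowels(text: str) -> str:
    """Drop vowels, keeping consonants, spaces and punctuation."""
    return "".join(c for c in text if c.lower() not in "aeiou")

s = "most vowels can be removed and the meaning can still be determined"
no_vowels = strip_vowels(s)
no_vowels_no_spaces = no_vowels.replace(" ", "")

print(no_vowels)            # still fairly readable
print(no_vowels_no_spaces)  # much harder once the spaces are gone
```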

        1. DJO Silver badge

          Re: Thought experiment.

          I'm reasonably certain the brain does not use ASCII to encode text. I'm also reasonably sure it uses what we would call tokens for words and not a sequence of letters - you recall "the", not "T", "H" & "E".

          And of course the brain is not a digital computer, it's analogue with more than 2 states for a neuron so comparisons are in most cases both meaningless and misleading.

      2. David Hicklin Silver badge

        Re: Thought experiment.

        And what about all the background processes going on?

        Like now I am reading all the comments, scrolling down the page as I go as well as eating lunch (OK, bit sliced with the mouse movement), not to mention any random crossing or uncrossing of your legs/feet etc?

        A lot of this is probably standard routines handled by the outer, faster bits, but the central decision-making part that decides what to do - like whether I should scroll up or down the page - also needs your eye input to decode what you are seeing.

        I think the degree of parallelism must be huge

        1. ArrZarr Silver badge
          Happy

          Re: Thought experiment.

          It's about the userspace of the brain, rather than the kernelspace that we're not allowed to futz with. With hard work, you can mess with some kernelspace attributes (training into or out of reactions, for e.g.), but there are things you just are not permitted to do - or would you rather have to remember to keep your heart beating while asleep? ;)

        2. DJO Silver badge

          Re: Thought experiment.

          Wrong model. The brain does not contain discrete processing units that do all the heavy lifting like a computer has. Instead the "processing" is done where it's needed (I suppose to keep the latency down) so a better model would be "distributed computing" instead of "parallel processing".

          There's also a lot of autonomous systems happily gurgling away and most of them are not fully under the control of the brain, the nervous system is more than just organic cat5, there's a lot of local processing too.

    6. DS999 Silver badge

      We have built in autocorrect while reading

      Aoccdrnig to a rscheearch at Cmabrigde Uinervtisy, it deosn’t mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer be at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe.

      I can read that at nearly normal speed, YMMV.
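      The trick in that paragraph is mechanical enough to reproduce - shuffle each word's interior letters while pinning the first and last in place (punctuation handling omitted for brevity):

```python
import random

def scramble(word: str) -> str:
    """Shuffle a word's interior letters; the first and last stay put."""
    if len(word) < 4:
        return word  # nothing to shuffle
    inner = list(word[1:-1])
    random.shuffle(inner)
    return word[0] + "".join(inner) + word[-1]

random.seed(42)
text = "the human mind does not read every letter by itself but the word as a whole"
print(" ".join(scramble(word) for word in text.split()))
```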

      1. I am David Jones Silver badge
        Trollface

        Re: We have built in autocorrect while reading

        Bet you couldn’t type it at nearly normal speed though!

      2. anonymous boring coward Silver badge

        Re: We have built in autocorrect while reading

        Yup, wasn't hard to read.

      3. ArrZarr Silver badge

        Re: We have built in autocorrect while reading

        Accordion to a recent study, 90% of people don't notice when a word in a sentence has been replaced with a musical instrument.

        1. Benegesserict Cumbersomberbatch Silver badge

          Re: We have built in autocorrect while reading

          Did the study viola te any ethics standards?

    7. vtcodger Silver badge

      Re: Thought experiment.

      Some English language speed readers can manage 1000 words per minute with reasonable comprehension. That is almost 17 words per second. And words are obviously way more than 1 bit each. I think these folks might have a very different definition of BIT than those of us in the computer and communication field use.
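      That comparison is one line of arithmetic (the bits-per-word figure is an assumption):

```python
# Speed-reader throughput, assuming ~15 bits of information per word.
words_per_minute = 1000
bits_per_word = 15

bits_per_second = words_per_minute * bits_per_word / 60
print(bits_per_second)  # 250.0 - twenty-five times the paper's 10 bits/s
```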

  6. Boris the Cockroach Silver badge
    Boffin

    The brain

    is massively parallel in its computing ability

    Take a learned task like... writing code, where all your attention is on typing out the commands while processing the design/spec for the code, while wondering what's for dinner (until Dinner appears as a class name).

    Now take another task such as driving: you're still wondering what's for dinner while the radio is blasting away, and you're driving while following a memory of how to get from work to home.

    Even at 10 bits per second you're outperforming any of Tesla's self-driving navigation thingies attached to their cars. And that's before singing along to the song lyrics.

    So those 10 bits don't sound impressive until you think of how many background tasks are running.

    Btw I was typing this out while listening to Guns N' Roses and talking to the cat.

    1. E 2

      Re: The brain

      Did the cat answer?

      1. Doctor Syntax Silver badge

        Re: The brain

        A cat wouldn't deign to answer a human.

        1. spold Silver badge

          Re: The brain

          ...unless you had tuna in your pocket.

          1. I ain't Spartacus Gold badge
            Coat

            Re: The brain

            ...unless you had tuna in your pocket.

            Is that a tuna in your pocket, or are you just pleased to see me?

            1. I am David Jones Silver badge

              Re: The brain

              Is that a tuna in your pocket or are you just plaiced to see me?

          2. anonymous boring coward Silver badge

            Re: The brain

            That would be one large, and strong, pocket.

            1. I ain't Spartacus Gold badge
              Happy

              Re: The brain

              That would be one large, and strong, pocket.

              Some of us are blessed with big... pockets.

      2. MachDiamond Silver badge

        Re: The brain

        "Did the cat answer?"

        My last cat, Decibelle, was very talkative. The translation app didn't have a framework, so I'm not 100% sure she was answering me or just issuing reports on my inadequacy as staff. When Louis asks Clinton a question, he often gets a response.

        1. The Oncoming Scorn Silver badge
          Thumb Up

          Re: The brain

          Pussy! Pussy, pussy! Coo-chee, coo-chee, coo-chee, coo-chee!

          Pussy want his fish?

          Nice piece of fish… pussy want it?

          Pussy not eat his fish, pussy get thin and waste away… I think.

          I imagine this is what will happen, but how can I tell? I think it’s better if I don’t get involved.

          I think fish is nice, but then I think that rain is wet, so who am I to judge?

          Ahh, you’re eating it. Fish come from far away - or so I’m told - or so I imagine I am told.

          When the men come - or when in my mind the men come in their six black, shiny ships, do they come in your mind too?

          What do you see pussy? And when I hear their questions, all their many questions, do you hear questions?

          Perhaps you just think they’re singing songs to you.

          Perhaps they are singing songs to you and I just think they’re asking me questions.

          Do you think they came today? …I do. There’s mud on the floor, cigarettes and whiskey on my table, fish in your plate, and a memory of them in my mind.

      3. I am David Jones Silver badge

        Re: The brain

        “Miaow”, said the cat. #TimDowling

    2. Doctor Syntax Silver badge

      Re: The brain

      Boris, just the man to answer this. How fast do your industrial machine actuators get clocked compared with the speed of the processors driving them? AFAICS these guys are measuring the equivalent of the former.

  7. E 2

    I call BS

    How then can a person read, write, or type at the speeds we do? How can a person play a video game or drive a car?

    1. I ain't Spartacus Gold badge

      Re: I call BS

      How then can a person read, write, or type at the speeds we do? How can a person play a video game or drive a car?

      I think the idea is balls - but there's at least some method in the madness. We know that different bits of the brain work at different speeds. And the "outer" bits of the brain are doing a good deal of pre-processing of data - as well as bits of our brain doing stuff automatically.

      For example, after Stirling Moss's massive crash with head injuries (he was in a coma for a month) he was able to get back to racing. I heard a great interview with him, where he said, "After my fatal accident. Oops, I mean my accident..." He said that he could still maintain the same lap times as before - but that he had to concentrate on the process of driving the corner, rather than doing it automatically. This level of concentration was too tiring and he wasn't able to keep it up for a whole race. And so he retired within a year of the accident. Which he said was a shame, because later - after another 6 months to a year - he was able to go back to driving automatically and only thinking about the tactics and strategy of the race, rather than how to get round the corner at optimal speed.

      So clearly, with sufficient practice, you can offload complex tasks to some kind of sub-routine that you've trained to do them without burdening your conscious mind any longer. I gave an example above of a pianist sight-reading music while also improvising on it, and holding a conversation with me simultaneously. And that was while playing in public. I'm not a good enough pianist to do that, but I can improvise vocal harmonies while doing other tasks. Not holding a conversation though - that would be clever. That's an ability you have to teach yourself with much repetition (and quite a few horrible mistakes).

      I know that typing is also automatic. Because if I look at the keyboard, my conscious brain tries to take over typing and I slow down massively.

      There's clearly a lot going on in these head things of ours.

  8. Doctor Syntax Silver badge

    I started to run into trouble with the first example in the linked article: "this suggests that the thinker can access about 2^20 ≈ 1 million possible items in the few seconds allotted" Huh???? What's the logic behind this? Ah, I see. It makes a lot of assumptions about how the "thing" is selected and also about the actual number of bits communicated in a yes/no question.
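    The 2^20 arithmetic only works if every question is ideal - a sketch of both the best case and what an uneven question actually buys you:

```python
import math

# Twenty perfect yes/no questions, each halving the candidates,
# can distinguish 2**20 (about a million) items.
ideal_items = 2 ** 20

# But a question whose answer is "yes" 90% of the time yields
# well under one bit (the entropy of a 90/10 split):
p = 0.9
uneven_bits = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(ideal_items, uneven_bits)
```

    Twenty such lopsided questions would distinguish only about 2^(20 x 0.47), roughly 700 items - which is the point about the assumptions baked into the figure.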

    Very likely the thinker can only access a relatively few items when asked cold and most of the time will be spent in doing a lot of processing about what will be an answer which will be hard to guess and maybe also constrain the answer to be within the shared experience of thinker and guesser*. Treating that as simple random access is going to seriously misrepresent what's actually happening.

    Also, even if the questions are posed as yes/no it's not necessarily easy to answer as yes or no. If, for instance the question is "Is it red?" it will be a good deal easier to answer if "it" is el Reg's banner rather than a terracotta pantile, somewhere between orange and red. Whatever the answer, the hesitation, intonation and facial expression of the answerer will convey more than one bit of information.

    What else? Typing speeds? How many bits are actually needed to select a letter? A good deal more than they seem to think, given the number of muscles that need to be controlled with considerable precision. In fact most, if not all, the tasks they measure are input or output tasks and depend on the rate at which things happen in the external world. There's a limit to how fast the fingers can move in typing, how fast a pen might move to produce legible writing. Speech recognition needs a lot of processing to turn sounds into meaning and, of course, there's a limit on how fast words can be spoken, with fast speech putting an extra burden on the listener to sort out the badly articulated sound.

    What may well be beyond measurement is the purely internal processing when problem solving. How many bits per second are involved in running through a lot of complicated ideas? How many bits is an idea?

    * If that isn't done the guesser will routinely not win. Zheng and Meister may be biologists but, as another biologist, I would have no problem thinking about biological objects outside their experience - and, of course, vice versa.
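    For what it's worth, the paper's 20-questions arithmetic only holds under the idealised assumption that every yes/no question exactly halves the candidate set - a quick Python back-of-envelope (mine, not theirs):

```python
import math

def questions_needed(n_items: int) -> int:
    # An ideal yes/no question halves the candidate set, so distinguishing
    # n_items equally likely items needs ceil(log2(n_items)) questions.
    return math.ceil(math.log2(n_items))

print(questions_needed(1_000_000))        # 20 questions, i.e. "20 bits"
print(questions_needed(1_000_000) / 2.0)  # 10.0 "bits/s" if answered in 2 seconds
```

    Real questions ("Is it red?") are rarely that efficient, which is the point above.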

    1. MachDiamond Silver badge

      "Very likely the thinker can only access a relatively few items when asked cold and most of the time will be spent in doing a lot of processing about what will be an answer which will be hard to guess and maybe also constrain the answer to be within the shared experience of thinker and guesser* Treating that as simple random access is going to seriously misrepresent what's actually happening."

      "Muscle memory" is a huge clue that we do a lot of pre-processing. I can look at sheet music and play an instrument without much thought about where my hands/fingers/feet are going once I've spent the time learning the instrument. When driving, a stop sign doesn't have to be pondered. Driving a manual transmission car and knowing what gear and how to get there is ingrained. Getting in a new car is refinement of the motions I already know. I've even gone between my own car and a semi-truck without a problem as the sight, sound and smell of each loads the program to know how to work the gears.

      Since I've been doing a ton of photography over the last decade plus, I've also taken an interest in how humans process sight and it is in no way the same as how we program computer vision systems. Humans don't "see" everything all the time. Ever had a hard time finding your keys that were sitting in "plain sight"? You registered the table they were sat on well enough to prevent adding another bruise, but the stuff on the table didn't get processed into anything more than "lump", if that.

      1. Boolian

        Losing objects hiding in plain sight I think may be the clue.

        I gathered somewhere, that when looking for something it is a virtual image which is held (makes sense so far) and that most 'lost' objects tend to be at unfamiliar angles (remote down the side of a couch) so the improvement to visual searching, is to hold an altered image of the object in the mind, rotated about a different axis.

        Now, I cannot claim any empirical evidence - merely anecdotal - but I perceive that my success rate at finding objects improves using this method, especially when I realise I am stuck in an aimless search loop.

        Perhaps the exercise in object rotation cycles more image options of places the object could be; if nothing else it seems to refresh the brain and F5 it out of the aimless loop which inevitably cycles around the fridge - possibly the shelves of which are the last remaining flat surface + plan view object orientation pairings the brain can think of.

        What that suggests about 'Man Look' I don't know, but if this hypothesis has any merit, it does seem that Females appear to cycle through altered object orientations/locations far better than the Male of the species.

        What relation that has to hunter/ gatherer, and spatial awareness dichotomies I have no wish to think about, because I've just realised where my shed key might be.

        1. MachDiamond Silver badge

          "I gathered somewhere, that when looking for something it is a virtual image which is held (makes sense so far) and that most 'lost' objects tend to be at unfamiliar angles (remote down the side of a couch) so the improvement to visual searching, is to hold an altered image of the object in the mind, rotated about a different axis."

          That's a lot of the same thinking I have. If I can't find something, I have to come up with different search tactics and the visual clues that might let me "see" the item a different way than I had been searching for it previously.

  9. Paul Hovnanian Silver badge

    Not just parallel ...

    ... but the brain is hierarchical. The sensory I/O certainly operates at much higher rates than 10 b/s (the authors quote a rate of 10^9 b/s). But specialized areas "distill" this raw data into symbols (concepts?) which are percolated upwards into the higher levels of consciousness. At the highest levels, we probably do think very slowly. Fortunately, the lower levels that handle input as well as output can run at rates equal to the tasks at hand. Sometimes lower-level processing is done outside the brain altogether, in the body's nervous system. Some modes of motor control involving balance, for example, don't need to pass data through the brain at all; dedicated connections in the spinal cord handle such tasks.

    Our highest level of thinking involves symbols that encode complex inputs to drive our conscious "state machine". And, for some of us, when those states are restricted to the small set of sleep, eat, defecate, procreate, it doesn't take a lot of bits to manage.

    1. HuBo Silver badge
      Windows

      Re: Not just parallel ...

      Yeah, thinking is hard! My current mental picture of such mind confusing brain function, aside from gyrus spirographs, involves on the one hand fast sensorimotor grey matter cortical columns (GPU pipeline-like?) and associated connecting white matter (CPO NOC-like?, with HBM?) that work super fast (say 10⁹ bit/s), and on the other hand, somewhere in there, the "inner brain" cognitive subsystem (higher? deeper? at the GPU-NOC junction?) that works super slow by comparison, overall. Luckily, reflexes (eg. knee jerk) don't quite enter that brain picture at all (easiest to produce).

      I wonder if their 10 bit/s figure could mean 10 binary decisions per second (eg. one every 100ms), spanning a space of 1024 possibles that are not necessarily related to each other (vs pick 1 of 1024 possibles, every second)?
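      The two readings are information-equivalent, at least on the assumption that the choices are uniform and independent - a trivial sketch of the arithmetic (my assumption, not the paper's):

```python
import math

# Reading 1: ten independent yes/no decisions, one every 100 ms.
seq_bits = 10 * math.log2(2)

# Reading 2: one pick out of 1024 equally likely options, once a second.
one_shot_bits = math.log2(1024)

print(seq_bits, one_shot_bits)  # 10.0 10.0 -- same information either way
```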

      1. ArrZarr Silver badge
        Joke

        Re: Not just parallel ...

        So what you're saying is that the inner brain is functionally the CEO, with a high opinion of itself and little capacity to do anything useful, while the rest of the brain works incalculably more efficiently and quickly without getting any of the recognition?

        1. Paul Hovnanian Silver badge

          Re: Not just parallel ...

          "the inner brain is functionally the CEO"

          Or PHB, if that analogy is more apt.

  10. Bebu sa Ware
    Coat

    Musk as an Oracle?

    If Space Karen thinks or asserts something is, you can be pretty damn sure that it isn't.

    Worth remembering the human brain like every other biological system has evolved to deal with its environment.

    I would punt that a pre-industrial world would not require a stream of consciousness much faster than something of the order of magnitude the authors are claiming.

    1. MachDiamond Silver badge

      Re: Musk as an Oracle?

      "Worth remembering the human brain like every other biological system has evolved to deal with its environment."

      And very slowly at that. The "look out, tiger!" reflex isn't that useful anymore for most of us, but it's still in there taking up space. I suppose it can be retrained to "cute tiger, I need to get a selfie" which isn't that good for survival.

      1. Neil Barnes Silver badge
        Holmes

        Re: Musk as an Oracle?

        Any organism merely needs to divide its environment into four classes and deal appropriately with each class:

        - things I can eat

        - things that can eat me

        - things I can have sex with

        - rocks

        Everything outside the first three classes doesn't matter unless and until it turns into one of the first three. It's we humans that make things so difficult for ourselves by sub-classifying rocks.

        1. Winkypop Silver badge

          Re: Musk as an Oracle?

          Or as the great Scottish philosopher William Connolly said: “Can I eat it while I shag it?”

          1. Ken Shabby Bronze badge
            Devil

            Re: Musk as an Oracle?

            Been there, done that

            1. MachDiamond Silver badge

              Re: Musk as an Oracle?

              "Been there, done that"

              I was raised better and wait at least 45 seconds after the shagging to go make myself a sandwich.

  11. J.G.Harston Silver badge

    I can certainly type a lot faster and a lot more accurately than I can convey infomation through speach.

    1. Neil Barnes Silver badge
      Headmaster

      Please tell me that was deliberate?

      1. sitta_europea Silver badge

        Maybe the typing was more accurate than the thinking.

  12. Ball boy Silver badge

    What are we measuring?

    Surely this depends on what we classify as a 'bit'? Defining a rate is only useful once we've established what it is we're quantifying and I'm not convinced we've done that properly yet.

    A 'bit' as a binary state certainly doesn't seem to fit: 10 b/s seems insanely slow for some processing and yet, if we take a whole word - assume reading as the baseline task - then a 'bit' can't be a whole word, either - if it were, changing the order of words as in the example above would slow us down significantly as there would have to be some kind of reassembly process required before comprehension.

    Then there's the parallel processing that seems to be interdependent. By way of example, I'm fairly sure I don't think consciously about how to strike the right keys in the right order: that's down to a sub-process that's been honed from many years of typing - but exactly which words I elect to use can't be down to muscle memory; if it were, I'd come out with the same tripe each and every time. There's some rather clever time and priority management going on here, clearly.

    However, this parallel processing only goes so far: imagine you're driving down the road and happily singing along to something on the radio. Your brain is coping with the routine processing these disparate tasks require (spatial awareness and lyric recall to name but two obvious ones) and managing the very different timing each one requires - and yet, were a child to suddenly appear in the path of your car, I'll bet you stop singing.

    1. David Hicklin Silver badge

      Re: What are we measuring?

      > yet, were a child to suddenly appear in the path of your car, I'll bet you stop singing.

      That's the Non-Maskable Interrupt (NMI) being triggered by the fight or flight reaction along with a hefty dose of adrenaline (overclocking). Much like my horse when confronted by a flock of sheep stampeding down the lane....he hates them!

    2. Evilgoat76

      Re: What are we measuring?

      We seem to store memories and information via association, so maybe BIT is the wrong word here. It makes everyone in information technology and hardware think of a logical bit.

      Object or Symbol may be better terminology. A word may represent just a word, ie. "Bit", but with the attached memories it's closer to an object with a root class and many properties. So Banana is one bit, but infers many, many more things. It's a fruit, it's yellow, I like the taste, they are of class "plant" and also "food". There are methods for it too, such as "eat" and "how to stop spelling it".

      So while yes, technically Banana is one "bit" it does not represent a singular thing in our minds.

      Right, I need Calvados, way too deep for this time of night

  13. nfss

    Throughput

    Let's say I have a program that calculates all the sales, costs of goods, salaries, maintenance, depreciation, interest, taxes, etc for the past year, and spits out a result as to whether or not the net operating profit exceeds a certain percentage.

    I type "run" and hit enter, all 16 cores (32 threads) at 4.7 GHz pinned at 100%, and after one second it spits out the answer "no"

    By the methodology of these "boffins" throughput is 42 b/s*

    I also have a one line concatenated BASIC program (fits in 132 characters on lovely green and white paper) that uses a RND function embedded in a print statement with a delay to produce a similar output. Construction left to the reader as an exercise.

    I type "run" and hit enter, there is no measurable activity, and after one second it spits out the answer "no"

    By the methodology of these "boffins" throughput is 42 b/s*

    Somehow I don't think the measurements are a useful analysis of the underlying occurrences.

    Please note, the answer is always 42, you simply have to ask the right question.

    * We were limited to 7 bit ASCII loaded via yellow paper tape around these parts then, Yorkshire men at will...

    1. Amblyopius

      Re: Throughput

      Actually they would claim it is 1b/s as the only possible answers are yes and no. They calculate the rate based on the minimum amount of bits required to enumerate all possible answers. Hence how 1 million objects and all of their characteristics magically only represent 20 bits.
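      In other words, the accounting - as I read it, not as the paper spells it out in code - is simply information = log2(number of possible outcomes), regardless of the work done to produce the outcome:

```python
import math

def claimed_bits(n_outcomes: int) -> float:
    # The paper's measure: bits needed to enumerate all possible outcomes,
    # ignoring whatever processing produced the chosen outcome.
    return math.log2(n_outcomes)

print(claimed_bits(2))          # 1.0 bit for a yes/no answer
print(claimed_bits(1_000_000))  # ~19.93 bits for picking one of a million items
```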

  14. anonymous boring coward Silver badge

    I figured out that this is a load of nonsense in just a few seconds. And I wrote this faster than 10 bits per second.

  15. anonymous boring coward Silver badge

    I agree, but I'm just curious how you get to 21 bits per letter? I.e "no" being 42 bits? (Not that the brain uses ASCII...)

    1. nfss

      Throughput

      Throughput includes the input as well as the output, so six total characters.
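      For anyone counting along, the arithmetic of the joke (my reconstruction, assuming 7-bit ASCII and counting the Enter keystroke as part of the input):

```python
# "run" plus the Enter keystroke in, "no" out: six characters at 7 bits each.
chars = len("run") + 1 + len("no")
print(chars * 7)  # 42
```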

  16. duggzdebuggz
    Happy

    How does one measure a thought?

    For something to be scientific, it needs to have the characteristics of being measurable and repeatable. So, how does one measure a thought? Here we are not counting chickens, but thoughts. Think a thought, put it out of your mind, think of another thought, put it out of your mind. As one does this, count the thoughts. So, if one can count one's thoughts, thought is measurable. However, can one also measure its size, its shape, its depth, how long it lasts for, where it comes from and where it goes, and what it contains? And how does it align, or get submerged or incorporated or deleted or changed, either on its own, or when included with other thoughts, or made part of other thoughts? When that simple question can be answered... then maybe one can start to talk about the science of thought and thinking... Until then, dream on and use your infinite imagination... Simples

  17. xyz123 Silver badge

    The human brain operates at 10bits/second. So STILL faster than AT&T and Comcast 'premium' broadband then.

  18. Amblyopius

    The amount of nonsense in this paper is remarkable

    All quotes are from the actual paper.

    - "If forced to type a random character sequence, their speed drops precipitously."

    Someone will have to tell these people that keyboards have a certain layout for a reason, and that subsequently it's not physically possible to type random sequences as fast as sequences in the targeted language of the layout. Random character sequences are bound to be inefficient and hence slow you down.

    - "If the guesser wins routinely, this suggests that the thinker can access about 2^20 ≈ 1 million possible items in the few seconds allotted. So the speed of thinking – with no constraints imposed – corresponds to 20 bits of information over a few seconds: a rate of 10 bits per second or less."

    This is of course ludicrous. In the most efficient storage method possible we only need 20 bits to enumerate 1 million items, so because you've supposedly only generated 20 bits of relevant data over a couple of seconds, the "speed of thinking" is 10 bits/s?!?

    - "This dilemma is resolved if I give you a specific task to do, such as typing from a hand-written manuscript. Now we can distinguish actions that matter for the task from those that don’t. For example, two different keystrokes are clearly different actions. But if you strike the key in 91 ms vs 92 ms, that variation does not matter for the task. Most likely you didn’t intend to do that, it is irrelevant to your performance, and it is not what I want to capture as “richness of behavior”."

    Probably one of the most blatant admissions of cherry picking possible. Why not just measure something where a lot of combinations of factors DO matter? E.g. tell me what the bitrate is for Chopin's prelude in F# minor (Opus 28 #8). How did they get to the point where they expected "fair" measurements of speed by crafting very biased tasks? Note: of course even in typing a lot more things do actually matter than just hitting a key. For example the amount of force required and how deep the key travels has quite a bit of impact on a satisfactory end result.

    - "So the discussion of whether autonomous cars will achieve human level performance in traffic already seems quaint: roads, bridges, and intersections are all designed for creatures that process at 10 bits/s. When the last human driver finally retires, we can update the infrastructure for machines with cognition at kilobits/s. By that point, humans will be advised to stay out of those ecological niches, just as snails should avoid the highways."

    Given how poorly cars perform in traffic, they might want to hold off from having cars perform in an environment whose entropy requires "human level" kilobits/s.

    All current attempts at creating AI are clearly also failures, because we can for example measure an LLM by its ability to produce tokens and translate that to bits/s. To do this it reads from memory at hundreds of GB/s and needs to do trillions of calculations, but that's entirely not relevant based on how the paper measures things. And sure enough, if you throw enough at it, it can outperform a human in some tasks, but that's before we consider efficiency. The estimate for the human brain is that it uses 20W a day or on average less than 1W an hour. Good luck getting anywhere near human performance at 1W an hour.

    How did anyone read this paper in advance without suggesting a bit of a rethink before publication?

    1. doublelayer Silver badge

      Re: The amount of nonsense in this paper is remarkable

      I think you're mostly correct, but a few points:

      "Someone will have to tell these people that keyboards have a certain lay out for a reason and that subsequently it's not physically possible to type random sequences as fast as sequences in the targeted language of the layout."

      I don't think this is why that difference exists. Most keyboard layouts have had a little thought put into them, but yet Dvorak, for all its adherents, doesn't actually speed typing up compared to the less thought-out QWERTY, where avoiding lever jams was at least part of the consideration. I think the main difference is that I have a lot of muscle memory. When I want to type "the", I can call on a long history of typing those letters in that order without needing any conscious thought to do it correctly every time. If I have to type "ymx", I don't have that memory. Also, it is common to look at the screen while typing, if only by reflex, to verify that you have typed things correctly. That's easy when you're copying words, because you can remember the words for a short time. Let's see, I was typing about types of apples, the words say "green-skinned apples often used in", sounds good. If I have to copy random letters from something I'm reading, looking back and forth from one window to another is challenging because the random letters have not been memorized, and I'm either comparing them manually, which is slow, or flicking back and forth by reflex without being able to use that reflex properly, which is going to cause delays as the brain attempts to do something that it cannot do. If we created a keyboard layout by dumping all the letters into the space for letter keys at random, but we then had people only use that layout for years, I think they would be similarly fast while typing languages they know and similarly slower when typing random letters.

      "The estimate for the human brain is that it uses 20W a day or on average less than 1W an hour."

      Watts are already energy over time; you can't divide them like that. The brain tends to consume about 1.7 megajoules of energy in a day, meaning its average power consumption is 20 W whether it's per day or per second. Which means that you can in fact get a computer that operates with less power, and quite easily. It will be a lot more power-efficient than the brain when solving mathematical problems, but most of the tasks we want to perform daily can't be done at all or anywhere nearly as efficiently on such a machine.
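      Spelling the conversion out, with the usual ballpark figure of ~1.7 MJ per day:

```python
# Joules per day divided by seconds per day gives average power in watts.
energy_per_day_joules = 1.7e6    # ~1.7 MJ consumed by the brain daily
seconds_per_day = 24 * 60 * 60   # 86,400 s
power_watts = energy_per_day_joules / seconds_per_day
print(round(power_watts, 1))     # ~19.7, i.e. about 20 W continuous
```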

      1. Amblyopius

        Re: The amount of nonsense in this paper is remarkable

        I first learned how to type almost 40 years ago on a typewriter and got a rehash 35 years ago where the idea was to lift us up to professional speeds. We had to do a lot of things at speed including copying "gibberish". And speed definitely is affected by letter distribution. If you were to concentrate all the core sequences of English in the central part of a keyboard (across the 12 central letter keys) the typing speed of a pro would drop quite considerably.

        For the Watt part, I did indeed make a mistake as I started from a source that stated 20W per day and I converted it to Wh as that made more sense as a unit (I find the use of per x a bit ambiguous given the existence of Wh). I should've indeed looked for a source stating Kcal or MJ.

        Still, a fairly moderate GPU will consume 200Wh. And yes, a computer rated 20W can do math faster but then we fall in the trap of doing apples and oranges again. If we go back to the LLM example, know any 20W rated computers (hence consuming 20Wh at full throttle) that will be competitive? Neither do I. And we're being generous here as our brains actually do not dedicate anything near 20Wh to cognitive tasks. So even after a 24x correction the statement actually stands.

        1. doublelayer Silver badge

          Re: The amount of nonsense in this paper is remarkable

          "If we go back to the LLM example, know any 20W rated computers (hence consuming 20Wh at full throttle) that will be competitive?"

          I can find you a computer that can run in 20 W which can run an LLM, sufficiently quantized, at an output token rate that matches slow human speech. The output you get will be only slightly more crap than most other LLMs. I don't think that's any better comparison to a brain, though. I've known people who can make up plausible fiction on the spur of the moment and keep track of everything, and those who can't string the truth together well if they're asked to do it quickly.

          I think whatever we choose, there's not going to be a good parallel to what a brain does. Our limited understanding of how the brain does what it does is part of the problem, and assumptions about it is how we get papers like this that oversimplify things to the point that we can already realize how improbable it is without trying to duplicate their observations.

          On the topic of typing gibberish, I'm surprised that you could type it as quickly as a language you speak. I certainly could not, though I type quickly in multiple languages. I somehow doubt that their research participants had been trained in the way you were, though, so I still think factors other than letter distribution were strong factors explaining the speed differences.

  19. Dwarf

    Male or female ?

    Is this based on a male brain or a female brain ?

    We all know that a woman can think about hundreds of things and hold parallel arguments on many topics at the same time.

  20. Gary McKinnon

    Bollox

    In the waking state we are processing 5 complex senses, manoeuvring our bodies through an extremely noisy (data-wise) 3D environment, speaking to others and thinking about multiple things at once.

    This dinosauric view that consciousness can be compared to computers is thoroughly outdated and unfruitful.

    Man is not machine and machine is not man.

  21. TRT

    And apparently there's a non-maskable interrupt every 6 minutes.

    For men anyway. Probably women too, if we're being honest.
