Brain-inspired chips promise ultra-efficient AI, so why aren’t they everywhere?

Every time a chipmaker or researcher announces an advancement in neuromorphics, it's inevitably the same story: a brain-like AI chip capable of stupendous performance-per-watt compared to traditional accelerators. Intuitively, the idea makes a lot of sense. Our brains are pretty good at making sense of the world, so why wouldn …

  1. Hull
    Pint

    Thanks for the update

    I look at the state of spiking neural network hardware development every few months, you saved me an hour or two.

    1. Filippo Silver badge
      Pint

      Re: Thanks for the update

      Seconded! I always find neuromorphic chips very interesting, and I often wonder why we don't see them in actual use.

    2. LionelB Silver badge

      Re: Thanks for the update

      One thing puzzles me - should I understand "neuromorphic chip" as meaning spiking neurons, as opposed to, say, a chip which implements e.g. a sigmoid transfer function close to the metal (as well as large-scale connectivity)? And if so, why? It may be Nature's Way, but that does not necessarily imply it will be the "best" way to implement ML/AI technology on silicon - especially seeing as (i) we do not have a deep understanding of how biological neurons achieve functionality through spiking, and (ii) most large-scale neural models are not based on a spiking design, and implementing learning/training with spiking neurons is harder (and slower!)
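
      To make the distinction concrete, here is a minimal Python sketch (all parameters illustrative; no real chip works quite like this). The rate-based unit produces a value on every step, while the spiking neuron carries state through time and communicates only via discrete events:

        import numpy as np

        # Rate-based unit: output is an instantaneous function of its input.
        def sigmoid_unit(x, w, b):
            return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

        # Leaky integrate-and-fire neuron: the membrane potential integrates
        # input over time and emits a spike only on crossing the threshold.
        def lif_neuron(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0):
            v, spike_times = 0.0, []
            for step, i_t in enumerate(input_current):
                v += (dt / tau) * (i_t - v)   # leaky integration
                if v >= v_thresh:             # threshold crossing -> spike
                    spike_times.append(step * dt)
                    v = 0.0                   # reset after the spike
            return spike_times

        print(sigmoid_unit(np.array([0.5, -0.2]), np.array([1.0, 2.0]), 0.1))
        print(lif_neuron([1.5] * 100))

      The sigmoid unit burns a multiply-accumulate on every tick; the LIF neuron is silent (and in hardware could sit unpowered) except when it spikes.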

  2. Anonymous Coward
    Anonymous Coward

    The mid-end is interesting too

    The situation at the low- to mid-end of the spectrum looks more dynamic. Looking at companies like Brainchip, GrAI, Innatera, ... one might get the feeling that neuromorphic could go the RISC-V way, i.e. start small (e.g. embedded computer vision) and work its way towards more high-performance applications.

    Some point to automotive being one of the higher potential markets (likely due to the increase in cameras on L2/L3 vehicles) but that remains to be seen (see the point made in the article about current architectures developing fast enough in terms of performance and performance/watt).

    IoT/the industrial sector might be lower-hanging fruit. There one would sidestep the problem of programmability, as neuromorphic chips would hit the market as part of a finished product. As an example, one can look at Prophesee.

  3. Danny 2

    I, for one, have given up on puns

    Please tell me you lot have thought through the implications. You scared Stephen Hawking and he wasn't scared by black holes.

    If you are going to train them, then start them on the Iain M Banks Culture novels. Oppenheimer's regrets came too late.

    1. Anonymous Coward
      Anonymous Coward

      Re: I, for one, have given up on puns

      The only "Post scarcity" fiction I've seen that has a sensible explanation of how scarcity isn't deliberately maintained by the rich and powerful.

      1. Paul Kinsler

        Re: "Post scarcity" fiction

        At a tangent, and whilst touring through various bits of creaky old SF on Gutenberg, I recently found the "Venus Equilateral" stories, which at one point have a 1940s go at post scarcity. But only after quite a lot of electromagnetic derring-do, naturally.

        https://www.gutenberg.org/ebooks/68008

  4. John Smith 19 Gold badge
    Coat

    If they are inspired by the brain, how are you "programming" them?

    Because you don't program brains, you "teach" them.

    Until they reach a critical mass and are able to learn for themselves.

    Mine's the one with the copy of Carver Mead's "Analog VLSI and Neural Systems" in the pocket.

    1. LionelB Silver badge

      Re: If they are inspired by the brain, how are you "programming" them?

      Don't rush for your coat - you hit the nail on the head!

      Real brains are "programmed" by billions of years of evolution - programmed to learn very, very well indeed. We currently have no idea of the design principles behind that (beyond variations on backprop on multilayer, usually feed-forward networks).

      1. John Smith 19 Gold badge
        Unhappy

        Real brains are "programmed" by billions of years of evolution

        Not entirely true of animals, definitely not true of humans.

        Some animals certainly learn things from their parents, while humans have this thing called "language".

        AFAIK, unlike computer memory, human memory results in physical changes to brain structures (not just chemical levels), so language literally re-structures the brain.

        The real question is: how does a large number of neural networks (as shown by fMRI), linked together in various ways, turn from a pattern-recognition engine (kitten/not kitten) into a "thinking" machine?

        My instinct says the idea in the novel "Snow Crash" is right: the brain has evolved a micro-machine architecture. I don't think the language/physical behaviour --> brain re-structures --> learns how to do new things (including making new language strings it's never heard before) paradigm is powerful enough.

        But it'll take someone way smarter than me to figure out how it works. :-(

        1. LionelB Silver badge

          Re: Real brains are "programmed" by billions of years of evolution

          I don't disagree. When I said brains are "programmed to learn" (okay, I should have added a caveat for animals which don't do much lifetime learning), I am certainly happy to include the propensity for language under that. Perhaps "structured" or "organised" to learn would be a better phrase, with "learn" intended in the broadest sense.

          And yes, human (and many other animal) brains do "re-wire" in the course of not just memory/learning, but cognitive functioning in general - it's called neural plasticity. Of course that mechanism itself evolved.

          And sure, we're very far indeed from figuring out the mechanisms and organisational principles which underpin "thinking".

          (FWIW, I am not a neuroscientist, but I do get to work with neuroscientists - my day job involves developing mathematical methods, particularly information-theoretic ones, to analyse neurophysiological data such as fMRI, M/EEG, etc., with the goal of gaining a better understanding of brain organisation and function. Disclaimer: that does not necessarily imply I am smarter than you ;-))

    2. Tom 7

      Re: If they are inspired by the brain, how are you "programming" them?

      Brains have evolved so they are born with structures that can perform certain tasks very efficiently, and evolution has also managed to work out ways for their start-up state to be such that they will probably start performing their intended task from 'birth'. Even small brains (<200-neuron ones from nematode worms) are capable of learning from the off, and reproducing them in code can achieve surprising results using simple training methods.

      The programming of brains is both structural and, somehow, the setting of initial states - chicks don't learn to eat, they eat effectively from the start, being programmed to eat, move and later on fuck and look after their young. All animals seem to have a degree of pre-programming (which may be purely morphological for some requirements), other actions are controlled by hormones, and I'd bet there are bits of ROM all over the place too. Epigenetics suggests things can be programmed by inheritable changes to the methylation of genes.

      Brains do learn, and how, but there is far more programming in there at boot-up than simple* PhD students can get a grip on. The mining of the existing brains and nervous systems is far too big a task for any one company to exploit (and patent, FFS), and really needs something like an open-source version of the genome project.
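
      At a hand-wavy level, that "structure plus initial state" picture is easy to sketch in code (Python; a toy, nothing to do with real nematode wiring): a fixed connection mask plus evolved starting weights does something sensible from step one, and a simple Hebbian rule merely refines it.

        import numpy as np

        rng = np.random.default_rng(0)

        # "Innate" wiring: a fixed connection structure plus evolved initial
        # weights - a stand-in for what development hands the animal at 'birth'.
        structure = np.array([[1, 1, 0],
                              [0, 1, 1],
                              [1, 0, 1]], dtype=float)
        weights = structure * rng.normal(0.5, 0.1, size=structure.shape)

        def step(weights, x, lr=0.01):
            y = np.tanh(weights @ x)
            # Simple Hebbian update, confined to the innate structure:
            return weights + lr * structure * np.outer(y, x), y

        x = rng.random(3)
        weights, y = step(weights, x)   # functional from the very first step
        for _ in range(100):            # 'learning' only tunes what is there
            weights, y = step(weights, x)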

      1. Peter Gathercole Silver badge

        Re: If they are inspired by the brain, how are you "programming" them?

        This business of thinking that there is no pre-programming in organisms really puzzles me.

        Take the situation of an animal that has been reared outside of its natural environment, using techniques far removed from nature: although there may be hiccoughs, it is often able to breed and nurture young of its own, even though it never had a chance to 'learn'.

        Or similar things with migrating birds. Reared in isolation, they still have an urge to migrate. Or even learn to fly by themselves.

        Just take the initial behaviour of a newborn searching for its food source for the first time (esp. marsupials - how do they know to search for the pouch!)

        It's all very well to say it is a conditioned response to stimulus, such as searching for the smell of milk, but surely even a conditioned response needs to be conditioned somehow?

        There must be a degree of stored "program" somewhere in the building blocks of life, probably the various sorts of DNA, otherwise how can it happen?

        On a counter note, I've recently been thinking that DNA is less a pattern of how something will eventually look, and more a description of how things will develop. It's a process description. The final product (the organism) is the result of the process, not something built to a blueprint. And evolution is this build process becoming corrupted, together with a natural filter to cull the deviations that lead to non-viable or less successful results.

  5. Pete 2 Silver badge

    Don't feed your AI any cheese

    > a brain-like AI chip

    The only problem being that when you "sleep" the machine, it starts to dream.

    Just like when I put my phone into low-power mode, it keeps making noises like battery-powered woolly grass-eaters.

    1. Tom 7

      Re: Don't feed your AI any cheese

      Dreaming is just part of the data-processing engine: rats trained to run around mazes seem to dream of running and re-running them to improve their efficiency. And they do perform much better the next day!
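
      (The machine-learning analogue would be experience replay. A sketch - env_step and learner_update are hypothetical stand-ins for a real environment and learner:)

        import random
        from collections import deque

        replay_buffer = deque(maxlen=10_000)

        def awake_phase(env_step, n_steps):
            # Collect experience while 'awake' in the maze.
            for _ in range(n_steps):
                replay_buffer.append(env_step())  # (state, action, reward, next_state)

        def sleep_phase(learner_update, n_replays, batch_size=32):
            # 'Dream': replay random past runs to consolidate what was learned.
            for _ in range(n_replays):
                batch = random.sample(replay_buffer,
                                      min(batch_size, len(replay_buffer)))
                learner_update(batch)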

  6. Sam Adams the Dog

    The obvious answer is ...

    --- that a simulation of a brain cannot be better than a brain, and most brains are not very smart.

    1. katrinab Silver badge
      Meh

      Re: The obvious answer is ...

      Depends what you mean by "smart". For example, object recognition is something we find really easy to do, but computers struggle with it.

      Think of the xkcd example of the smartphone app that recognises if you are in a National Park (easy for a computer, difficult for a human), and if you are taking a photo of a bird (easy for a human, very difficult for a computer).

  7. Anonymous Coward
    Anonymous Coward

    Brains occupy 3 dimensions and chips only 2.

    1. JTatts

      The brain is actually more like a flat pancake that has been folded up to fit in the skull.

      1. LionelB Silver badge

        Well, flat-ish. The cortex is organised in "columns".

  8. Schultz
    WTF?

    But what is it good for?

    After reading the article, I was left wondering: But what is it good for?

    In my mind, the effort to simulate the brain in silico was based on old-fashioned transistors clicking away while simulating neurons. But apparently the idea of neuromorphic chips is to leave those transistors unpowered until a stimulus comes along (see: Nature article on the topic), hence the immense power savings. Clearly, this is only useful if you assume that your computer is going to sit idle most of the time and that you would otherwise fail to power down while idling. The other touted advantage is highly parallel computing.

    The big question then becomes: is it helpful to try to model a brain in order to gain the advantages of parallel computing and power-down upon idle, or might one just optimize parallel computing and dynamic power use directly? The former is clearly advantageous when trying to obtain research funding (addressing fundamental questions), while the latter sounds more promising if you want to develop and sell actual silicon. So my prediction would be that brain-inspired chips will remain an active research area for the foreseeable future, without ever moving beyond the stage of an investors' fad. But I am sure that every neuromorphic brain in this world would like to contradict me on that.
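
    A toy back-of-the-envelope version of that trade-off (Python, purely illustrative - real chips gate power in hardware, not software):

      import random

      random.seed(1)
      ticks = 10_000
      # Sparse input: a stimulus arrives on roughly 1% of ticks.
      events = [t for t in range(ticks) if random.random() < 0.01]

      clocked_ops = ticks      # clocked design: work every tick, idle or not
      event_ops = len(events)  # event-driven design: work only on a stimulus

      print(f"clocked: {clocked_ops} ops, event-driven: {event_ops} ops "
            f"({100 * event_ops / clocked_ops:.1f}% of the clocked work)")

    Which, of course, just restates the objection: the saving only materialises if the input really is sparse and the clocked design really would have kept ticking.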

  9. anonymous boring coward Silver badge

    Are we nearly there yet?

    Those investors are not stupid...

  10. John Smith 19 Gold badge
    Unhappy

    For power reduction the big thing is going clockless

    The Manchester University AMULET project under Steve Furber - a clockless first-generation ARM - demonstrated this decades ago.

    However, you then cannot differentiate chips by clock speed (you could say all clockless chips have a clock frequency of 0 Hz), which chip manufacturers are unhappy about.

    Asynchronous chips were also trickier to design (but the design tools have gotten better) and normally used more transistors (although I think that overhead has also come down over time).

    So yes, better options exist if you want to go that way. OTOH if you wanted to get brain-like power consumption (IIRC a brain runs several petaflops of processing on about 400W, so all those "electronic brain" stories of the 1960s were complete BS) there are already other ways: Carver Mead's team used a standard CMOS process in non-standard ways to get highly non-linear signal processing, which turns out to be necessary as most human input signals have enormous ranges.
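
    For a feel of why that non-linearity matters, here is a quick sketch (Python, numbers illustrative) of logarithmic encoding squeezing a huge input range into something a low-precision circuit can represent:

      import numpy as np

      # Sensory inputs can span many orders of magnitude; a logarithmic
      # front end compresses that range before any further processing.
      intensities = np.logspace(-6, 6, 7)   # 12 orders of magnitude
      encoded = np.log10(intensities)       # a linear span from -6 to +6

      for i, e in zip(intensities, encoded):
          print(f"input {i:12.6g} -> encoded {e:+.1f}")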

    1. katrinab Silver badge
      Meh

      Re: For power reduction the big thing is going clockless

      The entire human body runs at about 60-100W. The brain is a fairly large proportion of that, but nowhere near all of it.

      Humans are really bad at "floating-point operations", so you would be lucky to get 1 FLOP/s out of one, never mind trillions.

      1. Tom 7

        Re: For power reduction the big thing is going clockless

        The brain runs on around 20W, and even thinking really hard doesn't increase the load much. It's obviously pretty hard to work out its power consumption, as stressing the brain excites the body as well, though fMRI scans do indicate which bits are running hot at any one moment - I believe they can't as yet measure it calorifically.
