DARPA funds Mr Spock on a Chip

The US Defense Advanced Research Projects Agency financed the basic research necessary to create a processor that thinks in terms of probabilities instead of the certainties of ones and zeros. And now Lyric Semiconductor, the spin-off from the Massachusetts Institute of Technology where the work was done, is going to spend the …

COMMENTS

This topic is closed for new posts.
  1. amanfromMars 1 Silver badge

    For the Old System, New World Order Head Cases ... Vital Viral Bits which will cost them Dearly

    "In the IT racket,....... "

    Would you care to wager a bet that next there will be Future Memory available to Prime Customers.

    Or is that already a patent Special Application of Neuro-Linguistic Programming being supplied? Provide that very particular and peculiar algorithm and Global Operating Devices are the new Lauds, although whenever you get right down to IT, the new Bards is probably also accurate and appropriate.

    Who wants to be a Trillionaire? For that is what IT offers Smart Grid Grade Dealers.

    1. tony trolle

      seems like a waste of money.....

      DARPA should just ask the manfromMars

  2. Paul Crawford Silver badge

    Neural network anyone?

    From the article it sounds a lot like a semi-analogue implementation of neural network blocks, with weighted inputs making the decisions for subsequent steps, and one 'neuron' in effect replacing a whole lot of multiply/accumulates plus the simple end threshold step. Anyone know more?
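
    For the sake of argument, here's a minimal digital sketch of the block I mean (weights, inputs and threshold all invented); on the chip the loop would presumably collapse into one summing junction.

    ```python
    # Minimal sketch of one 'neuron': a weighted sum (the multiply/accumulates)
    # followed by a simple threshold step. Weights, inputs and the threshold
    # are invented purely for illustration.
    def neuron(inputs, weights, threshold=0.5):
        activation = sum(x * w for x, w in zip(inputs, weights))  # MAC stage
        return 1 if activation >= threshold else 0                # threshold stage

    print(neuron([0.2, 0.9, 0.4], [0.5, 0.3, 0.1]))  # -> 0 (0.41 < 0.5)
    ```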

    1. Destroy All Monsters Silver badge
      Terminator

      I guess they are implementing the "Bayesian Network Tree"

      These are more complex than NNs and also more expensive to compute, at least on symbolic machines, as you have to repeatedly roll the probabilities up and down the tree and this involves some heavy floating-point multiplication and division.
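
      For concreteness, a toy version of that roll-up/roll-down on a two-node chain (all numbers invented; a real engine does this across many nodes, repeatedly):

      ```python
      # Toy 'roll down / roll up' on a two-node chain A -> B with binary
      # variables. Every step is multiplication, plus a division to normalise;
      # the figures are invented for illustration only.
      p_a = {True: 0.3, False: 0.7}                   # prior on A
      p_b_given_a = {True:  {True: 0.9, False: 0.1},  # P(B | A)
                     False: {True: 0.2, False: 0.8}}

      # roll down: predict B by summing over A
      p_b = {b: sum(p_a[a] * p_b_given_a[a][b] for a in p_a) for b in (True, False)}

      # roll up: observe B = True, update belief in A via Bayes' rule
      posterior_a = {a: p_a[a] * p_b_given_a[a][True] / p_b[True] for a in p_a}

      print(p_b[True], posterior_a[True])  # 0.41, ~0.66
      ```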

      Of course it might be something else entirely.

  3. Denarius Silver badge
    Happy

    old is new ?

    Analogue computers are back ?

  4. Erroneous Howard

    How long?

    ...before we see an improbability chip? Then all we need to do is plug it in to an extra hot cup of tea and we can invent the improbability drive, which will revolutionise space travel.

    After all, the only thing you really need to know is exactly how improbable an event is....

  5. Anonymous Coward
    Unhappy

    Mmm...

    ...target acquired. The processor says there's a 99.999% chance of it being friendly.

    Well, that's a 0.001% chance it isn't. "Open Fire!"

  6. Graham Bartlett

    Blast from the past

    Wow, they've just reinvented Fuzzy Logic. Excuse me if I wonder if they've done even the crudest check into what's gone before.

    1. Anonymous Coward
      Anonymous Coward

      I did a crude reading of the article

      "Digital logic that takes 500 transistors to do a probability multiply operation, for instance, can be done with just a few transistors on the Lyric chips. With an expected factor of 1,000 improvement over general purpose CPUs running probability algorithms, the energy savings of using GP5s instead of, say, x64 chips will be immense"

      I didn't get the impression that they were claiming to have invented fuzzy logic, just that they had invented some hardware to optimise fuzzy logic operations.
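
      For scale, this is roughly what a single probability multiply involves when done the ordinary digital way (a toy 8-bit fixed-point version, figures invented); the claim is that the analogue route does the same job with a handful of transistors.

      ```python
      # One probability multiply done the conventional digital way: 8-bit
      # fixed-point fractions, integer multiply, rescale. Values are invented.
      SCALE = 255  # represent 1.0 as 255

      def prob_mul_fixed(p, q):
          a, b = round(p * SCALE), round(q * SCALE)
          return ((a * b + SCALE // 2) // SCALE) / SCALE  # rounded rescale

      print(prob_mul_fixed(0.6, 0.5))  # ~0.3, plus a little quantisation error
      ```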

      1. Charles Manning

        It's called an op-amp

        Invented 50 or so years ago.

        As numerous others have written, analog[ue] computing is as old as forever. Slide rules are analog[ue] calculators invented over 300 years ago.

        There are quite a few vendors making mixed analog[ue]/digital FPGAs etc. that can be used to implement fuzzy logic / neural nets etc.

        Of course relative to running x64 chips, anything is an enormous power saving. Just recompile for ARM and you'll save over 95% of the power.

        Looks just like someone found a good way to write a DARPA funding proposal.

  7. Pete 8
    Heart

    yummy

    accelerated heuristic appreciation of aggregated input, speeding up emergence of goodies for participants to act upon - intelligently.

    I could use some of that in my pie.

  8. ben 29
    Thumb Up

    open fire!

    "Well that's 0.001% chance it isn't "Open Fire!""

    Which is a much better chance than our boys currently get against the crack USAF...

    B.

  9. Pascal Monett Silver badge

    A probability processor ?

    Calculating the probability of something requires extensive knowledge of the factors that come into play, as well as their relative importance.

    Referring to Mr. Spock, it has been demonstrated time and time again that, faced with a totally unknown phenomenon (the staple of T.O.S. episodes), you simply cannot state any probabilities whatsoever with any sort of scientific basis.

    The processor is not going to be the thing defining the factors, now is it ? So how exactly is this "probability processor" going to work ? If it has to be fed all the parameters - which is the only way for it to do any statistical analysis - then there's a human that will be doing the work and the "probability" processor will just be a calculator, like any processor today.

    It must be more complicated than all that, since there are so many much more intelligent people than me working on that kind of thing, but I'd like the description to be at least something I have a chance of understanding.

    1. Charles 9
      Boffin

      That's not the issue.

      They're not saying that their probability processor can do things the digital chips can't. They're simply saying that their processor design can perform probability calculations ORDERS OF MAGNITUDE faster and more efficiently than digital processors, because it treats a probability AS A PROBABILITY. In traditional math, handling the differing cases of probability (compounded probabilities and so on) takes distinct steps, and all of them involve calculations that are not necessarily ideal for a digital processor: especially if the probabilities cannot be expressed in rational terms (which can easily happen if they're derived from statistics). These "PPUs" (Probability Processing Units), I suppose you could call them, are custom-built for such calculations, much as GPUs are specially designed for fast parallel processing of discrete data (the kinds of calculations needed for 3D modeling).
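
      For example, a typical compounded-probability workload is just a long chain of multiplies like this (figures invented), which a conventional CPU grinds through one floating-point multiply at a time:

      ```python
      # Compounded probabilities: the joint probability of a chain of independent
      # events is repeated multiplication; a conventional CPU does this as a
      # serial stream of floating-point multiplies. All figures are invented.
      import math

      event_probs = [0.97, 0.95, 0.99, 0.90, 0.98]  # per-stage success rates

      p_all_succeed = math.prod(event_probs)
      p_any_fail = 1.0 - p_all_succeed

      print(round(p_all_succeed, 4), round(p_any_fail, 4))  # 0.8046 0.1954
      ```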

  10. Trevor_Pott Gold badge

    Sounds cool!

    Fuzzy Logic on steroids, in silicon.

  11. Torben Mogensen

    Precision?

    If probabilities are represented as levels of voltage or charge, I can't imagine you get any significant precision -- especially over lengthy computations. I suppose they could take multiple "samples" to increase precision, but you would quickly lose the speed advantage.
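
    Back-of-envelope (a toy simulation with an invented noise figure): averaging n noisy readings only shrinks the error like 1/sqrt(n), so every extra bit of precision costs roughly 4x the samples.

    ```python
    # Toy check on the 'multiple samples' idea: error of an averaged analogue
    # reading falls like 1/sqrt(n). Noise level and true value are invented.
    import random

    def sampled_estimate(true_p, noise_sd, n):
        return sum(true_p + random.gauss(0.0, noise_sd) for _ in range(n)) / n

    random.seed(1)
    for n in (1, 16, 256):
        err = abs(sampled_estimate(0.7, 0.05, n) - 0.7)
        print(n, round(err, 4))  # error tends to shrink ~4x per 16x more samples
    ```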

    Still, it is an interesting idea, and for "fuzzy" applications where you don't need any great precision, it sounds ideal.

  12. BlueGreen

    @Torben Mogensen

    Well put, but if you don't need much precision then 16-bit (or even shorter) fixed-point should be just fine. It wouldn't surprise me if they compared the performance of IEEE long floats against that of their creation to make it look 500 times more rockin'.
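
    To put numbers on it (an invented chain of multiplies): 16-bit fixed point stays within a gnat's whisker of float64 over several probability multiplies.

    ```python
    # A chain of probability multiplies in 16-bit fixed point versus float64.
    # The values are invented; the point is only that the discrepancy stays tiny.
    SCALE = 1 << 16  # 1.0 represented as 65536

    def mul_fixed(a, b):
        return (a * b) >> 16  # truncating fixed-point multiply

    probs = [0.93, 0.81, 0.77, 0.66, 0.95]
    exact, fixed = 1.0, SCALE
    for p in probs:
        exact *= p
        fixed = mul_fixed(fixed, int(p * SCALE))

    print(exact, fixed / SCALE)  # agree to roughly four decimal places
    ```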

  13. DavidMcDaniel

    Mr

    And since, AFAIK, the fuzzy logic fad faded into obscurity, why do we think this will fare any better?

  14. John Smith 19 Gold badge
    Boffin

    Yes, it does sound like processing in the linear region.

    Analogue computers using up to 500 op amp modules were used in the 1960s for things like flight control simulators (new aircraft handling characteristics *before* you've built your new aircraft) and modelling rocket engine start-up transients. Their dynamic range IIRC was around a few mV to +/- 10V, so *roughly* 14 bits of linearity.
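
    That figure checks out, near enough, assuming roughly 1 mV of usable resolution over the +/-10V swing:

    ```python
    # Sanity check on '~14 bits of linearity': ~1 mV resolution over a 20 V
    # span (+/-10 V) gives about 20,000 distinguishable steps.
    import math
    print(math.log2(20.0 / 0.001))  # ~14.3 bits
    ```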

  15. Anonymous Coward
    Headmaster

    Pedantry

    1. Fuzzy logic and probability are different things. Fuzzy logic measures the degree to which something is representative, e.g. the degree to which you are tall. Probability measures the likelihood that something is true, e.g. whether it is currently raining. The difference is largely semantic, but both methods can coexist in the same system (a toy sketch of the distinction follows below).

    2. 1,000 trillion is 1 quadrillion.

    3. This appears only to be capable of processing probability mass functions, not the more complex probability density functions. Probability mass functions require a discrete domain, e.g. die rolls and binary results. Probability density functions operate on continuous domains and are thus better suited to complex real-world problems. As an aside, artificial neural networks can be configured to work in either situation; real neural networks are discrete, however.
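
    The toy sketch promised in point 1, with the membership curve and rain model entirely invented:

    ```python
    # Point 1 in code: a fuzzy membership function scores the DEGREE to which a
    # description applies; a probability scores the chance that something is
    # true at all. Both live on 0..1, but they answer different questions.
    # The curve and the rain model below are invented for illustration.
    def tallness(height_cm):
        """Fuzzy membership: how 'tall' someone is."""
        return min(1.0, max(0.0, (height_cm - 160) / 40))  # 160 cm -> 0, 200 cm -> 1

    def chance_of_rain(cloud_cover):
        """Probability: likelihood it will actually rain."""
        return min(1.0, 0.9 * cloud_cover)

    print(tallness(185))        # 0.625 -- partly 'tall', no uncertainty involved
    print(chance_of_rain(0.5))  # 0.45  -- it genuinely might or might not rain
    ```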

  16. Trevor 3
    Go

    All we need to do

    Is to use our certainty computers to work out how to build a probability processor; then we can work out how probable any outcome is and move everyone's underwear 3 feet from where the owner is. This will make certain physicists popular at parties.

    Then, with a fresh cup of really hot tea, we can work out how improbable an infinite improbability drive is, and kaboom!

    I would start by feeding in the Islington area's phonebook, though.

    And beware the mice.

  17. Anonymous Coward
    Boffin

    What this is for...

    Is stuff like image recognition and robotics. If you want to know more, check out Sebastian Thrun - "Probabilistic Robotics". His team won the DARPA Grand Challenge, but as I recall those cars were filled with a shed-load of computing power. Far better to throw in a laptop with a probability co-processor.

    http://robots.stanford.edu/
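
    For a flavour of the workload, the bread-and-butter step in that book is a Bayes-filter measurement update along these lines (toy numbers, invented sensor model):

    ```python
    # Discrete Bayes-filter measurement update: a robot holds a belief over
    # three cells, a door sensor fires, beliefs are multiplied by the sensor
    # model and renormalised. All numbers are invented for illustration.
    belief = [0.5, 0.3, 0.2]               # prior over cells 0, 1, 2
    p_sense_given_cell = [0.9, 0.2, 0.2]   # P(sensor fires | each cell)

    unnorm = [b * p for b, p in zip(belief, p_sense_given_cell)]
    total = sum(unnorm)
    belief = [u / total for u in unnorm]

    print([round(b, 3) for b in belief])   # [0.818, 0.109, 0.073]
    ```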

  18. someone up north
    Linux

    this does not compute

    Mr Spock on a Chip ?

    This does not compute: Spock is all about logical deduction,

    whereas the research is about randomness + logic.

  19. John Smith 19 Gold badge
    Joke

    Fifty patents issued. Effectively a whole *new* technology branch

    That is illogical, Captain.

  20. Michael Wojcik Silver badge

    Reg commentards^Wgeniuses strike again!

    Ah, if only the members of Vigoda's PhD committee, and everyone who participated in his dissertation defense, were as knowledgeable and clever as Reg readers. They'd have spotted his feeble reinvention of analog computing / fuzzy logic / &c right off the bat.

    I assume, of course, that those dissing the GP5 chip here have read Vigoda's dissertation and verified that he has not in fact contributed anything new. Or could it just possibly be that they are full of crap?

This topic is closed for new posts.