Google Brain king slashes cost of AI gear

The architect behind some of Google's mammoth machine-learning systems has figured out a way to dramatically reduce the cost of the infrastructure needed for weak artificial intelligence programs, making it easier for cash-strapped companies and institutions to perform research into this crucial area of technology. In a paper …

COMMENTS

This topic is closed for new posts.
  1. btrower

    Portable source code

    Portable source code or GTFO.

  2. Don Jefe
    Alert

    Face

    WTF! I hope this thing doesn't think all humans look like Clawhammer Jack up there in the image. If it thinks all humans look like a wet rape clown then its first decision will be that we've all got to go.

  3. Trevor_Pott Gold badge
    Mushroom

    Thou shalt not make a machine in the likeness of a man's mind.

    It is nearly time for the Butlerian jihad.

    1. Anonymous Coward
      Coat

      ...in the likeness of a _man's_ mind

      It isn't, until it has been trained to recognise cleavage ...

    2. Destroy All Monsters Silver badge
      Trollface

      The Butlerian Jihad has been postponed as designers fight over how to design the icon to activate said Jihad on your smartphone.

  4. Anonymous Coward

    So... it fails to recognise slightly more than one in four cat videos. Humans are still better than AIs at wasting time on the internet.

  5. dave_sullivan
    Terminator

    Google's not the only company building deep neural networks...

    I built a PaaS that does exactly this (www.ersatz1.com). There have been a lot of advances in neural nets, starting in 2006 (deep learning is now 7 years old!).

    But yeah, Google seems to be focused on building bigger and bigger networks, while the latest research actually suggests that "bigger isn't necessarily better" (see: http://arxiv.org/pdf/1301.3583v4.pdf). Our system takes the approach of training a bunch of smaller networks (although still huge compared to what people were doing just a few years ago) and stringing them together as modules, more like how our brains actually work (rough sketch of the idea at the end of this post).

    Anyway, don't worry, this stuff is a looooong way from Skynet... Expect to see a lot more growth in this industry; Google is not the only game in town, although they do have by far the best PR team ;-)
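    For the curious, here's a minimal toy sketch of the "modules" idea. It is not our actual Ersatz code, just plain numpy with made-up layer sizes and toy XOR data: each module is a tiny one-hidden-layer net trained on its own, and the next module is trained only on the previous module's hidden features before the two are chained.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class TinyNet:
        """One-hidden-layer net trained with plain batch gradient descent."""
        def __init__(self, n_in, n_hidden, n_out):
            self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
            self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))

        def hidden(self, X):
            # hidden-layer features -- this is what the next module consumes
            return sigmoid(X @ self.W1)

        def forward(self, X):
            return sigmoid(self.hidden(X) @ self.W2)

        def train(self, X, Y, lr=1.0, epochs=5000):
            for _ in range(epochs):
                H = self.hidden(X)
                P = sigmoid(H @ self.W2)
                # squared-error loss, backpropagated through both layers
                d_out = (P - Y) * P * (1.0 - P)
                d_hid = (d_out @ self.W2.T) * H * (1.0 - H)
                self.W2 -= lr * H.T @ d_out
                self.W1 -= lr * X.T @ d_hid

    # Toy data: XOR, the classic "needs a hidden layer" problem
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    Y = np.array([[0], [1], [1], [0]], dtype=float)

    # Module 1 is trained on the raw inputs; module 2 is a separate small net
    # trained only on module 1's hidden features; the two are then chained.
    module1 = TinyNet(2, 4, 1)
    module1.train(X, Y)

    module2 = TinyNet(4, 4, 1)
    module2.train(module1.hidden(X), Y)

    print(np.round(module2.forward(module1.hidden(X)), 2))

    Nothing like production scale, obviously, but it shows the pattern: each piece is small enough to train cheaply on its own, and then you string the pieces together.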

  6. Nicholas DeWaal

    "Anything CPU can do GPU can do better" is BS. GPU's are better for high throughput needs, and CPU's are better for low latency needs.

  7. Michael Wojcik Silver badge

    A tad oversold

    "Machine learning is an incredibly powerful tool..." Oh, it's quite credible, I assure you.

    "Google researcher describes CPU-GPU architecture for large neural nets" would have been an appropriate title for this piece. There's nothing amazing or even particularly innovative here. GPUs are not a panacea (as DeWaal points out above), and bigger neural nets aren't either (as Sullivan points out). And machine learning encompasses far more strategies than just neural nets.

