Portable source code
Portable source code or GTFO.
The architect behind some of Google's mammoth machine-learning systems has figured out a way to dramatically reduce the cost of the infrastructure needed for weak artificial intelligence programs, making it easier for cash-strapped companies and institutions to perform research into this crucial area of technology. In a paper …
I built a PaaS that does exactly this (www.ersatz1.com). There have been a lot of advances in neural nets since 2006 (deep learning is now seven years old!).
But yeah, Google seems to be focused on building bigger and bigger networks, while the latest research actually suggests that bigger isn't necessarily better (see: http://arxiv.org/pdf/1301.3583v4.pdf). Our system takes the approach of training a bunch of smaller networks (still huge compared to what people were doing just a few years ago) and stringing them together as modules, more like how our brains actually work.
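The "smaller networks strung together as modules" idea above can be sketched roughly as follows. This is a minimal, hypothetical illustration (not Ersatz's actual system): two tiny feed-forward nets, where the first module maps raw input to an intermediate representation and the second maps that representation to the final output.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_net(n_in, n_hidden, n_out):
    """One small feed-forward 'module': a single tanh hidden layer."""
    return {
        "W1": rng.standard_normal((n_in, n_hidden)) * 0.1,
        "W2": rng.standard_normal((n_hidden, n_out)) * 0.1,
    }

def forward(net, x):
    """Run one module: input -> hidden -> output, tanh activations."""
    return np.tanh(np.tanh(x @ net["W1"]) @ net["W2"])

# Two small modules strung together instead of one giant network:
feature_module = make_net(n_in=8, n_hidden=16, n_out=4)
decision_module = make_net(n_in=4, n_hidden=8, n_out=2)

x = rng.standard_normal((5, 8))  # batch of 5 inputs
y = forward(decision_module, forward(feature_module, x))
print(y.shape)  # (5, 2)
```

Each module can in principle be trained separately and swapped out, which is the main appeal over one monolithic net.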
Anyway, don't worry, this stuff is a looooong way from Skynet... Expect to see a lot more growth in this industry; Google is not the only game in town, although they do have by far the best PR team ;-)
Machine learning is an incredibly powerful tool... Oh, it's quite credible, I assure you.
"Google researcher describes CPU-GPU architecture for large neural nets" would have been an appropriate title for this piece. There's nothing amazing or even particularly innovative here. GPUs are not a panacea (as DeWaal points out above), and bigger neural nets aren't either (as Sullivan points out). And machine learning encompasses far more strategies than just neural nets.