De-Hype
The problem I have is not the article, but the generic use of the term AI across the media. Use of it immediately discredits virtually any article, because we're not even close to AI yet. Like endless power from nuclear fusion, it's this year's hype phrase, bandied about by journalists and CEOs without the faintest clue as to what it means or how far we are from it, just like Big Data, Hadoop, Prince, etc. before it. It's even on Gartner's 2017 hype curve, far over on the left-hand side under General Purpose Machine Intelligence.
Machine learning, data science and predictive analytics, usefully employed as tools alongside people to make money, cut costs and improve productivity? Yes, although all of those are still in their infancy in terms of being incorporated into deliverable systems. But AI? No. We're still in the pre-Asimov era, and nowhere close, in all but a handful of specialist, well-defined skills.
Just to correct a technical inaccuracy: you don't need C/C++ coders to take advantage of GPU cards for machine learning. Sure, you can use C++, just as you can use assembler to write your own word processor, if you have a few decades and a bottomless pit of cash. The better alternative is to use Google's TensorFlow or Theano, both high-level, optimised, broad-functionality machine learning libraries with a variety of models for different tasks. These libraries talk to GPU cards via their CUDA drivers, removing the need for C++ people to write tedious, unproductive low-level code. The hard work has already been done. Python and TensorFlow (with perhaps Keras sitting between them as a high-level framework) is mainly where it's at, both commercially and academically, with R and Theano also up there. But not C++, unless you want to hand-craft or optimise the low-level code of your own ML libraries and credibly believe your developers are better at both C++ and machine learning than Google's or MILA's engineers.
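To make that concrete, here's a minimal sketch (assuming a recent TensorFlow 2.x install with the CUDA libraries in place; the dataset is random stand-in data, purely for illustration) of defining and training a model entirely from Python. If a CUDA-capable GPU is visible, TensorFlow dispatches the heavy maths to it without a line of C++ being written:

    # Minimal sketch, assuming TensorFlow 2.x and its bundled Keras API;
    # the data below is random stand-in data, purely for illustration.
    import numpy as np
    import tensorflow as tf
    from tensorflow import keras

    # Report whether TensorFlow can see a CUDA-capable GPU.
    print("GPUs visible:", tf.config.list_physical_devices("GPU"))

    # Stand-in dataset: 1000 samples, 20 features, binary labels.
    x_train = np.random.rand(1000, 20).astype("float32")
    y_train = np.random.randint(0, 2, size=(1000,)).astype("float32")

    # A small classifier, defined entirely in Python via Keras.
    model = keras.Sequential([
        keras.Input(shape=(20,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Training runs on the GPU if one was detected above, otherwise on the CPU;
    # the CUDA kernels belong to the library, not to our code.
    model.fit(x_train, y_train, epochs=3, batch_size=32)

The point is that the GPU plumbing lives in the library's own pre-compiled kernels; the Python is just orchestration.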