Re: "my dad invented expert systems".
The thing is that processing power is a red herring. As the author of this article said, they once thought you'd need hundreds of megahertz worth of performance to create an AI.
We can now buy three-gigahertz processors (with eight separate cores) for a few hundred quid, and beyond that you can hire absurdly massive amounts of processing power via Amazon Web Services, beyond the wildest dreams of somebody using computers 20 years ago. Lest we forget, that was when the state of the art was the newly announced (and probably unavailable) 233 MHz Pentium MMX.
Simply put, sheer processing power is not the problem. The problem is that nobody has *any* better idea of how to go about programming a general-purpose AI than they had 20 years ago.
Sure, we can produce little modules that solve simple, well-defined problems like playing chess, or do statistical work like heuristic analysis or Bayesian inference that performs very well on the specific problems it is set. But that is not (IMO) an actual AI as defined by anybody other than some marketing wonk somewhere. These systems aren't self-aware or capable of defining problems for themselves, and we are no closer to producing a real AI today than we were 20 or 30 years ago. I see no reason this won't still be the case in 20 or 30 years' time.
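To be clear about what I mean by these narrow statistical modules: the whole "intelligence" of a Bayesian filter is a few lines of probability arithmetic applied to one fixed question. A minimal sketch (the spam-filter framing and all the numbers here are made-up illustrations, not from any real system):

```python
# A toy Bayesian inference step: given that an email contains the word
# "prize", how likely is it to be spam? All probabilities below are
# invented for illustration.

def bayes_update(prior, likelihood, evidence):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence

# Assumed toy numbers: 20% of mail is spam; "prize" appears in 40% of
# spam and 1% of legitimate mail.
p_spam = 0.20
p_word_given_spam = 0.40
p_word_given_ham = 0.01

# Law of total probability: chance of seeing the word at all.
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

posterior = bayes_update(p_spam, p_word_given_spam, p_word)
print(round(posterior, 3))  # -> 0.909
```

Useful, certainly, but the machine has no idea what spam is; it just grinds one formula on one question a human chose to ask it. That gap is the point.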
Which is fortunate, or unfortunate, since Stephen Hawking is probably right that a truly intelligent AI could/would be dangerous. (To anybody who can't figure out how to go and trip the circuit breakers in their house/office, anyway. Being in a rogue-AI-controlled, keyless car could be more exciting if the fusebox stays under the bonnet.)