Re: Oh dear.
The reason there's no PERCEIVED Moore's Law improvements is that the software really isn't taking advantage of the hardware.
With the exception of virtualization, a 6-core (12-hyperthread) Ryzen processor leaves 11 of those 'cores' idle nearly all of the time. All of those transistors NOT being used.
Occasionally you'll see something that uses them: maybe a specially written game, or an application whose author(s) know something about symmetric multi-processing and multithreaded algorithms.
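To make that concrete, here's a minimal sketch (in Python, using only the standard library) of what "knowing something about symmetric multi-processing" looks like in practice: the same CPU-bound work done serially on one core, then fanned out across all available cores with a process pool. The prime-counting workload is just an illustrative stand-in, not anyone's real application.

```python
# Sketch: spreading CPU-bound work across all cores vs. one core.
# The workload (naive prime counting) is a hypothetical stand-in.
from concurrent.futures import ProcessPoolExecutor
import os

def count_primes(limit):
    """Deliberately naive CPU-bound work: count primes below `limit`."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # One chunk of work per logical core.
    chunks = [20_000] * (os.cpu_count() or 4)

    # Serial version: one core grinds through everything.
    serial = [count_primes(c) for c in chunks]

    # Parallel version: one worker process per core, so the work
    # actually lands on all those otherwise-idle transistors.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(count_primes, chunks))

    assert serial == parallel  # same answers, just computed in parallel
    print(f"processed {len(chunks)} chunks across {os.cpu_count()} logical cores")
```

On a 12-thread Ryzen the parallel version finishes several times faster than the serial loop, which is exactly the kind of improvement most shipped software never bothers to collect.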
"my Ph.D. 25+ years ago involved these neural nets implemented on chip,"
Yes, this puts you in a unique position to see why the article is relevant, that's for sure. But seriously, when will this translate into the user perception of "faster"?
With the exception of natural language speech, visual scanning and object recognition, and other things that a ROBOT would need, most people aren't seeing improvements.
So to most of the world, Moore's Law is dead, but only because of perception.
And the biggest reason for that is SOFTWARE, not hardware. After all, hardware has gotten 'wider', not LINEARLY faster. And WAY too many people who call themselves "engineers" still insist on thinking in a straight line. Well, maybe that's just PROJECT MANAGEMENT doing that; engineers are creative and of course think non-linearly. But you have to be able to turn that non-linear thinking into a program... and I haven't seen much evidence of that happening effectively enough to give the user the perception of "faster".
(Clogging everything with bloat and feature creep, and flattening the UI into 2D FLAT, hasn't helped at all, but it makes SOME engineers *FEEL* like they "did something" to "improve it".)