Personal opinion here, on a few things.
Windows 8: People do not like change. I am a prime example. When they changed to the ribbon in Office 2007, I tried it for a few months, didn't like it, and migrated to OpenOffice. As Microsoft moved more of its products onto the ribbon, I migrated more of my software to alternatives. Now the only Microsoft product I have is Windows, kept around so that I can play games on Steam, and even that isn't strictly necessary these days.
Because of Microsoft's forced "we've made a new OS, every computer must have it" approach, people who don't want Windows 8 simply aren't going to buy a new PC.
Then there's the business aspect. On the one hand, companies are probably shelling out a lot of money right now upgrading to Windows 7 ahead of the XP cutoff. At the same time, very few companies, if any, are touching Windows 8.
And finally the tablet/phone argument. I mostly agree here as well. For a lot of people who just surf the web and visit Facebook, a PC isn't needed anymore. That's a large chunk of the market that no longer exists.
On to the hardware arguments.
CPU: The last big "here's the new CPU" launch, in my eyes, was Sandy Bridge in 2011. Anything much newer is a waste unless you want bragging rights. Take gaming as a pure example of this: how many games these days require a Sandy Bridge quad-core CPU? Not many; hell, even the more demanding games I see still list a CPU roughly equivalent to a Q6600 as their recommended spec (or higher). The CPU age is dying, if not dead. The only people buying top-of-the-range computers these days are after bragging rights, or their old PC died and they don't want to fall too far behind the next generation.
GPU: Again, there's less reason to upgrade. My GTX 260 can still play the vast majority of games with everything set to high and get a respectable framerate. Can it push the framerate up to the 100 range? No. Does it need to? No. Anything above 60 fps is, once more, bragging rights with no real benefit in the real world (unless you have a monitor with a 100 Hz refresh rate).
Yes, GPUs and CPUs are getting far more powerful, but nothing is making use of them. The CPU is limited by the fact that multithreading is still an alien art, known by few and mastered by fewer. Until somebody invents a language that makes multithreading easy to understand, track, control and debug, it's going to remain that way. The same goes for the GPU, but for different reasons. Video games already cost millions, and a large chunk of that is the cinematics and graphics. Until studios find a way to lower the cost of making high quality graphics, they're going to stay at the current level; anything higher has no benefit to the game companies on a cost/quality scale.
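To illustrate the multithreading point, here's a minimal C++ sketch (my own toy example, not from any particular game or engine) of the kind of bug that makes threading hard to track: two threads increment a shared counter, and without synchronisation the result comes out silently wrong.

```cpp
// Toy example: a classic data race, and the one-line fix.
// Compile with: g++ -std=c++17 -pthread race.cpp
#include <atomic>
#include <iostream>
#include <thread>

int unsafe_counter = 0;              // plain int: concurrent ++ is a data race (undefined behaviour)
std::atomic<int> safe_counter{0};    // atomic int: concurrent ++ is well defined

void work() {
    for (int i = 0; i < 1'000'000; ++i) {
        ++unsafe_counter;   // read-modify-write, not atomic -> updates can be lost
        ++safe_counter;     // atomic read-modify-write -> always counted
    }
}

int main() {
    std::thread a(work), b(work);
    a.join();
    b.join();
    // unsafe_counter usually falls short of 2,000,000; safe_counter never does.
    std::cout << "unsafe: " << unsafe_counter
              << "  safe: " << safe_counter << '\n';
}
```

The broken version compiles cleanly and can look fine in testing, which is exactly why this stuff stays an alien art.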
I honestly believe the next graphical leap will only come when GPUs get to a stage where they're so powerful they can render almost anything in real time. Why do I say this? It's relatively easy to make high quality graphics; the main cost is cutting them down. As an example, say you can have 1,000,000 vertices per rendered scene. The artist makes the main character concept and that alone comes to 1,500,000, so they have to spend time refining the model, removing vertices and faces while keeping as much quality as they can. The initial model may take a month to make, but getting the vertex count down while keeping the quality high may take another month or two.
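To make that reduction step concrete, here's a minimal sketch (mine, not any real tool's code) of about the crudest possible way to "remove vertices while keeping the shape": snap vertices onto a grid and merge the ones that land in the same cell. Real pipelines use far better techniques (manual retopology, quadric-error decimation), which is exactly where the extra month or two goes.

```cpp
// Naive vertex-clustering decimation: merge vertices that fall in the same
// grid cell, then drop triangles that collapse. Bigger cell_size -> fewer
// vertices, but visibly less of the original shape survives.
#include <array>
#include <cmath>
#include <cstdint>
#include <map>
#include <vector>

struct Vec3 { float x, y, z; };
using Triangle = std::array<std::uint32_t, 3>;

struct Mesh {
    std::vector<Vec3> vertices;
    std::vector<Triangle> triangles;
};

Mesh decimate(const Mesh& in, float cell_size) {
    std::map<std::array<long, 3>, std::uint32_t> cell_to_new_index;
    std::vector<std::uint32_t> remap(in.vertices.size());
    Mesh out;

    // Assign every original vertex to a grid cell; one vertex represents each cell.
    for (std::size_t i = 0; i < in.vertices.size(); ++i) {
        const Vec3& v = in.vertices[i];
        std::array<long, 3> cell = {
            std::lround(std::floor(v.x / cell_size)),
            std::lround(std::floor(v.y / cell_size)),
            std::lround(std::floor(v.z / cell_size))};
        auto it = cell_to_new_index.find(cell);
        if (it == cell_to_new_index.end()) {
            it = cell_to_new_index.emplace(cell, static_cast<std::uint32_t>(out.vertices.size())).first;
            out.vertices.push_back(v);  // first vertex in the cell stands in for all of them
        }
        remap[i] = it->second;
    }

    // Rebuild triangles with the merged vertices; discard ones that degenerate.
    for (const Triangle& t : in.triangles) {
        Triangle nt = {remap[t[0]], remap[t[1]], remap[t[2]]};
        if (nt[0] != nt[1] && nt[1] != nt[2] && nt[0] != nt[2])
            out.triangles.push_back(nt);
    }
    return out;
}
```

Run that with a bigger and bigger cell size and you get the trade-off the artist is juggling by hand: fewer vertices, but less and less of the concept left intact.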
I think that's part of the reason mobile games are doing so well. They don't carry the expectation of AAA-quality graphics, so they don't go in with the intention of lifelike realism within a limited number of vertices. Instead they go in with the intention of making a fun game. It takes months rather than years, is aimed at gameplay rather than graphics, and costs $10,000 rather than $10,000,000.