Re: The death of CISC?
I have been programming ARM for a very long time. I have used a couple of Apple computers: one had a 6502 CPU, the other a 68000. I have never done any ARM-targeted development on an Apple computer. It used to be test and debug on x86, then use a cross-compiler. Later, ARM chips became fast enough to host a compiler directly, so I could use the x86 machine as a terminal and ssh into the ARM system. Lack of an ARM CPU on the host system was not a barrier before Apple eventually switched to ARM; when that happened, it was a complete non-event for me as an ARM programmer.
The first barrier to ARM growing up was that most people wanted to run legacy DOS/Windows binaries. Although that is still a significant requirement for some, a larger market opened up: people who wanted to run software for which they had the source code. With the source, they could compile it for whatever architecture was most cost-effective. The next barrier was boot code: ARM has a long history of requiring a boot loader created specially for every microcontroller and the board it is soldered onto. To some extent it is now possible to have standardised boot firmware (a BIOS equivalent, such as UEFI on ARM servers) that works with more than one specific variant of ARM chip, so an ARM motherboard could in theory be upgraded by replacing the CPU.
What is hurting x86 now is the thing that made x86 dominant decades ago: price. Intel used to be the cheap and nasty CPU. There were much better architectures around decades ago, but you paid for that clean, efficient design because it performed so much better than Intel's. As a result, Intel got used whenever price mattered, and so Intel got the big (cheap) market. Their R&D costs were divided by a much bigger market, so they had the budget to improve. The others then had their R&D costs divided by a shrinking market, and their designs stagnated.
Intel became an effective monopoly and priced to match. They created a vicious competitor for their own low-end chips: themselves. Any cheap CPU sold took away an opportunity to sell a high-margin CPU. Any factory time spent making cheap CPUs was factory time not spent making high-margin CPUs. This left the bottom end of the market to others. It has taken decades, but the high volume of cheap ARM CPUs funded development of slightly better ARM CPUs. This repeated until ARM chips became fast enough for more and more demanding tasks.
The performance/£ and performance-per-watt advantage over Intel grew so large that Intel's biggest customers decided to solve ARM's remaining problems for themselves. Apple have solved ARM's problems for Apple. Google, AWS and some others took a more community-based approach. Apple is happy to use BSD-licensed software but much prefers to create its own code rather than share its work, as would be required if it shipped GPL software. The others can work with or around the GPL: if the software never leaves their servers, they have not distributed it, so they can keep their improvements to the source code to themselves.
I would say that ARM reaching servers has had more to do with everyone but Apple deciding to work, to some extent, with each other.