Re: "Intel dumped full 16 bit support from x86"
Actually, AMD dropped Virtual 8086 mode first, in 64-bit mode, because AMD doesn't know how to design an advanced chip.
What are you on about? It was dropped in 64-bit (long) mode because that was an opportunity to shed legacy cruft. Virtual 8086 mode was, and still is, available in 32-bit mode. The silicon is still there.
Also, what do you mean, AMD doesn't know how to design an advanced chip? The architect of AMD's 64-bit work was one Jim Keller, whose previous work included the DEC Alpha. You know, that 64-bit chip that was among the most powerful of its day?
It also dropped the segmentation model, which will need to return if we want really secure applications at the hardware level, not address spaces where everything is writable and executable.
Unless you set the 'not writable' or 'no execute' (NX) bits in the page tables, both of which are enforced in hardware.
But AMD can see only "performance, performance!" (and Intel followed, with the Spectre/Meltdown bugs), and so it removes every feature that would truly isolate applications and the kernel.
Eh? Intel's first speculative, out-of-order CPU with on-chip cache was the Pentium Pro, in the mid-90s. Spectre/Meltdown was a side effect of that design decision that took YEARS to be discovered, so please don't pretend it's an obvious issue that should have been spotted at the design stage; it wasn't. It also had nothing to do with AMD. (AMD was still making mediocre 486 clones when the PPro was designed.)