Re: Doesn't look like a vulnerability
>AMD 64 had unencrypted microcode updates?
>What a mess.
- No mess has ever resulted from such unencrypted updates.
- "Security" through obscurity has never worked and will never work.
- Encrypting the updates stops the good guys (who usually give up when faced with encryption whose key they cannot acquire) from finding vulnerabilities and telling you about them.
- The bad guys have far more funds and motivation to do malicious things, and therefore won't hesitate to decap chips to extract the encryption keys, reverse engineer the microcode to find vulnerabilities, and then use them without telling you.
>supporting such a feature would just be too expensive.
Is documenting the update function and instruction set really that expensive?
You'd probably even save a bit of money, since you wouldn't need to fix many design mistakes yourself - someone else would fix them for you (despite the massively increased difficulty of working without the hardware design to reference).
The handcuffs could even be implemented as usual, plus a header, trace, or pair of pins on the chipset that disables the signature check (pull the header, cut the trace, or bridge the pins to disable it) - see the sketch below.
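A minimal sketch of what that escape hatch could look like in boot firmware, assuming a hypothetical strap register, bit position, and verify_signature() routine (none of these names are real vendor interfaces):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hedged sketch of "handcuffs with an owner-controlled escape hatch":
 * the boot ROM samples a hardware strap (header/trace/pins) and only
 * enforces the vendor signature check when the strap is in its default
 * state. STRAP_STATUS_REG, STRAP_UCODE_UNLOCK and verify_signature()
 * are hypothetical placeholders, not real registers or APIs. */

#define STRAP_STATUS_REG   ((volatile uint32_t *)0xFED80000u)  /* hypothetical MMIO */
#define STRAP_UCODE_UNLOCK (1u << 3)                           /* hypothetical bit */

bool verify_signature(const void *image, size_t len);  /* vendor-supplied check */

bool microcode_update_allowed(const void *image, size_t len)
{
    /* Owner pulled the header, cut the trace, or bridged the pins:
     * skip the vendor signature check and accept the owner's update. */
    if (*STRAP_STATUS_REG & STRAP_UCODE_UNLOCK)
        return true;

    /* Default state: behave exactly as today and require a valid signature. */
    return verify_signature(image, len);
}
```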
It appears the real reason is that AMD doesn't want people to fix bugs and keep using such perfectly fine CPUs - AMD wants them thrown out and replaced with new ones to get the new "security fixes", preferably every year, or at least every few years once microcode updates are no longer provided.
Why doesn't AMD run an experiment on some old, "long obsolete" CPU: document the microcode update feature and instruction set, publish the source code of one update under a free software license (minus the part that adds a backdoor, of course), and see whether a massive cracking spree actually happens?
>what you really, really don't want is an attacker gaining access at that level. Really, really, really.
If an attacker already has ring 0 access and can load microcode, you were compromised long before the microcode ever came into it.
Microcode updates are actually among the least risky things, as they have no power-off persistence - as long as you're sure the SPI flash chip is free of proprietary malware and no other hardware can update the microcode, you're looking pretty good.
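To make the "ring 0" and "no persistence" points concrete, here is a hedged sketch of how a kernel hands an AMD CPU a microcode patch: a privileged WRMSR with the patch's address, with the result readable back from the patch-level MSR. The MSR numbers are the ones the Linux AMD microcode driver uses; the scaffolding around them is illustrative only, not a real driver. The loaded patch sits in on-die RAM and vanishes at the next power cycle.

```c
#include <stdint.h>

/* MSR numbers as used by the Linux amd microcode loader. */
#define MSR_AMD64_PATCH_LOADER  0xc0010020u  /* write: linear address of patch */
#define MSR_AMD64_PATCH_LEVEL   0x0000008bu  /* read: currently applied patch level */

/* WRMSR/RDMSR are privileged instructions - executing them at all
 * already requires ring 0, which is the point being made above. */
static inline void wrmsr(uint32_t msr, uint64_t val)
{
    __asm__ volatile("wrmsr" :: "c"(msr), "a"((uint32_t)val), "d"((uint32_t)(val >> 32)));
}

static inline uint64_t rdmsr(uint32_t msr)
{
    uint32_t lo, hi;
    __asm__ volatile("rdmsr" : "=a"(lo), "=d"(hi) : "c"(msr));
    return ((uint64_t)hi << 32) | lo;
}

/* Hand the CPU a patch image and return the patch level it reports afterwards.
 * Illustrative sketch: a real loader also matches the patch against the CPU's
 * family/model/stepping before attempting the load. */
static uint64_t load_microcode_patch(const void *patch)
{
    wrmsr(MSR_AMD64_PATCH_LOADER, (uint64_t)(uintptr_t)patch);
    return rdmsr(MSR_AMD64_PATCH_LEVEL);
}
```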