
Revocation, that's what you need...
If you can't revoke a credential, then you do not have any security.
It's very simple.
Would you buy a house if you had to knock it down to change the locks?
Intel is investigating reports that BootGuard private keys, used to protect PCs from hidden malware, were leaked when data belonging to Micro-Star International (MSI) was stolen and dumped online. It's understood the private keys were generated by MSI to use with Intel's BootGuard technology, and were among internal source …
Downvoted because although the sentiment is true, the details aren’t. The baked-in thing is a public key (so it’s not a secret). The problem is the lack of revocation and key update mechanism. There are at least a couple of ways to do this, *even in e-fused hardware*, so if those weren’t considered and provided for by Intel, that’s a bad, bad mistake.
The obvious way is to store (at least one) backup public key on the hardware. When (not if) a breach like this occurs, you push out a firmware update, signed by the existing key on a last-time-use basis, that invalidates and swaps to back-up key and blows a further efuse to indicate this is done. The backup key is generated and stored separately, preferably in escrow. There is no need at all for the primary organisation to have knowledge of the backup secret, so this is secure against organisational breach.
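The rotation scheme described above can be sketched as a toy model. This is illustrative only, not Intel's actual design: a simple hash stands in for real asymmetric signature verification, and the class/function names are made up for the sketch. The key idea it demonstrates is that fuses only go one way, so revocation is a one-time, irreversible switch to the escrowed backup slot.

```python
import hashlib

def toy_sign(key: bytes, payload: bytes) -> bytes:
    # Toy stand-in for a real asymmetric signature (never do this in practice).
    return hashlib.sha256(key + payload).digest()

class BootRom:
    """Models e-fused key slots: a fuse can only go from intact to blown."""
    def __init__(self, primary_key: bytes, backup_key: bytes):
        self.keys = [primary_key, backup_key]  # baked in at manufacture
        self.revoked = [False, False]          # one "revocation fuse" per slot

    def active_key(self) -> bytes:
        for key, blown in zip(self.keys, self.revoked):
            if not blown:
                return key
        raise RuntimeError("all key slots revoked; device is bricked")

    def verify(self, payload: bytes, signature: bytes) -> bool:
        return toy_sign(self.active_key(), payload) == signature

    def rotate(self, payload: bytes, signature: bytes) -> None:
        # Last-time use of the current key: an update signed by the
        # soon-to-be-revoked key blows that slot's fuse, which makes the
        # backup slot the active one from then on.
        if not self.verify(payload, signature):
            raise ValueError("rotation update not signed by the active key")
        self.revoked[self.keys.index(self.active_key())] = True

# Usage: after a leak, push the fuse-blowing update signed by the old key.
rom = BootRom(b"primary", b"backup-in-escrow")
update = b"fw v2 + revoke primary"
rom.rotate(update, toy_sign(b"primary", update))
assert rom.active_key() == b"backup-in-escrow"
```

Note the property the comment calls out: the organisation holding the primary key never needs to see the backup secret, only its baked-in public half.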
My first version of this post contained my proposal for how we would have to design PCs (desktop, laptop, embedded, etc.) to be BIOS-secure, but I realized the flaw in my proposal.
The flaw is that no matter how things are physically designed and implemented, because there are so many computers, the value (economic and/or ideological) of Manufacturer X's BIOS code, or of their private signing keys, is so great that government intelligence agencies, organized crime, and terrorist groups will invest metric ass-loads of time and money obtaining them.
Whether those codes are kept in a safe with five combination locks, in the CEO's brain, in the brains of their five top engineers, or wherever, the amount of resources brought to bear means that eventually they will be obtained by third parties.
Yeah, and the highlight of the top post here rightly points out that ideally no part of the chain would rely on non-replaceable, non-revocable keys.
Going by their example, I would still buy a house with unchangeable locks given absolutely no other choice, but given any sane choice I'd pick the one that doesn't require chucking hardware in the skip whenever the inevitable screw-ups land in our lap.
And as you rightly point out, they ARE inevitable. Either someone will break out the sandpaper and extract the keys, or a manufacturer will screw the pooch and leak them.
As painful as it is for Intel and their shareholders, until this effectively triggers a manufacturer recall, nothing will change. Intel was a prime offender during the speculative execution issues revealed in the wake of Spectre et al. Because they were not forced to address permanently vulnerable systems on their end, it created a new permanent problem in the industry.
If Intel screws up badly enough, you have to buy new server hardware. This generates revenue for them. As a result, they slow-walked hardware fixes for many of these problems and under-invested in mitigations. They knowingly sold vulnerable hardware for years, including hardware where full mitigations came at a punitive performance cost that erased any performance gains on the new architecture. Now the keys to the kingdom are loose, and unless pressed, even new hardware will still be at risk from the same kind of third-party leak.
The private key should have been stored on an enterprise-grade Hardware Security Module in a non-exportable fashion with no way to steal it without physical access, and even then no way to use it without knowledge of a valid authentication credential. Attackers can still sign things in a remote compromise scenario but what they can’t do is get a copy of the key itself, even if they have completely compromised the signing server.
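That property, a key that is usable but never readable, can be illustrated with a toy model. A real HSM enforces non-exportability in hardware, which Python obviously cannot; the class and method names here are invented for the sketch, and HMAC stands in for the actual signing primitive.

```python
import hashlib
import hmac
import secrets

class ToyHsm:
    """Toy model of a non-exportable signing key: the secret is generated
    inside the 'device' and the API deliberately offers no read-out path."""

    def __init__(self, operator_pin: str):
        self._key = secrets.token_bytes(32)  # never leaves this object
        self._pin_hash = hashlib.sha256(operator_pin.encode()).digest()

    def sign(self, pin: str, payload: bytes) -> bytes:
        # Even with access to the device, use requires a valid credential.
        supplied = hashlib.sha256(pin.encode()).digest()
        if not hmac.compare_digest(supplied, self._pin_hash):
            raise PermissionError("bad operator credential")
        return hmac.new(self._key, payload, hashlib.sha256).digest()

# Usage: a compromised signing server can still call sign() during the
# breach, but there is no API that returns the key material itself.
hsm = ToyHsm("1234")
sig = hsm.sign("1234", b"firmware image")
```

This is exactly the trade-off the comment describes: remote attackers can sign things while they hold the box, but the damage ends when they lose access, because they never obtained a copy of the key.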
If fundamental gatekeepers like component/firmware/driver manufacturers can’t get this stuff right, what hope will we ever have in getting to a point where every ISV does the right thing?
On the flip side, this does mean that Palladium will forever be made of Palisade…
Storing it on some enterprise grade whatsit would be great and all, but they could have avoided this problem by storing them on a $5 USB stick as long as it wasn't plugged into anything.
That tiny security step would have made them basically impregnable to any non-physical attacker.
Just keep them in a drawer, and only plug it in when you actually need to sign something. How hard can it be?
Flip side is that it may need a revocation server, either checked at system startup or at some other point by the OS. (Or worse, by code running in UEFI, or at an even lower level than ring -1.)
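Such a check could be as simple as comparing a key fingerprint against a revocation list fetched and cached by the OS or firmware. A minimal sketch, with hypothetical names and a made-up leaked key standing in for real data:

```python
import hashlib

# Hypothetical revocation list: fingerprints of keys known to be leaked,
# e.g. periodically fetched from the vendor and cached in flash so the
# check also works offline.
REVOKED_KEY_HASHES = {
    hashlib.sha256(b"leaked-vendor-key").hexdigest(),
}

def key_trusted(public_key: bytes) -> bool:
    """Refuse to trust any vendor key whose fingerprint is on the list."""
    return hashlib.sha256(public_key).hexdigest() not in REVOKED_KEY_HASHES

assert not key_trusted(b"leaked-vendor-key")
assert key_trusted(b"fresh-key")
```

The hard part is not this lookup but everything around it: who signs the list, how it reaches machines that rarely boot, and who gets to decide what lands on it, which is precisely the "corporate whim" objection below.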
Valid keys being revoked at corporate whim? Not sure this is the answer either... this whole process is just broken.
That is indeed basic key management precaution, but I don’t consider it sufficient. There *should* be a backup key, held organisationally separate (and offline) plus key revocation protocol.
The most important point is “organisationally separate”. Otherwise, all that happens is the blackmailed/greedy Head of IT says “bring me the HSM”, and that’s the end of you.
Seriously, there is something really broken in a system where the OS can't tell if the CPU is fiddling in the background...
Yes, you might want to virtualise the OS for all sorts of reasons and then of course it is not really in control, but that first booted system ought to know. I mean, actually know what is happening. The presence of code running out of the OS' sight is something really creepy, not just for boot-level malware but for trusting a PC you bought from some country that is currently at odds with your own government, etc.