Always wondered how modern CPUs really work. Just have to wait for the "Dummies guide to Intel Microcode" book to come out.
Boffins release tool to decrypt Intel microcode. Have at it, x86 giant says
Infosec boffins have released a tool to decrypt and unpack the microcode for a class of low-power Intel CPUs, opening up a way to look at how the chipmaker has implemented various security fixes and features as well as things like virtualization. Published Monday on GitHub, the Intel Microcode Decryptor is a collection of …
COMMENTS
-
Wednesday 20th July 2022 22:04 GMT david 12
Well, when I was a student, microcode design was part of what 'microprocessor design' was. We started with TTL, and designed a microcoded processor.
Anyway, this is part of the history of RISC. Researchers developed advanced compilation techniques that allowed them to optimize RISC code, and thought that RISC processors would be the wave of the future. Then Intel implemented the advanced compilation techniques in microcode, and offered it as a CISC package.
-
Thursday 21st July 2022 23:20 GMT Fifth Horseman
Quite. If you are as old as me, you probably started with a 74181 ALU and worked from there. Good enough for the first generation of VAX... I think it is something still worth doing today, assuming you can find the parts.
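[For anyone who never met the chip: the 74181 is a 4-bit ALU slice whose function is chosen by four select lines plus a mode bit. A rough Python sketch of a handful of its documented select codes - a simplified subset with naive carry handling, not a model of the real silicon:

# Sketch of a 4-bit ALU slice in the spirit of the 74181.
# Only a few of the chip's documented select codes are modelled;
# carry is simplified (active-high, no carry look-ahead).

def alu_74181(a: int, b: int, s: int, m: bool, cin: int = 0) -> tuple[int, int]:
    """Return (f, cout) for 4-bit inputs a, b with select s and mode m."""
    a &= 0xF
    b &= 0xF
    if m:  # logic mode: no carry propagation
        logic = {
            0b0000: ~a,        # NOT A
            0b0110: a ^ b,     # A XOR B
            0b1011: a & b,     # A AND B
            0b1110: a | b,     # A OR B
        }
        if s not in logic:
            raise NotImplementedError("select code not modelled")
        return logic[s] & 0xF, 0
    if s == 0b1001:            # arithmetic: A plus B
        total = a + b + cin
    elif s == 0b0110:          # arithmetic: A minus B minus 1 (plus carry-in)
        total = a + (~b & 0xF) + cin
    else:
        raise NotImplementedError("select code not modelled")
    return total & 0xF, total >> 4

# e.g. a 4-bit add: alu_74181(0b0101, 0b0011, s=0b1001, m=False) -> (0b1000, 0)
]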
Not so sure about your RISC vs CISC comparison, though. True, the Intel P6/Pentium Pro CISC architecture has roots in the i960 RISC, just as the AMD K5 architecture is derived from the Am29050, but the arguments are much older - dating back to an analysis of IBM 1401 code, I believe - and are probably more about philosophy and ideology than implementation techniques.
-
Monday 25th July 2022 19:41 GMT ITMA
"Microcode isn't really part of the how the CPU operates. Its just code that the processor runs to do complex actions."
You mean like twiddling the bits necessary to operate all of the CPU's lowest-level core functions, so that it can execute the CPU's "official" instruction set instead of needing hard-coded logic to do it all?
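[Which is the crux of it: each "official" opcode selects a microprogram, a sequence of control words whose bits are exactly those signals. A toy Python sketch - every opcode, signal name and micro-step here is invented for illustration:

# Toy illustration: a macro-opcode is just an index into a microcode ROM;
# each micro-instruction is a word of control signals that the sequencer
# asserts on successive clock ticks. All names are invented.

MICROCODE_ROM = {
    "ADD_REG": [
        {"alu_src_a", "alu_src_b"},      # route both registers to the ALU
        {"alu_op_add", "latch_result"},  # add, capture the result
        {"writeback", "pc_increment"},   # store it, advance the PC
    ],
    "LOAD_MEM": [
        {"addr_from_reg", "mem_read"},   # drive the address bus, read
        {"latch_mdr"},                   # capture the memory data
        {"writeback", "pc_increment"},
    ],
}

def execute(opcode: str) -> None:
    """Step through the microprogram for one macro-instruction."""
    for tick, control_word in enumerate(MICROCODE_ROM[opcode]):
        print(f"t{tick}: assert {sorted(control_word)}")

execute("ADD_REG")
]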
-
Thursday 21st July 2022 15:53 GMT Michael Wojcik
Re: Microcode Security
It's a fairly narrow branch of the attack tree. The attacker would still have to get the new signed microcode onto the machine and get it loaded into the CPU. That requires privileged access, and if you already have that, the additional advantage of putting it in microcode is slight for most use cases. It's a nice way to get a really hardy APT into a system, but for the vast majority of cases a regular rootkit would be sufficient.
-
Sunday 24th July 2022 15:01 GMT Clausewitz4.0
Re: Microcode Security
QUOTE: for the vast majority of cases a regular rootkit would be sufficient
I agree on that.
But a microcode rootkit would be the most undetectable piece of nasty code, and could be activated remotely without triggering any alarm bells - actually, you could even submit a sample to any of the famous sandboxes and they would flag nothing malicious at all.
Probably the reason China and Russia insist on using home-made silicon for mil/intel/sensitive stuff.
-
Sunday 24th July 2022 18:43 GMT John Smith 19
What happened to the good old days
Well.....
Microcode was first discussed by Maurice Wilkes in 1951* as a way of simplifying computer design.
AFAIK only the 6502 (of that generation's processors) was directly coded, which may explain why its 16-bit successor, used in the Apple IIGS, was a PITA to design and took so long to get to market.
Just because a processor used only a few instruction formats doesn't necessarily mean it's hard-coded. The Transputer's byte-length encoding is about as simple as possible, but it was actually implemented on top of an even simpler micro-machine.
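[For the curious: every Transputer instruction is one byte - four bits of function code, four bits of operand - with pfix/nfix accumulating larger operands a nibble at a time in the operand register. A quick Python sketch of the decode loop (simplified, though the function codes shown are the published ones):

# High nibble = function code, low nibble = operand data.
# pfix/nfix build operands longer than 4 bits. Simplified sketch.

PFIX, NFIX = 0x2, 0x6  # published prefix / negative-prefix function codes

def decode(instruction_bytes):
    """Yield (function_code, full_operand) pairs from a byte stream."""
    oreg = 0  # operand register
    for byte in instruction_bytes:
        func, data = byte >> 4, byte & 0xF
        oreg |= data
        if func == PFIX:
            oreg <<= 4
        elif func == NFIX:
            oreg = (~oreg << 4) & 0xFFFFFFFF
        else:
            yield func, oreg
            oreg = 0

# e.g. pfix 1 then ldc 2 (ldc = function code 4): operand becomes 0x12
print(list(decode([0x21, 0x42])))  # [(4, 18)]
]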
TBH it's all about the tools you have available. ARM was done by generating logic signals with PLAs. Put in enough PLA entries to cover every possible input combination (or "address" if you like), start consistently labelling the output bit patterns (call them "micro-instructions"), and hey presto, it's become a microcode ROM. It's also likely to be a lot bigger.
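[The equivalence is easy to see in miniature - a minimal Python sketch, with the product terms and signal names invented for illustration:

# A PLA is a sparse truth table from inputs to output signals; tabulate
# every input combination and you have, in effect, a microcode ROM.

from itertools import product

PLA_TERMS = {
    # product term (None = don't-care bit): asserted output signals
    (1, 0, None): {"enable_alu"},
    (None, 1, 1): {"mem_write", "halt"},
}

def pla_lookup(inputs):
    signals = set()
    for term, outputs in PLA_TERMS.items():
        if all(t is None or t == i for t, i in zip(term, inputs)):
            signals |= outputs
    return signals

# Exhaustively tabulating every input combination turns the PLA into a ROM:
MICROCODE_ROM = {addr: pla_lookup(addr) for addr in product((0, 1), repeat=3)}

print(pla_lookup((1, 0, 1)))     # {'enable_alu'}
print(MICROCODE_ROM[(1, 0, 1)])  # same answer, now from the "ROM"

Note the exhaustive ROM holds 2^3 entries where the PLA listed only two product terms - hence "a lot bigger".]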
*However, before he died (in 2010), Wilkes looked at Babbage's Analytical Engine designs (sometime in the '80s, I think) and concluded that the "barrels" in the design were basically implementing microprograms to provide the instruction set (this is around 1834-38, IOW a century before Alan Turing). Babbage also developed multiple notations to track the mechanical, logical and temporal behaviour of the design. IOW he'd also developed EDA support before he had a machine to run it on.
When I hear people talking about something being "on the next level" I think of Babbage. If his notations had taken off, it's impossible to say what the world would be like now. Makes you wonder what other stuff is in the archives somewhere...
-