Intel FSP 3 NSA backdoor
With the Five Eyes countries mandating access to encryption, this is Intel's answer.
Vendors in the FOSS hardware and software communities are voicing their concerns about closed-source firmware. Virtually impenetrable BLOBs (Binary Large Objects) in firmware mean it's difficult to be sure exactly what the computer is doing. Assuming the BLOBs are unencrypted, and they usually are, you'll have to break out a …
It's not widely known or advertised as such, but the Purism kit still requires the Intel Management Engine to boot up. Definitely not a good first choice for high security applications.
Typing from my Talos II workstation, nice and private but quite the stationary lump otherwise. The old Chromebook I used for mobile computing just isn't adequate these days, with software bloat everywhere. There are no really good options that combine portability, performance, and privacy; sounds like the old adage to "pick two" is the best one can do?
How do you find your Talos II? I'm fairly interested, and have been a POWER fan for decades. A first-hand summary report would be gratefully received :-)
I did a lot of embedded work using PowerPCs, and can remember a time when it was possible to boot Windows NT on these boards (I saw this at the board manufacturer, where some of their engineers had put that together, and at the time I was deeply impressed by the performance). I've hankered after a proper PowerPC desktop (not a Mac) ever since.
Happy to oblige, for what it's worth.
Some relevant background: I've been a staunch Linux user for well over two decades, and already moved my PC games to a separate system some years back as I refused to trust Steam with my personal information. I was also using an old AMD true open-firmware PC (KCMA-D8) before moving over to the Talos II, back when coreboot meant open source, not a build system around Management Engine and FSP binaries.
On balance it's a lot faster than the old AMD kit, and knowing I cannot be (easily) subverted at the firmware level is well worth the cost. Even using Tor makes sense on a platform like this: with full control of the system it's possible to manipulate the data spillage to various totalitarian regimes, and I've had fun with that in the past. Various packages install from the package manager as you would expect, no complaints on that end.
Now, Moore's law, that the PC would double in power every 18 months, is well established.
But is there a corollary to it, along the lines of: as the power of the PC doubles, the complexity and number of levels of control also increase, doubled if you are lucky? :o(
I REALLY miss my little old 486dx2 66MHz beast, and I'm almost getting nostalgic for Windows 3.1 as well, ffs, even though I swapped it for W95 without so much as a backwards glance :o)
There is no law that says that firmware has to be open source.
Vendors have every right to keep some cards close to their chest, even if the consequence is a very difficult time finding bugs and vulnerabilities.
On the other hand, blackhats will have a hard time as well.
blackhats will have a hard time as well
Hmm, security through obscurity has not been shown to hold up well over time. That said, being fully naked isn't the solution either; after all, you often have IP and methodology to protect.
There ought to be some middle path where a trusted organisation lifts the covers and has a peek, but you'd need several of those or you end up with a single point of subversion. See Arthur Andersen for how that goes with audits..
All that really needs to happen is that the manufacturer is made liable for all of the costs of a data breach, if their firmware allowed it to happen and wasn't replaceable by the machine's current owner.
I'd wager there would be a lot more testing and a lot less firmware overall if bugs and backdoors became the manufacturer's financial problem instead of the machine owner's financial problem.
I also find that closed source empowers the black hats and stymies the white hats, for the simple reason that the black hat just has to find one way to crack the binary to engage in nefarious activities, whereas the white hat ostensibly has to locate all possible cracks and patch them. Having source code massively helps the latter and does not significantly help the former.
A lot of the problems with BLOBs have to do with wireless drivers, which are regulated in that they must not be easily modified to transmit on illegal frequencies or at illegal power levels.
Not saying it's a GOOD thing, just saying it's justified. It points to the REAL problem: government regulatory agencies and the laws that force manufacturers to abide by them.
Also not saying that a free-for-all in transmit power, frequencies, and RF interference in general is a good thing either...
(an imperfect world filled with imperfections)
As for things like video drivers (and I'm talking about YOU, NVidia), it's difficult to fix bugs in them if you keep them closed source...
Back in the old days, "firmware" really was "firm" because it was blown onto a ROM and could not be changed. Unless you physically read the ROM, you would never see those bytes. From the end user's point of view it might as well have been wires. After all, even CPUs had microcode, and nobody got upset about not being able to change the microcode; it was just part of the hardware.
The Free Software Foundation says it starts caring at the point where the bytes can be changed post manufacture, i.e. the "firmware" can be updated. Obviously a boundary has to be set somewhere, but it's hard to understand why this exact boundary: what if the updates are not compulsory and you could carry on using the first version of the firmware (or the version that was current at the time you bought the device) if you want? Would that not be equivalent to having bought it in a ROM as part of the device? Should they perhaps refine the boundary to say they start caring at the point where the updates become essential, for example because they are security fixes, instead of just saying they care about bytes that could in theory be updated but don't have to be?