Vendor?
Good job it's 1960 and you buy your hardware, operating system, application, and language from the same 'vendor'.
Perhaps this will rescue IBM and stop the feds using any of that communist cancer open-source stuff Ballmer was warning us about.
The White House has published software security rules for federal agencies as part of a larger push to shore up America's IT supply chains. Today's requirements [PDF] stem from US President Joe Biden's cybersecurity executive order from May 2021, which was in response to the SolarWinds disaster and other high-profile software …
There was an article in The Reg recently about proposed (or draft) EU AI regulations (https://www.theregister.com/2022/09/11/in_brief_ai/). The article had the attention grabbing headline "Draft EU AI Act regulations could have a chilling effect on open source software".
Now we have something different. Sure, it's in this article: "if an agency wants to use open source software or products that include open source code — then the White House says a third-party assessment by a certified Federal Risk and Authorization Management Program (FedRAMP) assessor organization will do."
Something as 'simple' as /bin/ls will need to be assessed, and reassessed the next time it is updated. There will be a cost to this. How in-depth is this going to be? Is it going to be an OpenBSD-type audit, or is it going to be skimmed through (in which case it will be a useless tick-box exercise)? It is well within the expense budget of commercial firms to just absorb it as a cost. For open source it is not so clear.
Taken literally, it would ban all software. "Secure" in an absolute sense is meaningless. You can only be more or less secure, and only under some threat model.
NIST SSDF (SP 800-218) refers to "secure software", which is not a technically meaningful term, but fortunately the actual practices are better specified. They're broad, but they don't assume perfection. For example:
PW.1.1: Use forms of risk modeling – such as threat modeling, attack modeling, or attack surface mapping – to help assess the security risk for the software.
And then there are examples. SSDF is pretty similar to some SDLC programs already used by many software-development organizations. If you're already making a serious effort in this area, it's probably not a huge cost to harmonize what you're doing with SSDF.
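As an illustration only (the entry points, fields, and weights below are my own inventions, not anything from SP 800-218), attack surface mapping can start as nothing fancier than enumerating entry points and ranking them by risk:

```python
# Minimal attack-surface-mapping sketch: enumerate entry points,
# score each one, and sort by risk. The scoring weights are invented
# for illustration; SSDF only asks that *some* risk modeling is done.
from dataclasses import dataclass

@dataclass
class EntryPoint:
    name: str
    exposed_to_internet: bool
    handles_untrusted_input: bool
    authenticated: bool

    def risk_score(self) -> int:
        # Crude additive scoring: tune the weights to your own threat model.
        score = 0
        if self.exposed_to_internet:
            score += 3
        if self.handles_untrusted_input:
            score += 2
        if not self.authenticated:
            score += 2
        return score

surface = [
    EntryPoint("public REST API", True, True, False),
    EntryPoint("admin console", True, True, True),
    EntryPoint("internal batch job", False, False, True),
]

# Print highest-risk entry points first.
for ep in sorted(surface, key=lambda e: e.risk_score(), reverse=True):
    print(f"{ep.risk_score()}  {ep.name}")
```

The point is not the arithmetic; it is that even a crude, written-down model like this satisfies the spirit of PW.1.1 better than no model at all.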
My understanding is that FedRAMP is more complicated, but I've only skimmed the surface of that.
I caught that too:
"a self-attestation from any third-party software providers"
I've been deeply involved with vendors, enterprise applications, NIST, PCI-DSS, regulated industry, and the like for decades and I can assure you from personal experience that such attestations aren't worth the bytes they occupy.
For that matter, audits often don't uncover issues - in fact, they frequently don't even look for them. Worse, many audits rely on interviews with management. This is true for external audits, and internal audits are even more problematic.
In my opinion, a couple of things must be in place:
- Ask the people in the trenches and guarantee confidentiality.
- Make the auditors and executives personally liable for proven negligence and inaccurate (or false) statements and claims.
- Set up an external reporting solution.
- Automatically publicize issues x days after notifying the responsible parties - even when remediated.
While this is a good thing as an aim, it will fail because it does not address the many issues with software development.
First you attest your software is developed to the NIST practices - but what about all the libraries and bits and bobs you have half-inched from the FOSS world? Do you have to attest to them too (Log4j, anyone?)
Does this mean every project will take twice as long as we re-invent the wheel to ensure code security and attest to it?
And how long will that last when all these projects start to fail because of cost and time overruns, so we stay where we are, on 20-year-old code, because it is too difficult to move on?
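Just listing what you would have to attest to is sobering. As a rough sketch (this only sees directly installed Python distributions; a real attestation would need transitive, cross-ecosystem SBOM data), the standard library can already enumerate the inventory:

```python
# Rough dependency inventory using only the Python standard library.
# Every line printed is a FOSS component someone would presumably
# have to attest to under rules like these.
from importlib import metadata

dists = sorted(
    (d.metadata["Name"], d.version)
    for d in metadata.distributions()
    if d.metadata["Name"] is not None
)

for name, version in dists:
    print(f"{name}=={version}")
```

Even on a bare environment this list is non-trivial, and it says nothing about what those packages themselves pull in.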
I've seen great code, and I've seen bad. This unworkable-seeming prescription is an attempt to counter the half-assery present in so much of the bad code.
Recently I scratched the surface of a package called TomsFastMath, used in a virus-scanning app. I've yet to see even one comment in the code! The function names are so terse I have no idea what they're supposed to do, let alone whether they're doing it correctly. I've no problem with the stuff in the K&R books, or in 6th Edition Unix, so I don't think my lack of understanding of TomsFastMath is my fault.
Disclaimer: I am not a math Ph.D.
Software written for real time systems is not flakey and full of holes. It's a bit more conservative than the applications code we're used to, with far less focus on getting the latest features out ahead of the competition (real or imagined) and far more attention paid to testing and verification.
It's a different mindset, one that would drive a lot of programmers crazy, but it's the way that software should be written. After all, for most applications we don't need to change it every year or so -- call me boring, but I would be a lot happier using a word processor program from a couple of decades ago than fighting with whatever interface Word has thrust at me this week. Most software features are redundant eye-candy, anyway.
Software written for real time systems is not flakey and full of holes
And software written for planetary missions is even better and can be remotely debugged and fixed.
So all software should be written by JPL and similar programming teams. The upside would be incredibly reliable software. The downside is that we'd all be working with obsolete 16/32 bit processors designed 25 years ago. But as an extra bonus they'd be rad hard.
Surely this is what a Software Bill of Materials is for?
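An SBOM is, at bottom, just structured data. A minimal CycloneDX-flavoured record (the top-level field names follow the CycloneDX spec; the component entry itself is made up for illustration) can be sketched like so:

```python
import json

# Minimal CycloneDX-style SBOM sketch. "log4j-core" here is just an
# illustrative component; real SBOMs are generated by tooling
# (e.g. syft or cyclonedx-cli), not written by hand.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "log4j-core",
            "version": "2.17.1",
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.1",
        }
    ],
}

print(json.dumps(sbom, indent=2))
```

With an inventory in this shape, "does anything we ship contain a vulnerable log4j-core?" becomes a simple query rather than a fire drill.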
And it'll just end up with companies having to make a choice - adopt open source projects and actively participate in them, or build their own. The former is much cheaper, so this might actually improve FOSS participation and support.
Frankly, it's long overdue. Whilst some companies are good citizens, too many of them have been freeloading for too long...
Is this the same federal government (and the same person who was then vice president) that wanted to mandate built-in backdoors in software?
https://www.cnet.com/news/privacy/report-feds-to-push-for-net-encryption-backdoors/
"The Obama administration will seek a new federal law forcing Internet e-mail, instant-messaging, and other communication providers offering encryption to build in backdoors for law enforcement surveillance, The New York Times reported today."
"Communication providers, apparently including companies that offer voice over Internet Protocol (VoIP) services, would be compelled to reconfigure their systems so that police could be guaranteed access to descrambled information."
p.s. A BSOD icon would be appropriate.
The spirit of the idea is well intentioned. Better software out the other end.
Reality would suggest that actually sticking to this means creating much smaller software, adopting the direction of certain languages (Rust), and having much, much more QA before publication.
See also flying pigs, and little green men from Mars, for more likely outcomes than software development genuinely changing direction so sharply.
This is just the first blow in what could be a long and difficult fight.
But only if software developers want it to be.
These problems have been solved before now. In the 19th century there were plenty of people swanning around calling themselves doctors. And killing people through incompetence and ignorance. Doctors (real Doctors, hence the capitalisation) simply shrugged and said "What can we do? We're *proper* Doctors who do things properly, they're charlatans and snake oil salesmen who call themselves doctors."
Then the government said "Well, we could regulate you."
And suddenly Doctors were remarkably interested in doing something themselves. Associations were set up, accreditations required, boards put in place to review conduct... all so that they could then say "Hey, Mr. Government Man, how about just using what we have already rather than having to do it all over again to regulate us?"
The same thing happened with engineers. People who knew a bit of woodworking were wandering around building larger and larger things, without knowing quite how to do it at that scale. Buildings and bridges collapsed. People died. The government stepped in and said "Hey, maybe we should regulate this?" Engineers chose to organise rather than accept that.
Make no mistake, the same thing will happen for software development. People's lives now depend on software. Assisted driving and self driving has already claimed at least one life, probably more. At some point it'll get too much, and the politicians will be looking for a scapegoat. Rather than attack the companies, they'll realise that they can attack the developers. They're highly paid, they often associate with fringe groups in society, and some (not all) aren't very skilled socially. A perfect target for the ire of an angry public.
The software industry needs to get ahead of the game and organise. Not unionise - but have professional bodies as Doctors and Engineers do. Because in 50 years time people will look back as this and say "It all started when the Government realised it needed higher standards for its software. But they were purchasing guidelines, not regulations. Then, after some high profile deaths, things got shaky and there were calls for more Government action, so they took those guidelines and proposed regulation. Software developers weren't happy with that..."
Software developers can choose how that story ends. But the sooner they make the choice, the better the outcome will be for them.
The problem is money. Companies are always going to choose on price for the software they buy, or the 'engineers' they employ, unless there are real consequences for screwing up.
For years software suppliers have got away with EULAs that absolve them of practically everything, unlike any building or physical product. What is needed is regulation that enforces acceptable EULAs imposing quality/reliability/security requirements on suppliers. All suppliers. To make it an even playing field between those who can and do produce good tested software at high cost, and those seeking to push QA onto the users. Like MS of late, etc.
Yes, that could be an issue for open source but not if anyone who uses it commercially is on the hook for problems, maybe they could pay for some support and QA?
@Philip - a well-written reply, but the premise seems to be flawed nonetheless. While Doctors did indeed regulate themselves (with even heavier government regulation on their profession), they have also been known to be shortsighted, slow to acknowledge the validity of new information, be fraudulent, engage in conspiracies (e.g. Tuskegee), plagued by misplaced pride, and a host of other maladies.
All from real Doctors...
About 25 years ago I worked on a clinical trials system, which by virtue of being available in the U.S. had to adhere to FDA (part 2 IIRC) development guidelines.
Every release had to be audited under those rules and man were they strict - they even audited the comments to ensure that the code they related to was valid.
This obviously makes sense when people's lives are at stake in the event of a bug.
Even though I now develop trading systems I still stick to those guidelines, as I think it made me a better developer.
I wonder how this is supposed to work for all the open source being directly used by in-house government development, as opposed to open source which is "included" in third-party products. Are they supposed to spend millions (maybe billions, across all agencies) porting existing software to proprietary alternatives, plus any perpetual licensing costs? And if they have to wait or pay for external open source proxy certifications, does that mean they won't be able to upgrade to a newer version in the event of a security vulnerability while waiting on a new certification (even if a fixed version has been available for weeks or months)?
This is rich! The government that couldn't build a basic website for buying health insurance, and still maintains decades-old systems running COBOL, is telling programmers to write secure software. At the same time they are integrating open source and IoT into their systems, failing to keep systems patched and updated, and using weak passwords and poor security practices. Write your own secure code, geniuses!!