First read it as "daft" law
Whatever.
An attempt by lawmakers to improve parts of the US government's cybersecurity defenses has raised questions – and hackles – among infosec professionals. The National Defense Authorization Act for Fiscal Year 2023 – which, if passed, provides billions in funding for the American military and other critical areas of the …
"free from all known vulnerabilities or defects affecting the security of the end product or service"
All the vendor has to do is stop effective security testing. Then by default they won't know about any such bugs.
What the law really should do is require a full security test report from the vendor, against criteria set by the purchasing agency, and preferably conducted by an independent third party.
If the d(r)aft law really says
A certification that each item listed on the submitted bill of materials is free from all known vulnerabilities or defects affecting the security of the end product or service.
it doesn't really matter if there are bugs, as long as they are not already known.
I don't see why a strict reading of this would make it impossible to buy/deploy any new software.
Forcing the software suppliers to include a software bill of materials that states any and all libraries and version numbers used in the product, and getting a certification that they contain no known vulnerabilities at the time of purchase, is a first step.
Somebody will have to keep track of all that information and match it to any new vulnerabilities that are discovered, to then check if mitigation measures or fixes need to be applied.
Unless the lawmakers have also outlined how that process is to be done, estimated the costs, and allocated the necessary funds, it won't significantly improve security.
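The tracking-and-matching step described above could be sketched roughly like this. Everything here is invented for illustration (component names, versions, CVE identifiers); in practice the SBOM would be in a format such as SPDX or CycloneDX, and the advisory data would come from a feed like the NVD:

```python
# Hypothetical sketch: keep the SBOM's component/version list from
# purchase time, and re-check it against newly published advisories.
# All data below is made up for illustration.

# SBOM as delivered at purchase time: component -> exact version.
sbom = {
    "libexample": "2.4.1",
    "parser-kit": "1.0.3",
    "netutils": "5.2.0",
}

# Advisories published after purchase:
# (component, affected version) -> advisory/CVE id.
new_advisories = {
    ("libexample", "2.4.1"): "CVE-2023-00001",
    ("netutils", "4.9.9"): "CVE-2023-00002",  # older version, not in our SBOM
}

def affected_components(sbom, advisories):
    """Return the SBOM entries that now have a known vulnerability."""
    hits = []
    for component, version in sbom.items():
        cve = advisories.get((component, version))
        if cve is not None:
            hits.append((component, version, cve))
    return hits

for component, version, cve in affected_components(sbom, new_advisories):
    print(f"{component} {version}: {cve} - check for mitigation or fix")
```

Trivial in miniature, but doing this continuously, across every product an agency buys, is exactly the ongoing process (and cost) the comment says the lawmakers haven't accounted for.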
A competing company can search your product for anything they can vaguely claim as being a security defect.
When they find something, they can sit on it during tendering, then publish it just before the contract is signed/product delivered.
It's then a "known" defect, and you're stuck.
Worse, they can do that for things which aren't actually a security defect, forcing you to investigate, then prove it's not a security issue.
It's trivial for companies to abuse in such a way that their competitors lose business but products don't in fact get any more secure.
If the result is that companies test each others' products and publish the vulnerabilities they find, then that's a win in itself.
But to my mind the likeliest reaction would be a huge decrease in using prewritten software and an increase in big companies, who can afford it, writing their own special-purpose proprietary software from scratch. If it never gets deployed anywhere public, it won't be tested, therefore the vulnerabilities are much less likely to become known.
Of course it will probably be way buggier than code that has been deployed and used publicly, but hey, at least they'll be able to certify it.
If you don't trust the kind of governments that can pay virtually any amount of money, then that's a problem but not one a lot of people share. If the government really wants ten thousand licenses, they're likely to pay for them and possibly use the larger contract to get other stuff they want, such as priority for bug fixes and feature requests. They do have some pretty good reasons not to want their computers connected to the public internet so your license server, which they haven't audited, can accept a key and potentially log information about where that license the government bought is being used.
"...submitted bill of materials is free from all known vulnerabilities or defects affecting the security of the end product or service."
It does not sound ambiguous to me, especially since the supreme court is full of constitutionalists.
What this would stop is selling software with a known bug - like Apple selling iOS to Uncle Sam with a known security vulnerability, like a VPN data leak for instance.
Oh wait...
So you start off with saying that you can't buy software with known security exploits. Fair enough. However, you then go on to say you can ignore all that if the vendor promises they're going to fix it. Isn't this basically just best practice anyway? Why would you buy software that you 1) know is buggy/exploitable, and 2) that the vendor has no plans to fix? Sure, the Trump administration exposed how a lot of things we always figured were laws were really just convention/best practices, but generally these sorts of purchases are handled by the career civil servants who will be around during the next administration, and the one after that, and be supporting it the entire time.
What is needed is a requirement for vendors to fix bugs in any software in less than X days after notification and for Y years after it was sold, or face big fines on behalf of everyone.
All complex software has bugs of some form or another, but what differs between companies is the way they fix (or don't) those bugs and the time it takes to do so. If they turn out crap software due to piss-poor QA (looking at you MS) then they will have to work hard to fix it or pay up a LOT!
"A certification that each item listed on the submitted bill of materials is free from all known vulnerabilities or defects affecting the security of the end product or service."
Any such item would be unpowered, embedded in meter thick reinforced concrete and on the Dark Side of the Moon in a deep crater.
> "Unfortunately, it's typical behavior of our legislators to issue mandates that describe the 'what' but not the 'how,'" he said.
Isn't this exactly what experts in every field keep telling legislators to do? Stick to defining the "what" - the political objective - and don't prescribe the "how" - the technical implementation?
When the EU proposed a unified smartphone charging standard (the "what"), people were most up in arms about it naming a specific technology, MicroUSB (the "how")!
"A certification that each item listed on the submitted bill of materials is free from all known vulnerabilities or defects affecting the security of the end product or service."
There is nothing wrong with this at all. Certainly the complaint that "not all vulnerabilities are significant" is pointless in the context of the above wording. If the vulnerability is not significant then, by definition, it is not "affecting the security of the end product or service".
That would be nice, but it wouldn't fix the problem about security. You can build insecure stuff out of open source components. Quite frequently, the problem addressed in the SBOM requirements is that third-party dependencies, often open source ones, have vulnerabilities and the user of those dependencies didn't update their product. Sometimes, those dependencies have vulnerabilities but nobody bothered to fix them, including the original maintainers. Just making everything open source won't fix any of that stuff.
Not even a little bit, since they can still go ahead as long as the existing vulnerabilities don't affect security or there is a mitigation plan. At best it encourages disclosure --- and only of KNOWN vulnerabilities at that. If the biggest flaw imaginable is discovered the next day, it "was there", but it wasn't known.
So you don't use any software from Oracle, IBM, Microsoft, Google or any other US provider or consulting firm?
Because none of them is safe nor properly tested, based on my experience...
(and I don't say that because I got a bunch of mails today about new vulnerabilities in products from the above firms)
Isaac Asimov's The Naked Sun was the second of his Daneel Olivaw novels, the sequel to The Caves of Steel. I once had a paperback edition of that book, which said (in the back cover blurb) the world would not be safe until it understood the implications of a new kind of robot that the Three Laws only prevented from knowingly doing harm to humans. Apparently, the earlier robots were able to magically foresee the consequences of their actions without conventional sources of information.
You know what they say: They don't make 'em like they used to!
I was reminded of this when reading that the U.S. government was going to make its computers secure by only buying software without known flaws. Which I suppose means they can only buy Windows after Patch Tuesday, not before.