A Code War has replaced The Cold War. And right now we’re losing it

Remember the Cold War? For me, growing up in America meant living under the permanent, intangible threat of sudden vaporisation by thermonuclear attack. It added a piquant pointlessness to everything. Ashes, ashes, all burn down. Yet the world stubbornly refused to end. Communism collapsed, Western neoliberal democracy seemed …

  1. amanfromMars 1 Silver badge

    Sometimes it is best to lose graciously ... in order to save oneself from totally unnecessary pain

    Space may also point the way toward a solution with slides into an economic and military superiority. It may be possible that a similar approach - using "moonshot" technologies like artificial general intelligence and high qubit quantum computing - could place attackers so far ahead of defenders that defence and opposition and competition becomes effectively impossible and ruinously expensive and quite certain to guarantee a changeover into what would definitely be Greater IntelAIgent Gamesplay.

    What's not to like?

    1. amanfromMars 1 Silver badge

      Re: Sometimes it is best to lose graciously ... to save oneself from totally unnecessary pain

      Don't forget, El Regers, a silent downvote on pages and comments here, is a golden opportunity crazily squandered to share with all readers in far away fields, an alternative opinion and views which may or may not be full to overflowing with common sense worth listening to.

      Such is to be highly regarded and much prized, whether they prove themself to be valuable or worse than just worthless and excessively exclusively self-serving.

      1. Anonymous Coward
        Boffin

        Re: Sometimes it is best to lose graciously ... to save oneself from totally unnecessary pain

        I'll bite.

        First, there's no way of knowing if we're losing since our TLAs don't do data dumps.

        Second, hacking is relatively cheap and we won't be bankrupting Russia or China by upping our cyber budget.

        Third, attacking a state directly is a nuclear option (figuratively and, in some discussions in the past, literally).

        Which brings me to my conclusion. What kept the Cold War from going hot was the MAD concept - Mutually Assured Destruction. We need to reach an understanding with Putin and Xi about what is acceptable and what is not. Our big stick can be the Russian pipeline explosion and the Iranian centrifuge sabotage (neither of which we acknowledge).

        But we have to accept that not all state sponsored hacking rises to that level just as we don't attack Russia for the actions of their clients, the Taliban, in Afghanistan. For lesser hacks, I'd add doxing to the sanctions but, ultimately, it's a question of making our sites more secure.

        1. amanfromMars 1 Silver badge

          There is Only One Holistic System of Systems ... Networking AI Colossus

          Hi, HildyJ ...... and Welcome to another world.

          To imagine that any State assembly/Parliamentary body/Presidential executive/national collective anywhere, East or West, is responsible for the news that supposedly advises us of what we are to believe in and to do, with the serial infliction of pain and suffering or reward and pleasure on vast swathes of others, as surely the news and media certainly currently does, is something to consider is much more a perverse convenience rather than identifying the actual inescapable factual reality.

          The latter in truth, and actual factual augmented virtual reality, may very well be stated in this mightily overlooked and inconveniently prescient monologue ‽ . ......... "The World is a Corporation" (Network, 1976)

          No matter which though you might conclude to be the more acceptable and easiest to believe the more likely and possible and therefore probable, it will have just a few beings arranged as around a round table with a board of governors/systems drivers with the brightest and smartest of directors leading by virtue of their commanding contributions ensuring continuing stealthy supremacy of both their invisibly cloaked, renegade rogue and controlling non-state actor future activities with CHAOS.

          And all here on El Reg should already be well aware of what CHAOS certainly is. It has often been freely shared to keep y'all abreast and au fait with virtual developments so that ignorance is not rendered your only available guide to future shenanigans/upcoming COSMIC events.

          Test yourself here. Fill out what you think the above and following two expand into. The correct answers are easy enough to find, and you can have a lot of fun being silly and filling in your view. Take care though, because that can easily tell those who would be interested to know, a great deal more about you than maybe you would like them to possess ...

          CHAOS .......

          COSMIC .......

        2. Anonymous Coward
          Anonymous Coward

          Re: Sometimes it is best to lose graciously ... to save oneself from totally unnecessary pain

          They could actually stop hacking completely just by shutting down the internet and then rebuilding it without connections to those countries that condone hacking.

          That being said, that would include pretty much everyone else when it comes to state-sponsored attacks.

          So perhaps the answer is for everyone to stop cyber attacking each other and save their taxpayers' money for something constructive instead.

    2. Robert 22

      Re: Sometimes it is best to lose graciously ...

      "using "moonshot" technologies like artificial general intelligence and high qubit quantum computing - could place the defenders so far ahead of the attackers that assault becomes effectively impossible"

      What if the attacker uses the same technologies?

      We have a complexity problem - everything is so complex, there are bound to be weaknesses.

  2. KittenHuffer Silver badge
    Linux

    The Code War?!?

    I remember the Cod war!!!

    South of the Equator the winner was ----------------->

  3. Mike 137 Silver badge

    Yet another uncomfortable truth

    Unfortunately, the harsh reality is that software development is the only branch of engineering that doesn't have established standards and methods that are proven to yield trustworthy results. It took a couple of hundred years for such standards and methods to be worked out in civil, mechanical and electrical engineering, and it may just be that software development hasn't been around for long enough yet. But the big difference is the extent of deployment of the technologies while they are still immature. Software now permeates almost everything, but we aren't yet able to assure its adequacy. That's the real problem we face, and it can only be solved by establishing standards and mandating their application.

    In no other branch of engineering can a totally self-trained practitioner be taken on trust and tasked with delivering mission critical (or indeed life critical) systems. It's about time we stopped this practice in software development for those domains. So we need enforced standards that deliver safe systems, as we have even for technicianship disciplines such as electrical installation and gas fitting. Nobody would argue for mandatory certification of games developers, but I would most certainly insist on it for those developing things that can affect livelihoods and lives, and such certification should ensure understanding of first principles, not just knowledge of the knobs and levers of proprietary tools. What we have had to date instead is a worrying trend of successive generations of tools that make it easier to deliver results without paying attention to the essential first principles - a deskilling that perpetuates the very problem we should be trying to solve.


    1. Paul Crawford Silver badge

      Re: Yet another uncomfortable truth

      It is far worse than just the lack of "sound principles" being used; these days software comes with license agreements that abdicate responsibility for the consequences of crap code.

      What other discipline would get away with that?

      Add to that the fact that a lot of the connectivity and inter-dependence being added is driven by marketing droids (or worse, advertising brokers), and the future looks bleak indeed.

      1. This post has been deleted by its author

      2. jmch Silver badge

        Re: Yet another uncomfortable truth

        The real root cause is lack of accountability. If a bridge or building collapses, the engineers / architects / builders involved in the faulty design / construction / certification etc are legally, and in some cases, criminally liable. Same with doctors / patients etc etc.

        Class software into (e.g.) 3 classes: life critical (failure causes loss of life or severe risk thereof), economically critical (failure causes monetary damage above a certain threshold), and not critical (failure 'merely' causes a bunch of pissed-off users). Apply criminal liability for failures of the first class and civil liability for those of the second class. This type of software will rapidly improve.

        The flip side, of course, is that the world has become addicted to software that is incredibly cheap for all the immense convenience it provides, without properly realising it. So critical software would become a lot more expensive, and slower to roll out.

        1. Aitor 1

          Re: Yet another uncomfortable truth

          Yes, that worked great in Grenfell, the previous fires, the huge amount of prefab buildings that were seriously defective from the 60s to the 80s, etc etc.

          In practice they are still not accountable: everyone followed "best practices in the field", so nobody individually caused the deaths, but between all of them they caused it. We would have the same shenanigans, with "lessons learned, action taken" statements, and that's it.

          That does not mean we should be happy with code cowboys doing their stuff, but most of the code cowboys I know do have a relevant degree, masters or even PhD.

    2. ScissorHands

      Re: Yet another uncomfortable truth

      Several forests have been decimated to produce MITRE code standards, but nobody follows them (Minimum Viable Product is the law of the land, security be damned) and even following them to the letter, C and derivatives should be classified "Unsafe at any speed", even (or especially) legacy code. Rip it all up and start again with GC/RefCount/RAII languages.

      Nuke it from orbit, it's the only way to make sure.

      1. Anonymous Coward
        Anonymous Coward

        Re: Yet another uncomfortable truth

        So many people are producing code who should never have been let anywhere near a compiler.

        When I used to be a dev, back in the 90s, I was always baffled by the number of mistakes made by people who never bothered to even RTFM.

        C code, makefiles, all borrowed from much bigger projects, that couldn't work together; improper usage of unix tools, etc ...

      2. Someone Else Silver badge

        Re: Yet another uncomfortable truth

        Something about a poor workman and tools...

      3. Loyal Commenter Silver badge

        Re: Yet another uncomfortable truth

        I'd like to see you build embedded software, or code that must run in a resource-limited environment with a high-level language.

        Sometimes bit-bashing in C is the only way to do it, but I am inclined to agree that when it isn't the only way to do it, most of the time it's the wrong way.

        1. Someone Else Silver badge

          Been there, done that, got the tee-shirt

          I'd like to see you build embedded software, or code that must run in a resource-limited environment with a high-level language.

          I don't know who the elliptical "you" is in your statement, but try this: Piece of medical equipment written for an 80186 in C++ (OK, with a small BSP written in assembler), hand-written multitasking kernel (also written in C++), full 20-color GUI (this was the middle 90's; screens that would support this were not ubiquitous), multiple languages and fonts. So, let's see now:

          --> Embedded software: Check

          --> Resource limited environment: Check

          --> High-level language: Check

          --> Gov't regulated market: Check

          --> A metric buttload of oversight, review, and team and individual discipline: Check, Check and Check.

          --> Wrong way to do it: Definitively NOT check.

          1. Loyal Commenter Silver badge

            Re: Been there, done that, got the tee-shirt

            I wouldn't call C++ a high-level language here though; it's compiled down to machine code, and still has low-level memory management (malloc, and pointers). It's really just C with bells on. Perhaps wrongly, I refer to C and C++ interchangeably.

            Languages like Java or C# compile to bytecode, which runs in an interpreter. Yes, you can run Python on embedded devices, but again, this relies on an interpreter, which makes it a lot slower than something compiled to run on bare metal.

            I'm not thinking of anything so sophisticated as something that can run a GUI either - things like Arduinos and Pi Picos that are being used to control things at a very basic level, and which have very limited memory and storage to work with. Things where available memory is measured in kilobytes.

            Sometimes you just don't have the luxury of the headspace to run even a cut-down kernel if you need to use that memory for things like shifting a lot of data about between I/O channels.

            1. Anonymous Coward
              Anonymous Coward

              Re: Been there, done that, got the tee-shirt

              Tektronix storage scopes of the 80-90s. Resource constrained but networked multi-processor systems (about 7 CPUs in the lower end models). All coded in smalltalk. Nice GUI, fast, responsive and they never ever crashed. Try achieving that in C!

    3. a_yank_lurker

      Re: Yet another uncomfortable truth

      As someone trained in other STEM fields and now working in IT, I'd say it is not being 'self-trained' that is the actual problem. Any junior programmer will need training on the proper security techniques for the applications they are working on. This is not likely to be covered in any real depth in their course work.

      One major difference between software engineering and the other engineering fields is the personal, professional liability requirement, as evidenced by PE licenses. The requirements in Feraldom are generally an appropriate STEM degree, pass the 'Engineer In Training' test, work for several years under the direct supervision of a PE, then pass the 'Professional Engineer' test. The PE is the only one who has the legal authority to certify that the work meets all the standards and legal requirements and that the project can proceed. As a non-PE, I can only work below one who would be supervising my work. The PE is personally and professionally liable for anything approved. There is no equivalent of a PE in software.

      1. Mike 137 Silver badge

        "it is not being 'self-trained' that is the actual problem"

        You're right. It's self-trained practitioners being taken on trust without any standardised formal way of validating their competence. Those trained by others undergo continuous (even if only informal) validation during their training, but the self-trained don't have this advantage, so they may not, in the words of Dirty Harry, even "know their limitations" themselves.

    4. Dan 55 Silver badge
      Stop

      Re: Yet another uncomfortable truth

      Why are we beating ourselves up over this? Customers expect security but don't want to pay for it, sales managers negotiate the highest price possible and try and reduce internal costs to the minimum, project managers just want it fucking done yesterday.

      It's a whole pile of stupidity on top of developers' backs. That's what's got to change.

      1. Mike 137 Silver badge

        "Why are we beating ourselves up over this?"

        Because people are dying already due to it. We're not just talking about desktops and phone apps here. Boeing Dreamliner generators shutting down due to a control unit counter overflow (fortunately caught before an accident); an Airbus A400M in Seville losing throttle response due to missing parameters in an ECU (half the flight crew killed); Boeing 737 Max (no detail needed). Plus the failures of "semi-autonomous" vehicles to spot obstructions in the road before running into them; ordinary vehicles requiring regular "updates", etc. etc.
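        To see how small the mistake behind that Dreamliner generator shutdown is, here is a minimal sketch of the arithmetic. The hundredths-of-a-second unit and the two's-complement wrap are assumptions made for illustration (the published airworthiness directive only cites the 248-day figure); the point is simply that 248 days is roughly what a signed 32-bit centisecond counter gives you.

        /* Sketch of the 248-day overflow class of bug - not the actual GCU code. */
        #include <stdio.h>
        #include <stdint.h>

        int main(void)
        {
            /* How long can a signed 32-bit counter of hundredths of a second run? */
            double days = (double)INT32_MAX / 100.0 / 86400.0;
            printf("wraps after %.1f days of continuous operation\n", days); /* ~248.6 */

            /* What the counter would read one tick later, if the hardware wraps
             * two's-complement style (incrementing a signed int past INT32_MAX
             * is formally undefined behaviour in C, which is rather the point). */
            int32_t wrapped = (int32_t)((uint32_t)INT32_MAX + 1u);
            printf("one tick later the counter reads %d\n", wrapped);
            return 0;
        }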

        The same "standards" of quality seem to apply across the development landscape and it's starting to matter a lot.

        1. Dan 55 Silver badge

          Re: "Why are we beating ourselves up over this?"

          And how can individual developers who care about the software they're writing and want to produce better work change a corporate culture of minimum viable product, lowest cost, fastest time to market, and box ticking without any meaningful work done behind it?

          Answer - they can't. Corporations and every single employee must be forced by law to follow standardised procedures. That's the only way there is going to be change.

    5. Loyal Commenter Silver badge

      Re: Yet another uncomfortable truth

      I'd argue that there are established standards and methods. The issue is that they are not enforced and are seen as a cost by those holding the purse-strings.

      We have building standards and inspections to make sure unscrupulous builders don't cut corners and build unsafe structures, but we have no equivalent in software engineering. This isn't because the correct techniques aren't known. SOLID, design patterns, test-driven development, Agile (done properly) and so on. It is because the bosses tell us, "just make it work, I don't care how".

      The problems start at the requirements gathering stage as well. Adequate security needs to be designed into software, otherwise it gets bolted on afterwards as an afterthought. Again, nobody wants to pay for proper analysis before a line of code is written. Would you let a builder loose to build you a house without getting an architect to draw up plans first?

      The root of the problem is that doing things properly isn't required. Not doing it properly is often cheaper, so there is a financial pressure on businesses to cut corners to compete. I'm all for legislation that says commercial software should meet certain regulations, such as requirements being fully documented and signed off by a professional business analyst, security considered at the design stage and specified, segregation principles followed, code documented, and so on.

      We'd all end up with better software, and most developers I know would love to be given a decent spec to work from without having to do the analysis as we go along.

  4. CrackedNoggin Bronze badge

    Is it possible that companies that take security seriously will survive while the rest perish? Of course, there is also "bad luck".

    1. Anonymous Coward
      Anonymous Coward

      "bad luck" is essentialy a historically accepted term meaning "it's too complex and I couldn't account for all of the variables" and should generally exist in a sentance with "perhaps if we had of sacrificed a goat on that day to the gods it wouldn't have happened" to show how out of place the notion is in the modern world.

      In the aviation industry it didn't take them long to come to the conclusion that wings falling off planes had much to do with design or construction failures and little to do with good or bad fortune. The "bad luck" of aircraft crashing has largely been systematically engineered out of existence, with unsafe aircraft grounded until they are made safe rather than pilots chanting incantations over good luck charms to invoke the gods' good fortune and blessings for your journey, and the same applies to cars, trains, structures and pretty much anything else related to engineering.

      In IT terms "bad luck" in security tends to mean that there was a serious lack of security; if the enviroment you are securing is "too complex and you can't account for all of the variables" to go by my term above then one simple answer is to reduce the attack surface to an extent that is manageable.

      That Exchange cockup the other day? It worked via Exchange allowing you to exploit errors in the coding of the login page without sending any authorisation.

      It's trivially protectable against: require your legitimate users to connect to your network via a VPN. The attack surface of your network is then reduced to the VPN login (and SMTP), both of which can be secured with technology and procedures that were boring (and secure) twenty years ago. If you were doing this then you were effectively impervious to that entire class of attack unless the attackers start on the inside of your network (which should be guarded against by other access control measures).

      Meanwhile plenty of companies (Over fifty thousand?) got hit because they staked a bet that code thrown out at an ever increasing rate with ever decreasing quality would be 100% safe to expose directly to the internet.

      And the internet these days is a very, very hostile place if my firewall logs are anything to go by.

      1. Claptrap314 Silver badge

        " If you were doing this then you were effectively impervious to that entire class of attack unless the attackers start on the inside of your network. (which should be guarded against by other access control measures)."

        Seriously? Were you found in a cabbage patch this morning?

        Even if, by some act of magic, your VPN was perfectly secure, that does close to nothing about one of your users who mistakenly clicked on an ad or clickbait article and has now been rooted.

        Your users' machines have been compromised. All of them. For quite some time. Now explain how your security posture, with twenty-year-old procedures, is adequate.

        We still are not settled on exactly what should be considered proper MFA. Keeping up is going to matter for a while.

        1. Anonymous Coward
          Anonymous Coward

          That particular subset of information about perimeter security doesn't give you any information about what other security measures I have in place.

          What I am saying is not particularly controversial. I am simply saying that if you write down the possible methods you can devise to attack your network, then you can also come up with ways of making that bloody difficult for somebody else, and thus reshape the security landscape in your favour. You don't even need to spend lots of money.

          For instance, Group Policy can eliminate vast swathes of attacks without spending a penny; if you download the Office GPOs then you can force a GPO for Office to disable unsigned macros, which instantly eliminates the threat of macro viruses (which El Reg keeps reporting on, so I assume people still get them).
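          As a concrete illustration of what that policy amounts to on the client, here is a sketch in C of the single registry value the Word macro-security GPO ends up setting. The exact key path and the value 3 ("disable all except digitally signed macros") are assumptions based on the published Office ADMX templates, and in practice you would push this via Group Policy rather than code - it is shown only to make the point that the control is cheap and already built in.

          /* Windows-only sketch: write the per-user policy value that the Word
           * "VBA Macro Notification Settings" GPO controls. Link against advapi32. */
          #include <windows.h>
          #include <stdio.h>

          int main(void)
          {
              const char *key = "Software\\Policies\\Microsoft\\Office\\16.0\\Word\\Security";
              DWORD vbaWarnings = 3;   /* assumed: 3 = only digitally signed macros may run */
              HKEY hKey;

              if (RegCreateKeyExA(HKEY_CURRENT_USER, key, 0, NULL, 0,
                                  KEY_SET_VALUE, NULL, &hKey, NULL) != ERROR_SUCCESS) {
                  fprintf(stderr, "could not open policy key\n");
                  return 1;
              }
              RegSetValueExA(hKey, "VBAWarnings", 0, REG_DWORD,
                             (const BYTE *)&vbaWarnings, sizeof vbaWarnings);
              RegCloseKey(hKey);
              puts("unsigned Word macros disabled for this user");
              return 0;
          }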

          You can also prevent Office from running content downloaded from the internet that somebody has embedded within a document, and there goes another class of attacks, generally without inconveniencing your end users.

          AppLocker/Software Restriction Policies can prevent people from running things. By blocking users from running unauthorised executables you can in fact stop people getting a virus by running a trojan, or by visiting a webpage with a dodgy advert, since both web browsers and email clients execute files in the %temp% dir; block .exe files and the like from running there and they never run in the first place, rather than allowing everything to run and then operating a blacklist of files to search for via your anti-virus scanner, which is a never-ending game of whack-a-mole. Limit users to running things from %program files%, %windir% and your authorised files on the network and you can basically write off another entire threat class, if you are restrictive enough.

          And that's really only scratching the surface of what you can do with freely available tools built into every version of Windows installed on the planet. But you'd have had to sit down, look at the freely available options you could configure, and figure out what you'd need to do in your environment to protect the users without preventing them from working (because, after all, you can apply policies to groups or even individual users...).

          Doing none of this and then getting cryptolocked/hacked/etc and saying "oh, that was bad luck" suggests to me not "bad luck", but "bad planning".

    2. batfink

      Unfortunately...

      While this would be an ideal result, unfortunately this isn't what will happen in practice.

      As we all know, proper security comes at a cost. Securing your perimeter is a cost, securing your code is a cost, etc etc.

      It's much cheaper to turn out unsecured crap. So, companies who take security seriously are at a financial disadvantage against those who don't - and so the latter are likely to be more commercially successful.

      The actual cost of bad code isn't as bad as we would all hope. Remember TalkTalk? Did they go out of business, as they deserved? Fuck no. So, they can just go on being insecure and making more of a profit than their secure competitors.

      We would all like to see this done properly. Unfortunately the chancers who actually run these firms have a different view. IMO it'll only be when the regulatory penalties outweigh the cost of security that we'll see any real change.

      1. Keith Oborn

        Re: Unfortunately...

        And here is the nub. I was "inside" TalkTalk just after that event, and have also been "inside" their major competitors. It was pure chance that they got hit and the others didn't, as all have similar skeletons in the cupboard. The same will apply to any company that has acquired a smaller one. Security audits on acquisitions are slow and expensive, and the combination of accountants and shareholders won't wear them.

        Then we have the overall problem of software (and hardware/firmware) quality. Not only is this expensive - which means doing it puts you at a huge disadvantage in a competitive market - but also getting your development team (*why* is it not called *engineering*, I wonder ;-) to care is very hard. In my last company the CEO gathered the entire team after a major release: "Right: there are some bugs in that one. You will fix them before we start on the next one." Unanimous response: "Oh, we just want to work on the new stuff." Guess what happened? And those were *obvious customer-facing* bugs.

        Until there is solid regulation of this industry - similar to aviation - we won't see any improvement. The likes of BCS with their "standards" are flies buzzing round a dinosaur. And yes, this sort of regulation will seriously slow things down. That is a *good* thing. The most execrable mantra in the industry is "move fast and break things". What if the thing that is moving fast is, say, a 737MAX?

  5. David Glasgow

    It's a double bluff?

    Cunning plan, in a nutshell:

    We're in, and we've got what we need

    Ok. In a stupid way, say we're going to do something stupidly

    Job for a Politician?

    Yup

    Should we actually do something stupid?

    It's always an option.

    Misdirection?

    Yup. And they'll carry on thinking we're idiots.

    Bonus!

  6. Pascal Monett Silver badge

    high qubit quantum computing and artificial general intelligence

    If we wait on those to solve our connectivity problems, we might as well unplug everything.

    I don't know what the final solution is, but a good start is to stop using other people's code with blind trust. Oh sure, take a module from GitHub, by all means, but don't link to it. Bring it in on your dev server, check the code, test it to see if it works. If it is suitable, then port that to your production environment.

    If there's an update on GitHub, start over.

    Yes, it is tedious and time-consuming. The alternative is SolarWinds123.

    Your choice.

    1. DJV Silver badge

      Re: high qubit quantum computing and artificial general intelligence

      Yes, this ^^^ absolutely!

      Recently, I've been looking into web-based page turning/flipping code. Most of the available options seem to come as black boxes that haul in God knows how many extra libraries (and all the unknown cruft that comes with them), and each one looks like it probably only does 95% of what I want (but each does a DIFFERENT 95%) - and adding the extra necessary 5% is probably impossible without weeks of work.

      Two weeks ago I found a simple piece of code that was about 7 years old and, while it was buggy and partly relied on the abandoned CSS Regions proposal that only Chrome implemented for a short while, I managed to pick it apart myself, gained a full understanding of how it worked and fixed it up so that it now works properly in a modern browser (all in a couple of days). I now have a usable tool that stands alone, has a tiny footprint, and can be maintained and improved by myself as necessary.

      1. Michael Wojcik Silver badge

        Re: high qubit quantum computing and artificial general intelligence

        For years, the consensus among most of the regulars on comp.lang.javascript was "all Javascript libraries are rubbish". And they had the evidence to support that.

        What's changed since then, for the most part, is that now all Javascript libraries are horrifying tangles of many sorts of rubbish.

    2. Michael Wojcik Silver badge

      Re: high qubit quantum computing and artificial general intelligence

      Hey, QC and AGI worked great for the unicorns. I hear good things about securing systems using bee pollen and feng shui, too.

      Frankly, I don't think much of this entire column. On the one hand, yes, the state of software is deplorable and has been for decades. As others have pointed out, there are a number of contributing causes, but the economics of software rank well up there; it's not simply a case of lacking the will. On the other hand, many people are working hard to improve IT security. It's a big boat and it will take a while to turn it around.

      Sophomoric analogies that don't stand up under a moment's scrutiny and appeals to magical future technology aren't going to help. What does help is understanding security theory and practice, recognizing that the situation is complex and won't be resolved by any simple solution, analyzing the threats and ways to mitigate them, and doing the work.

  7. TVC

    I'm currently reading This Is How They Tell Me the World Ends by Nicole Perlroth. I worked in IT for 40 years and some of that involved me in cyber security, but this book is a real eye opener and a thrilling read to boot.

    1. hammarbtyp

      I would also recommend Sandworm: A New Era of Cyberwar and the Hunt for the Kremlin's Most Dangerous Hackers by Andy Greenberg

    2. Anonymous South African Coward Bronze badge

      Anything and everything has its attack surface. You just have to find a way in.

      I would also recommend reading Masters of Deception.

      https://en.wikipedia.org/wiki/Masters_of_Deception

  8. Greybearded old scrote Silver badge
    Megaphone

    Simplify, simplify

    Bruce Schneier says that complexity increases your attack surface. (It's a long way down in a very long read, but worthwhile.) Yet we add more and more layers, until even a networked document viewer has pretensions of being an operating system. Well above its station, if you ask me. Then there's an imperial shitload of cruft in a processor that harks right back to the mid '70s. If we'd known we were headed here, I doubt that we'd have started there.

    Time for a do-over? Risc-V might be a good start, just on the grounds of minimal history. Then perhaps explore what else we could have done other than C and UNIX, knowing what we want our OS to do for us now. Not that I think UNIX is bad, just to ditch loads of compatibility layers. Personally I'd like to have no middle ground between assembler (bottom tier OS only) and high level languages. With a message passing language VM, similar in concept to Erlang's BEAM, we might even get to use all our CPU cores properly. Bonus!

    One thing I'm certain of, continuing down this path can't result in any improvement.

    1. Anonymous Coward
      Anonymous Coward

      Re: Simplify, simplify

      Well have a +1 for the BEAM VM, the closer to the metal the better. And OTP and Elixir at the programmer's level. It's all stood the test of time.

  9. Anonymous South African Coward Bronze badge

    It is a good idea to redo the whole development process.

    Unfortunately it will take time and money to do so.

    The problem is that most are using off-the-shelf products like libraries, source code snippets and the such, most of which may have unknown vulnerabilities lurking.

    QA does cost money, and it takes time to do a proper QA test. And there will always be a method or way which nobody thought of applying to QA in order to test for vulnerabilities - but hackers do have a lot of time and patience, and will happily try a lot of combinations just to crack and enter a supposedly secure system.

    1. Anonymous Coward
      Anonymous Coward

      And why am I paying for it ?

      I don't have to factor in buying my own fleet of F35s to protect my company - I pay taxes and the money goes on aircraft carriers.

      If these are state level attacks why am I the one that's responsible for the defence ?

      Either cut the defence budget, cut my taxes and put the responsibility on me, or redirect some of those defence $Tn toward defending me from Russian hackers rather than adding a 19th carrier group

      1. Claptrap314 Silver badge

        As the much-misunderstood general testified, "There are no civilians." If you want the government to protect you (ie: be a civilian), then you will need to have a government-provided and government-built CPU, smart phone, OS, and all apps. No sites available unless they have been approved by the government (and no changes on them without going through change management).

        I don't think you will be happy.

        1. Yet Another Anonymous coward Silver badge

          I would just like my government to be on the same side as me.

          Discover a zero-day and keep it to themselves in order to use it against 'domestic threats' - and announce that they knew about it only when the Chinese use it against me !

          1. Claptrap314 Silver badge

            Are you aware that in recent decades, a TLA would occasionally contact a company and say, "Hey, sign this NDA. We need to talk." followed by "We've observed foreign actors compromising your systems in the following fashion. You need to fix that. Quietly." To which the response is, "Huh. I can see that the attack would work, but we don't see any evidence of it being used."

            The general conclusion was that the intelligence services were doing exactly what a rational actor would want them to do in a hostile world.

            But yeah, not always.

  10. ForthIsNotDead

    Rust to the rescue?

    I do wonder.

    Are we on the precipice of being able to deliver reliable software on a large scale? It's in our hands I suppose. Regarding Rust specifically, I note that we've had languages such as Java that have already solved issues such as memory fragmentation and memory leaks, though those languages never solved the shared resources/threading/race-conditions issues that plague so many large-scale commercial software development projects.

    Rust in particular has made great strides in solving these problems, at the expense of some complexity, it could be argued. I do think that, metaphorically speaking, we're standing at the foot of some event horizon in classical computing paradigms. Rust (and it may not be Rust, but some derivative that has similar ideas and conventions, just expressed differently) may be the start of the next phase of software development technology, where it's effectively not possible to write the software 'wrongly'.

    It's in our hands. Or, within reach. It's up to us to grab it and use it.

    1. ScissorHands

      Re: Rust to the rescue?

      Rust (or any similar language) can't help with errors in code logic. But it definitely helps with 70% of existing errors, which are of the memory/data-race/side-effect variety and which usually can't be caught in development, only in production. Rust ensures, at least, that if it compiles, it's mostly free of those.

      Cue everyone complaining that Rust is hard and the compiler is slow...

    2. hammarbtyp

      Re: Rust to the rescue?

      Delivering reliable software on a large scale has been achievable for years. I was working on applications that delivered 9 9's reliability in the 90's and most of the internet architecture is based on similar concepts.

      The big problem was it required moving away from the standard software paradigm and using concepts like functional programming, which despite its benefits has been (and maybe always will be) niche. Generally companies found it easier to shoehorn on features rather than use languages and architectures which natively supported them. While I understand the attractiveness of Rust, I am not convinced it will do any better.

      1. ScissorHands

        Re: Rust to the rescue?

        Look around. You may think all that software was reliable but how stringently was it tested, really? Did you have fuzzers back in the day like we have now? If you need to recompile it today, how much of its behaviour was defined by the choices of the previous compiler about what to do on the several places where a language has "undefined behaviour"?

      2. Claptrap314 Silver badge

        Re: Rust to the rescue?

        Do you even know what 9 9's reliability means? I learned SRE at Google. I consider 6 9's to be theoretical.

        1. Anonymous Coward
          Anonymous Coward

          Re: Rust to the rescue?

          I think they meant "five nines".

          1. hammarbtyp

            Re: Rust to the rescue?

            Nope I meant 9 9s https://stackoverflow.com/questions/8426897/erlangs-99-9999999-nine-nines-reliability

            In truth that is based on system uptime, but the techniques were there to achieve a high level of predictability and uptime. Code supervisors and the ability to hot swap code go a long way, plus the lack of side effects in functional languages

            1. Claptrap314 Silver badge

              Re: Rust to the rescue?

              #1: system uptime has almost 0 to do with service uptime. And we are talking about services.

              #2: system uptime reliability has a hard limit of power supply uptime reliability. (I'll give you the network for free.)

              You can only choose one of those.

    3. Roland6 Silver badge

      Re: Rust to the rescue?

      You jest!

      Rust has effectively exactly the same hole in it as Turbo Pascal had: don't like the limitations of the language? Then open a hole and do it in assembler.

      You may object, but fundamentally, good coding relies on the programmer maintaining good practice and having a level of informed oversight to ensure such (useful) features aren't being needlessly used and abused.

  11. Anonymous Coward
    Anonymous Coward

    I remember

    In 2002, Bill Gates made exactly this point, and that's when Microsoft went into the 'Trustworthy Computing' initiative, because we'd lost all trust in client/server security.

    In 2012, at the 10 year point, Microsoft released a memo called "At 10-Year Milestone, Microsoft’s Trustworthy Computing Initiative More Important than Ever"

    Next year, at the 20 year point, I expect Microsoft will release a memo that simply says, "Sorry guys"

    1. ScissorHands

      Re: I remember

      That 70% figure I quoted for vulnerabilities caused by memory/data-race/side-effect errors? It comes from Microsoft et al. That's why they are firefighting it right now with Rust: they have sponsored the Rust Foundation while developing an in-house equivalent codenamed Verona.

      Apple is working on a Rusty Swift and although Google went to the trouble of hiring the best minds in language development to create Golang, they're still using Rust inside the Fuchsia kernel.

      1. Claptrap314 Silver badge

        Re: I remember

        Tell me. Does Rust prevent SQL injections? XSS? Unsafe object deserialization?

        You've pulled the 70% out of thin air. Rust deals with a certain common, but ultimately narrow, class of low-level bugs. Certainly an improvement in many cases, with a cost. But you're not going to save Tinkerbell with it.
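        To make that distinction concrete, here is a minimal C sketch of why memory safety and injection safety are different properties. It uses SQLite's C API purely for illustration, and the table and inputs are made up; the point is that pasting user input into a query string is injectable in any language, and binding parameters is the fix.

        #include <stdio.h>
        #include <sqlite3.h>

        /* Injectable: pass name = "x' OR '1'='1" and the WHERE clause matches everyone. */
        static void lookup_unsafe(sqlite3 *db, const char *name)
        {
            char sql[256];
            snprintf(sql, sizeof sql, "SELECT id FROM users WHERE name = '%s';", name);
            sqlite3_exec(db, sql, NULL, NULL, NULL);
        }

        /* Parameterised: the input is bound as data and can never become SQL syntax. */
        static void lookup_safe(sqlite3 *db, const char *name)
        {
            sqlite3_stmt *stmt;
            if (sqlite3_prepare_v2(db, "SELECT id FROM users WHERE name = ?;",
                                   -1, &stmt, NULL) != SQLITE_OK)
                return;
            sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);
            while (sqlite3_step(stmt) == SQLITE_ROW)
                printf("id=%d\n", sqlite3_column_int(stmt, 0));
            sqlite3_finalize(stmt);
        }

        int main(void)
        {
            sqlite3 *db;
            sqlite3_open(":memory:", &db);
            sqlite3_exec(db, "CREATE TABLE users(id INTEGER, name TEXT);"
                             "INSERT INTO users VALUES (1,'alice'),(2,'bob');",
                         NULL, NULL, NULL);
            lookup_safe(db, "alice");            /* prints id=1               */
            lookup_safe(db, "x' OR '1'='1");     /* prints nothing: no match  */
            lookup_unsafe(db, "x' OR '1'='1");   /* WHERE is now always true  */
            sqlite3_close(db);
            return 0;
        }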

    2. Michael Wojcik Silver badge

      Re: I remember

      Conventional wisdom in the field is that Microsoft has basically been coasting on Trustworthy Computing for years, particularly since the TC group was moved into Cloud and Enterprise in 2014.

      For the first several years, Microsoft's TCI actually made a huge contribution, at both the large scale with principles and practices like the SDL and the R&D done by the TC group, and all the small-scale work in finding and fixing vulnerabilities across the portfolio. The problem was they were starting from such a dire position with a huge codebase full of technical debt. (They're far from the only organization that was, or still is, in this position, of course.)

      Then after 10 or 12 years the executive team seems to have more or less decided that things were good enough, and they could just maintain that level. They kept things like the SDL and tooling, and there's still some research being done, but the big push seems to be long over. And anyone who reads the bulletins knows they've become much less transparent about vulnerabilities since the XP days.

      There are occasional nice bits like integrating Address Sanitizer into Visual Studio, but the days when software security was high-profile seem to be over. To take one example at random, Visual Studio Code doesn't support checking signatures on extensions. So all those extensions that devs (who are probably running VSC with elevated privileges for some damn reason) are eagerly downloading from the marketplace are one compromised account away from being replaced with malware. If they aren't malware already.

      1. Robert Grant

        Re: I remember

        It would be nice to see articles written about this sort of thing on El Reg.

  12. Yet Another Anonymous coward Silver badge

    Remember the cold war - in the UK

    I remember that the government spent a lot more effort on defending against "domestic threats" than against the Russians

    If the government had put as much effort into the CCCP as it did into spying on CND, Greenpeace, the NUM, TUC etc. Although it probably didn't help that at the time all the security services were headed by Russian assets

    Of course things are completely different now.

  13. Claptrap314 Silver badge

    Much sound and fury...

    signifying nothing.

    At the top of the article, you cite the utterly debunked Bloomberg magic chip as credible. And the quality of the analysis doesn't improve much from there.

    I agree with the final paragraph, 100%. But ill-informed busybodies pontificating is not going to help anything.

  14. Someone Else Silver badge

    The last paragraph of the article stated:

    Denial has stopped working. Either we lose the game – or we change it.

    Reminded me of the quote from the movie War Games:

    Strange game. The only winning move is not to play.

  15. Filippo Silver badge

    The problem is not technical, it's philosophical. If you had an engineer who consistently made buildings that collapsed, you would not fix the issue by providing him with stronger structural materials; you would fix it by sacking him and getting another engineer. Or, even better, setting up a legal framework where people who can't build a stable house don't get to build any house at all, and if they do, they are personally liable.

    Similarly, the fundamental issue cannot be fixed with better languages or better compilers or better OSes or better CPUs. Rather, the entire field needs to be rethought. But I don't see any way of doing that without making professional development so onerous that 90%+ of software companies fold, and the rest start charging 10x current prices.

    https://xkcd.com/2030/

    1. Tim99 Silver badge

      " But I don't see any way of doing that without making professional development so onerous that 90%+ of software companies fold, and the rest start charging 10x current prices."

      "You make that sound like a bad thing" (Gene Hunt - Life on Mars). See also: Sturgeon's Revelation"...

  16. Binraider Silver badge

    The critical mass of what you need to know to be a successful systems programmer is now so large that I'd argue it's impossible to stay up to date on every last aspect of how things are being used or misused. The guys that wrote that math library you like? Sure, they are probably pretty knowledgeable. But such libraries might have changes in behaviour between patches. How does your complex playing-card tower of interlinked libraries respond? In my case, we have test cases with expected results baked into the software. Changes as a result of dependencies changing then show up in the test cases. This is all well and good, but it also means a program is never 'done'.
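    A minimal sketch of that "expected results baked in" idea, in C. The routine mylib_mean() is a made-up stand-in for whichever third-party library you depend on, and the reference value would be pinned from a previously validated run; the point is that a dependency quietly changing its behaviour trips the check in the build, not in the field.

    #include <assert.h>
    #include <math.h>
    #include <stdio.h>

    /* Stand-in for the real third-party routine under test. */
    static double mylib_mean(const double *v, int n)
    {
        double s = 0.0;
        for (int i = 0; i < n; i++) s += v[i];
        return s / n;
    }

    static void regression_test_mean(void)
    {
        const double data[] = {1.0, 2.0, 3.0, 4.0};
        const double expected = 2.5;      /* pinned from a validated release */

        /* If an upstream patch changes the result, this fires in CI. */
        assert(fabs(mylib_mean(data, 4) - expected) < 1e-12);
    }

    int main(void)
    {
        regression_test_mean();
        puts("regression tests passed");
        return 0;
    }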

    The infosec world is every bit as hideous as math library arguments, and I don't envy anyone trying to stay on top of it. I'm not convinced a human can, in fact, such is the degree of complexity and potential interaction between things not necessarily meant to interact. This leads towards architecturally embedded security over software. Has anyone really got this right yet? Spectre et al. very much say no.

    But, more than anything else, as long as the spy agencies want their backdoors baked in, binary blobs from big suppliers basically cannot be trusted without implicit acceptance of the blob. Trusting a more open supply chain and auditing it is a whole other kettle of problems, but it is at least auditable.

    And then there's the potential for self-writing code in the not too distant future, even if early attempts resemble the n-monkeys-writing-Shakespeare problem right now.

  17. Sparkus

    William Gibson might have started it all, John Brunner pushed it forward brilliantly, and Neal Stephenson put a contemporary face on it

    https://www.amazon.com/Cryptonomicon-Neal-Stephenson-ebook/dp/B000FC11A6/

  18. Claptrap314 Silver badge

    Rule #1: The consumer is king

    We've gone through this many times in different permutations. The FDA was created to try to drive out the snake oil salesmen. The professional boards (mostly medicine & law) exist to try to keep the worst practitioners from causing the death of their clients. But in the end, the consumer remains king. We had a perfectly good consumer cell phone with a much better security model than Apple, let alone Android. Remind me, how is RIM's retail branch doing these days?

    Okay, so consumers don't really understand the dangers of bad security. Even if they did, security is a distributed threat--99.9999% secure means 100% insecure. To argue that consumers are willingly going to bear the costs of security is to argue that Communism works. (Hint to the Millennials: even the Pilgrims could not pull it off.)

    How about we start at the other end of the problem: what exactly is meant by "this server is secure"? istoomuch.jpg How about, "this code is secure". Care to back up that claim? May I see your MA in mathematics? Because proving that a piece of code actually does what you want it to do, and nothing else, is at least equal to a thesis. (And, yes, I do have one of those.) That's assuming that the compiler has also been proven. And whatever was used to create the compiler. And the OS. For both. And the cpu. For all. Moreover, your CPU must not only do what the architecture says, but must be side-channel free. That means (and I have the background to say this) either taking a 10x hit to speed, securing not just your code, but all code running on the server--including cross-domain interactions between applications (think: SQL injection), or getting completely new cache architecture.

    Okay, so somehow we have a magically secure programming environment. Luckycharms.img And, you want some new code. You going to hunt down a mathematician to write it? Oh, but maybe we can use the model that engineers use. The mathematician doesn't have to write the code himself--he can check the work and certify it. No. grumpycat.img Code is not a piece of metal that can be machined into tolerance. It does not have microfractures that only spread at a given rate. There is no procedure to guarantee changes are correct. kurtgodel.img When we are talking about demonstrating security, we are talking about creating valid proofs, and while two might be three times as fast, that's two fully trained mathematicians, not a mathematician and someone else.

    But security really is that important. Why don't we pass some laws & regulations? yeahsure.jpg Just how long will a politician stay elected if he passes a bill that outlaws 99.99999% of existing code?

    We are losing the war. We do need to fight it. But we must focus our efforts where they can be productive. We need public awareness of the pervasiveness of the security threat. Maybe we can get Bill Gates to fund a publicity campaign. I do believe that liability legislation has its place, but the only way that is going to survive is if it is extremely incremental. That is, too little to be effective until some sort of phase change happens. Something in the same spirit as the GDPR, but significantly more limited in relative scope.

    Yeah, I really try not to think too hard about this. Kinda like the Cold War.

  19. Nightkiller

    Ah, yes. Software. The Internet.

    This time, Utopia will be different.

    Yeah, right.

  20. Boris the Cockroach Silver badge
    Boffin

    We've lost

    It's as simple as that... by the way, I'm a child of the cold war.... 4-8 mins of terror before being vapourised..... or, if unlucky, being badly injured by flying debris before several rads' worth of fallout lands.

    So I started off in programming with things like the ZX81 and the Z-80 chip, which led me to a great discovery: if I allocated memory for a data file to land in, and that data file was too big, it would overwrite whatever was in its way regardless.... remind me again just how many M$ patches (and others) have been for f'ing buffer overruns. And that's just one class of programming f-ups.
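    That ZX81 discovery is still, at heart, today's classic unchecked copy. A minimal C sketch of the same f-up and its boring fix (the buffer size and input are made up for illustration):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char stored[8];                        /* room for 7 characters + NUL */
        const char *incoming = "far too many bytes for that buffer";

        /* The classic mistake: no length check, writes straight past the end.
         *     strcpy(stored, incoming);        <- undefined behaviour          */

        /* The fix: bound the copy to the destination's size.                   */
        snprintf(stored, sizeof stored, "%s", incoming);   /* truncates safely  */
        printf("stored: '%s'\n", stored);                  /* prints "far too"  */
        return 0;
    }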

    As I've explained over the years to the readers of El Reg, the level of programming I do at the moment is pretty simple and basic, but it carries one major drawback: the level of destruction I'm capable of if I get it wrong, which makes me (and colleagues) extremely careful programmers.

    Yes, we use modular techniques where we know and have proved that a certain section of code is clear of bugs so we can reuse it elsewhere (spindle-to-spindle transfers spring to mind), but it takes 4 yrs or so for the mindset of writing code/testing/re-writing the code to develop, plus we have the manglement screaming "is it done yet?" every 5 mins...

    However, from what I've seen of commercial code for software products, all that I go through on a daily basis for engineering the code goes out of the window, with the focus on getting the software done ASAP, getting it to cover the basics, and getting money off the customer... and not caring if it burns down, falls over, then sinks into the swamp (taking everyone with it).

    Would professional qualifications help? Not really, as you'll end up teaching to pass the exam without noticing how fast things move along in the software world; after all, case law based on "Jones vs Regina in 1934" isn't going to change as fast as "coming soon: windows 10.1114.. sorry, update: windows 10.1115".

    All we can do is teach best practice, teach defensive programming, and teach testable modularity.

  21. DS999 Silver badge

    Attackers will ALWAYS win

    They only have to be right once. Defenders have to be right 100% of the time.

  22. Bitsminer Silver badge

    It's complicated

    While the author's admonition to revise the ways and means of production is well intentioned, it is not practicable.

    Software is complex, and large software systems (hello there smartphone) are made of too many components from too many authors (with widely varying skills) to allow a reorganization of how things are done.

    And security bugs are typically a very small percentage of all bugs (see https://daniel.haxx.se/blog/2021/03/09/half-of-curls-vulnerabilities-are-c-mistakes/ for an example). Note, Daniel points out 26 vulnerabilities amongst 2311 bugs; 18 of 26 were C mistakes, which is where "half" comes from.

    When the gadget you are making (hello Wifi router) is complicated then the process to build it is also complicated and the components and design are complicated.

    Managing complexity is the key here. Fail at that, you have SLS. Succeed, and you have Falcon 9.

  23. Wi1em

    Did you just say we should rewrite everything in Rust? Like, I agree completely.

  24. ecofeco Silver badge

    You get what you pay for and reap what you sow

    The government and corporations ran off the good coders decades ago along with the good I.T. staff who would enforce safe practices among the users, just to save money and hand out "most favored" contracts. Now we are seeing the consequences.

  25. Roland6 Silver badge

    In 1989, Robert Tappan Morris was arrested and prosecuted under the then brand-new Computer Fraud and Abuse Act.

    Nothing has changed, this is still the normal modus operandi of the US government.

    Over 30+ years we've seen very little evidence of security being taken more seriously by US government agencies; however, anyone found to expose this inconvenient truth will receive the full force of the US government via the US legal establishment.

    A good start would be for judges to start tossing out prosecutions and extradition attempts where those bringing the case cannot evidence that they had implemented basic security commensurate to the level required for the systems they alleged to have been accessed.

  26. Stuart Castle Silver badge

    I think the problem here is that this is a worldwide problem: we need to co-operate with each other to resolve it. The thing preventing that is that a lot of the people who discover security issues work, whether directly or indirectly, for one of the security services/military intelligence. The various agencies would be very reluctant to release details of any issues they find because they are likely to be exploiting them in systems in other countries, so they aren't going to want them fixed. The staff are likely not allowed to release them.

    Sure, they'll release details of the odd vulnerability, but it's likely just so they appear to be co-operating, and won't be any of the really important vulnerabilities which are potentially useful.

  27. Tron Silver badge

    This column pushes us down a road that leads to national governments controlling and licensing all software and hardware that may be used within their borders. Nothing new could be used until the government had checked that it was OK. To keep us all safe, and for reasons of national security.

    I have every faith that Dido is ready for her next challenge.

  28. Sanctimonious Prick
    Pirate

    Russia! China!

    Right. Russia. China. Whomever. As long as America can identify an enemy, it can provide funds to American companies specialised in protecting America and waging war on America's adversaries, and partly owned by American politicians (hello, Raytheon).

  29. nicboyde

    Violence IS an answer.

    While writing software and using it is as cheap as it is, you cannot hope to regulate it. Or to defend yourself against it.

    You can however make the consequences of writing bad or malicious code so expensive that people will regulate themselves. This is what we do for crimes like theft, or murder. We make the consequences very hard to escape, expensive and public.

    This would mean changing the civil laws that permit authors to avoid responsibility for the consequences of poor code. Make them pay. They'll fix the bugs if it means not getting fined.

    It would also mean pursuing, with full vigour, the bad actors who deliberately publish malicious code. Don't just shut down their botnets: arrest them and jail them. Noisily and publicly. Confiscate their houses and property. Make them pay. There is nothing like enough effort being put into suppressing cybercrime, so it is flourishing. Instead, prosecutors aim at the folk who crack poorly-secured government websites. This is the limit of their interest and concern, and this has to change.

    Lawyers are expensive, and so are policemen, but their cost is but nothing compared to the economic damage caused by shitty code and shitty people.

    1. Anonymous Coward
      Anonymous Coward

      Re: Violence IS an answer.

      Quote: "....writing bad ... code...."

      main()

      { printf("hello, world!");

      return;

      }

      I just compiled this 47 byte program with:

      $ gcc -static -o hello hello.c

      The result was an executable 1702128 bytes long....yup 1.7 megabytes of an executable object file!

      I wrote 47 bytes of code.......how much library code THAT I DIDN'T WRITE got linked?

      ......and you want ME to be responsible for my code? This is a suggestion which will stop everyone writing code!!!

      1. Mike 137 Silver badge

        Crap compiler

        main()

        { printf("hello, world!");

        return;

        }

        executable 1702128 bytes long

        I use an optimising C compiler for PIC (typically 64k memory) that produces executables generally smaller than the source file.

        1. Anonymous Coward
          Anonymous Coward

          Re: Crap compiler

          @Mike_137

          Did you notice the word "static" or the word "linked"? Perhaps you might use your "optimising compiler" to create a static executable........smaller than 47 bytes......I don't think so!

          *

          ....and then there's the point about SOMEONE ELSE'S CODE being linked......you may have noticed that @nicboyde was suggesting retribution for code authors........when most code authors are using some code (maybe a lot of code) WRITTEN BY OTHERS.

  30. Anonymous Coward
    Anonymous Coward

    Trusting government vetting of software is easy to get right

    You just have to get lots of them to do it. Have the appropriate agencies for the US, UK, China, Russia, North Korea, Israel, Iran, etc. all check the same code. If they all okay it, it is probably good enough for your use. Probably not needed, but if some of them are also REQUIRED to use code that has gone through the vetting process that implements something they need, it is probably going to be as secure as humans are capable of producing. Giving them the source code would make the job easier for them, but might not be necessary.

  31. cortland

    Woe are we...

    I still have some slide-rules. Need I make or purchase a one-meter abacus?

    NB: I lost all my marbles a LONG time ago.

    Cortland

  32. Martin Gregorie

    You've all missed half the problem

    It's all very well, as well as true, to say that crap code means everything is insecure as hell, but that's only addressing half the problem pointed out in the article.

    The other half, which every commentard so far has ignored, is the amount of deliberate misinformation put on the intertubes by malicious actors and then repeated blindly by people too ignorant or lazy to think about the garbage they're amplifying. For keeping this sort of stuff from contaminating the interwebs, software mistakes and omissions are totally irrelevant.

    The only way to keep lies and other malicious garbage off the 'information highway' is to take a leaf out of traditional journalism and prevent publication of anything that is not directly and unambiguously attributable to its author. Think about it: in a traditional newspaper or broadcast channel, every article or story carries its author's name, just as no 'letter to the editor' is published without being checked first. This, done properly, makes everybody directly responsible for what they say, write or publish. THIS is the true meaning of Free Speech, which has nothing to do with lies, abuse, etc written by some anonymous toe-rag.

    Requiring anything published on the internet to carry the author's unique, unfalsifiable identity, thereby making the author liable for their output would certainly kill off much of the unattributable nastiness pouring out of the social media, but of course that will never happen because the owners of the social media sites are making too much money from monetising it.

    Whistle blowing? This has nothing to do with Free Speech because, for starters, whistle-blowers' messages should never be published: they are, or should be, messages sent to some person or authority who can and will take action to fix the thing being complained about.

  33. Robert Grant

    I don't think we can blame "bad actors"

    The cast of Hollyoaks aren't all that tech-wise.
