A Code War has replaced The Cold War. And right now we’re losing it

Remember the Cold War? For me, growing up in America meant living under the permanent, intangible threat of sudden vaporisation by thermonuclear attack. It added a piquant pointlessness to everything. Ashes, ashes, all burn down. Yet the world stubbornly refused to end. Communism collapsed, Western neoliberal democracy seemed …


  1. amanfromMars 1 Silver badge

    Sometimes it is best to lose graciously ... in order to save oneself from totally unnecessary pain

    Space may also point the way toward a solution with slides into an economic and military superiority. It may be possible that a similar approach - using "moonshot" technologies like artificial general intelligence and high qubit quantum computing - could place attackers so far ahead of defenders that defence and opposition and competition becomes effectively impossible and ruinously expensive and quite certain to guarantee a changeover into what would definitely be Greater IntelAIgent Gamesplay.

    What's not to like?

    1. amanfromMars 1 Silver badge

      Re: Sometimes it is best to lose graciously ... to save oneself from totally unnecessary pain

      Don't forget, El Regers, a silent downvote on pages and comments here, is a golden opportunity crazily squandered to share with all readers in far away fields, an alternative opinion and views which may or may not be full to overflowing with common sense worth listening to.

      Such is to be highly regarded and much prized, whether they prove themself to be valuable or worse than just worthless and excessively exclusively self-serving.

      1. HildyJ Silver badge
        Boffin

        Re: Sometimes it is best to lose graciously ... to save oneself from totally unnecessary pain

        I'll bite.

        First, there's no way of knowing if we're losing since our TLAs don't do data dumps.

        Second, hacking is relatively cheap and we won't be bankrupting Russia or China by upping our cyber budget.

        Third, attacking a state directly is a nuclear option (figuratively and, in some discussions in the past, literally).

        Which brings me to my conclusion. What kept the Cold War from going hot was the MAD concept - Mutually Assured Destruction. We need to reach an understanding with Putin and Xi about what is acceptable and what is not. Our big stick can be the Russian pipeline explosion and the Iranian centrifuge sabotage (neither of which we acknowledge).

        But we have to accept that not all state sponsored hacking rises to that level just as we don't attack Russia for the actions of their clients, the Taliban, in Afghanistan. For lesser hacks, I'd add doxing to the sanctions but, ultimately, it's a question of making our sites more secure.

        1. amanfromMars 1 Silver badge

          There is Only One Holistic System of Systems ... Networking AI Colossus

          Hi, HildyJ ...... and Welcome to another world.

          To imagine that any State assembly/Parliamentary body/Presidential executive/national collective anywhere, East or West, is responsible for the news that supposedly advises us of what we are to believe in and to do, with the serial infliction of pain and suffering or reward and pleasure on vast swathes of others, as surely the news and media certainly currently does, is something to consider is much more a perverse convenience rather than identifying the actual inescapable factual reality.

          The latter in truth, and actual factual augmented virtual reality, may very well be stated in this mightily overlooked and inconveniently prescient monologue ‽ . ......... "The World is a Corporation" (Network, 1976)

          No matter which though you might conclude to be the more acceptable and easiest to believe the more likely and possible and therefore probable, it will have just a few beings arranged as around a round table with a board of governors/systems drivers with the brightest and smartest of directors leading by virtue of their commanding contributions ensuring continuing stealthy supremacy of both their invisibly cloaked, renegade rogue and controlling non-state actor future activities with CHAOS.

          And all here on El Reg should already be well aware of what CHAOS certainly is. It has often been freely shared to keep y'all abreast and au fait with virtual developments so that ignorance is not rendered your only available guide to future shenanigans/upcoming COSMIC events.

          Test yourself here. Fill out what you think the above and following two expand into. The correct answers are easy enough to find, and you can have a lot of fun being silly and filling in your view. Take care though, because that can easily tell those who would be interested to know, a great deal more about you than maybe you would like them to possess ...

          CHAOS .......

          COSMIC .......

        2. Anonymous Coward
          Anonymous Coward

          Re: Sometimes it is best to lose graciously ... to save oneself from totally unnecessary pain

          They could actually stop hacking completely just by shutting down the internet and then rebuilding it without connections to those countries that condone hacking.

          That being said, that would include pretty much everyone else when it comes to state-sponsored attacks.

          So perhaps the answer is for everyone to stop cyber attacking each other and save their taxpayers' money for something constructive instead.

    2. Robert 22

      Re: Sometimes it is best to lose graciously ...

      "using "moonshot" technologies like artificial general intelligence and high qubit quantum computing - could place the defenders so far ahead of the attackers that assault becomes effectively impossible"

      What if the attacker uses the same technologies?

      We have a complexity problem - everything is so complex, there are bound to be weaknesses.

  2. KittenHuffer Silver badge
    Linux

    The Code War?!?

    I remember the Cod war!!!

    South of the Equator the winner was ----------------->

  3. Mike 137 Silver badge

    Yet another uncomfortable truth

    Unfortunately, the harsh reality is that software development is the only branch of engineering that doesn't have established standards and methods that are proven to yield trustworthy results. It took a couple of hundred years for such standards and methods to be worked out in civil, mechanical and electrical engineering, and it may just be that software development hasn't been around for long enough yet. But the big difference is the extent of deployment of the technologies while they are still immature. Software now permeates almost everything, but we aren't yet able to assure its adequacy. That's the real problem we face, and it can only be solved by establishing standards and mandating their application.

    In no other branch of engineering can a totally self-trained practitioner be taken on trust and tasked with delivering mission critical (or indeed life critical) systems. It's about time we stopped this practice in software development for those domains. So we need enforced standards that deliver safe systems, as we have even for technicianship disciplines such as electrical installation and gas fitting. Nobody would argue for mandatory certification of games developers, but I would most certainly insist on it for those developing things that can affect livelihoods and lives, and such certification should ensure understanding of first principles, not just knowledge of the knobs and levers of proprietary tools. What we have had to date instead is a worrying trend of successive generations of tools that make it easier to deliver results without paying attention to the essential first principles - a deskilling that perpetuates the very problem we should be trying to solve.


    1. Paul Crawford Silver badge

      Re: Yet another uncomfortable truth

      It is far worse than just the lack of "sound principles" being used; these days software comes with license agreements that abdicate responsibility for the consequences of crap code.

      What other discipline would get away with that?

      Add to that the fact that a lot of the connectivity and inter-dependence being added is driven by marketing droids (or worse, advertising brokers), and the future looks bleak indeed.

      1. This post has been deleted by its author

      2. jmch Silver badge

        Re: Yet another uncomfortable truth

        The real root cause is lack of accountability. If a bridge or building collapses, the engineers / architects / builders involved in the faulty design / construction / certification etc are legally, and in some cases, criminally liable. Same with doctors / patients etc etc.

        Classify software into (eg) 3 classes: life critical (failure causes loss of life or severe risk thereof), economically critical (failure causes monetary damage above a certain threshold), and not critical (failure 'merely' causes a bunch of pissed-off users). Apply criminal liability for failures of the first class and civil liability for those of the second class. This type of software will rapidly improve.

        The flip side, of course, is that the world has become addicted to software that is incredibly cheap for all the immense convenience it provides, without properly realising it. So critical software would become a lot more expensive, and slower to roll out.

        1. Aitor 1 Silver badge

          Re: Yet another uncomfortable truth

          Yes, that worked great in Grenfell, the previous fires, the huge amount of prefab buildings that were seriously defective in the 60s to 80s, etc etc.

          In practice they are still not accountable: as everyone did "best practices in the field", nobody caused the deaths, but between all of them they caused it. We would have the same shenanigans, with "lessons learned, action taken" statements and that's it.

          That does not mean we should be happy with code cowboys doing their stuff, but most of the code cowboys I know do have a relevant degree, master's or even PhD.

    2. ScissorHands

      Re: Yet another uncomfortable truth

      Several forests have been decimated to produce MITRE code standards, but nobody follows them (Minimum Viable Product is the law of the land, security be damned) and even following them to the letter, C and derivatives should be classified "Unsafe at any speed", even (or especially) legacy code. Rip it all up and start again with GC/RefCount/RAII languages.

      Nuke it from orbit, it's the only way to make sure.

      1. Anonymous Coward
        Anonymous Coward

        Re: Yet another uncomfortable truth

        So many people are producing code who shouldn't have been let anywhere near a compiler.

        When I used to be a dev, back in the 90s, I was always baffled by the number of mistakes made by people who never bothered to even RTFM.

        C code, makefiles, all borrowed from much larger projects, that couldn't work together, improper usage of Unix tools, etc ...

      2. Someone Else Silver badge

        Re: Yet another uncomfortable truth

        Something about a poor workman and tools...

      3. Loyal Commenter Silver badge

        Re: Yet another uncomfortable truth

        I'd like to see you build embedded software, or code that must run in a resource-limited environment with a high-level language.

        Sometimes bit-bashing in C is the only way to do it, but I am inclined to agree that when it isn't the only way to do it, most of the time it's the wrong way.

        1. Someone Else Silver badge

          Been there, done that, got the tee-shirt

          I'd like to see you build embedded software, or code that must run in a resource-limited environment with a high-level language.

          I don't know who the elliptical "you" is in your statement, but try this: Piece of medical equipment written for an 80186 in C++ (OK, with a small BSP written in assembler), hand-written multitasking kernel (also written in C++), full 20-color GUI (this was the middle 90's; screens that would support this were not ubiquitous), multiple languages and fonts. So, let's see now:

          --> Embedded software: Check

          --> Resource limited environment: Check

          --> High-level language: Check

          --> Gov't regulated market: Check

          --> A metric buttload of oversight, review, and team and individual discipline: Check, Check and Check.

          --> Wrong way to do it: Definitively NOT check.

          1. Loyal Commenter Silver badge

            Re: Been there, done that, got the tee-shirt

            I wouldn't call C++ a high-level language here though; it's being compiled down to machine code, and still has low-level memory management (malloc, and pointers). It's really just C with bells on. Perhaps wrongly, I refer to C and C++ interchangeably.

            Languages like Java or C# compile to bytecode, which runs in an interpreter. Yes, you can run Python on embedded devices, but again, this relies on an interpreter, which makes it a lot slower than something compiled to run on bare metal.

            I'm not thinking of anything so sophisticated as something that can run a GUI either - things like Arduinos and Pi Picos that are being used to control things at a very basic level, and which have very limited memory and storage to work with. Things where available memory is measured in kilobytes.

            Sometimes you just don't have the luxury of the headspace to run even a cut-down kernel if you need to use that memory for things like shifting a lot of data about between I/O channels.

            1. Anonymous Coward
              Anonymous Coward

              Re: Been there, done that, got the tee-shirt

              Tektronix storage scopes of the 80s-90s. Resource-constrained but networked multi-processor systems (about 7 CPUs in the lower end models). All coded in Smalltalk. Nice GUI, fast, responsive and they never ever crashed. Try achieving that in C!

    3. a_yank_lurker Silver badge

      Re: Yet another uncomfortable truth

      As someone trained in other STEM fields and now in IT, I'd say it is not being 'self-trained' that is the actual problem. Any junior programmer will need training on the proper security techniques needed for the applications they are working on. This is not likely to be covered in any real depth in their course work.

      One major difference between software engineering and the other engineering disciplines is personal, professional liability requirements, as evidenced by PE licenses. The requirements in Feraldom are generally an appropriate STEM degree, passing the 'Engineer In Training' test, working for several years under the direct supervision of a PE, then passing the 'Professional Engineer' test. The PE is the only one who has the legal authority to certify that the work meets all the standards and legal requirements and that the project can proceed. As a non-PE, I can only work below one who would be supervising my work. The PE is personally and professionally liable for anything approved. There is no equivalent in software of a PE.

      1. Mike 137 Silver badge

        "it is not being 'self-trained' that is the actual problem"

        You're right. It's self-trained practitioners being taken on trust without any standardised formal way of validating their competence. Those trained by others undergo continuous (even if only informal) validation during their training, but the self-trained don't have this advantage, so they may not, in the words of Dirty Harry, even "know their limitations" themselves.

    4. Dan 55 Silver badge
      Stop

      Re: Yet another uncomfortable truth

      Why are we beating ourselves up over this? Customers expect security but don't want to pay for it, sales managers negotiate the highest price possible and try and reduce internal costs to the minimum, project managers just want it fucking done yesterday.

      It's a whole pile of stupidity on top of developers' backs. That's what's got to change.

      1. Mike 137 Silver badge

        "Why are we beating ourselves up over this?"

        Because people are dying already due to it. We're not just talking about desktops and phone apps here. Boeing Dreamliner generators shutting down due to a control unit counter overflow (fortunately caught before an accident); Airbus A400M in Seville losing throttle response due to missing parameters in an ECU (half the flight crew killed); Boeing 737 Max (no detail needed). Plus the failures of "semi-autonomous" vehicles to spot obstructions in the road before running into them; ordinary vehicles requiring regular "updates", etc. etc.

        The same "standards" of quality seem to apply across the development landscape and it's starting to matter a lot.

        1. Dan 55 Silver badge

          Re: "Why are we beating ourselves up over this?"

          And how can individual developers who care about the software they're writing and want to produce better work change a corporate culture which is minimum viable product, lowest cost, fastest time to market, and box ticking without any meaningful work done behind it?

          Answer - they can't. Corporations and every single employee must be forced by law to follow standardised procedures. That's the only way there is going to be change.

    5. Loyal Commenter Silver badge

      Re: Yet another uncomfortable truth

      I'd argue that there are established standards and methods. The issue is that they are not enforced and are seen as a cost by those holding the purse-strings.

      We have building standards and inspections to make sure unscrupulous builders don't cut corners and build unsafe structures, but we have no equivalent in software engineering. This isn't because the correct techniques aren't known. SOLID, design patterns, test-driven development, Agile (done properly) and so on. It is because the bosses tell us, "just make it work, I don't care how".

      The problems start at the requirements gathering stage as well. Adequate security needs to be designed into software, otherwise it gets bolted on afterwards as an afterthought. Again, nobody wants to pay for proper analysis before a line of code is written. Would you let a builder loose to build you a house without getting an architect to draw up plans first?

      The root of the problem is that doing things properly isn't required. Not doing it properly is often cheaper, so there is a financial pressure on businesses to cut corners to compete. I'm all for legislation that says commercial software should meet certain regulations, such as requirements being fully documented and signed off by a professional business analyst, security considered at the design stage and specified, segregation principles followed, code documented, and so on.

      We'd all end up with better software, and most developers I know would love to be given a decent spec to work from without having to do the analysis as we go along.

  4. CrackedNoggin Bronze badge

    Is it possible that companies that take security seriously will survive while the rest perish? Of course, there is also "bad luck".

    1. Anonymous Coward
      Anonymous Coward

      "bad luck" is essentialy a historically accepted term meaning "it's too complex and I couldn't account for all of the variables" and should generally exist in a sentance with "perhaps if we had of sacrificed a goat on that day to the gods it wouldn't have happened" to show how out of place the notion is in the modern world.

      In the aviation industry it didn't take them long to come to the conclusion that wings falling off planes had much to do with design or construction failures and little to do with good or bad fortune. The "bad luck" of aircraft crashing has largely been systematically engineered out of existence, with unsafe aircraft grounded until they are made safe rather than pilots chanting incantations over good luck charms to invoke the gods' good fortune and blessings for the journey, and the same applies to cars, trains, structures and pretty much anything else related to engineering.

      In IT terms "bad luck" in security tends to mean that there was a serious lack of security; if the environment you are securing is "too complex and you can't account for all of the variables", to go by my term above, then one simple answer is to reduce the attack surface to an extent that is manageable.

      That Exchange cockup the other day? It was via Exchange allowing you to exploit errors in the coding of the login page without sending any authorisation.

      It's trivially protectable against; require your legitimate users to connect to your network via a VPN. The attack surface of your network is then reduced to the VPN login (and SMTP), both of which can be secured with technology and procedures that were boring (and secure) twenty years ago. If you were doing this then you were effectively impervious to that entire class of attack unless the attackers start on the inside of your network (which should be guarded against by other access control measures).

      Meanwhile plenty of companies (over fifty thousand?) got hit because they staked a bet that code thrown out at an ever-increasing rate with ever-decreasing quality would be 100% safe to expose directly to the internet.

      And the internet these days is a very, very hostile place if my firewall logs are anything to go by.

      1. Claptrap314 Silver badge

        " If you were doing this then you were effectively impervious to that entire class of attack unless the attackers start on the inside of your network. (which should be guarded against by other access control measures)."

        Seriously? Were you found in a cabbage patch this morning?

        Even if, by some act of magic, your VPN was perfectly secure, that does close to nothing about one of your users who mistakenly clicked on an ad or a clickbait article and has now been rooted.

        Your users' machines have been compromised. All of them. For quite some time. Now explain how your security posture with twenty-year-old procedures is adequate.

        We are still not settled on exactly what should be considered proper MFA. Keeping up is going to matter for a while.

        1. Anonymous Coward
          Anonymous Coward

          That particular subset of information about perimeter security doesn't give you any information about what other security measures I have in place.

          What I am saying is not particularly controversial; I simply say that if you write down the possible methods you can devise to attack your network, then you can also come up with ways of making that bloody difficult for somebody else, and thus reshape the security landscape in your favour. You don't even need to spend lots of money.

          For instance, Group Policy can eliminate vast swathes of attacks without spending a penny; if you download the Office GPOs then you can force a GPO for Office to disable unsigned macros, which instantly eliminates the threat of macro viruses (which El Reg keeps reporting on, so I assume people still get them).

          You can also prevent Office from downloading or running content from the internet that somebody has embedded within a document, and there goes another class of attacks, generally without inconveniencing your end users.

          AppLocker/Software Restriction Policies can prevent people from running things, so by blocking users from running unauthorised executables you can in fact prevent people from getting a virus by running a trojan, or by visiting a webpage with a dodgy advert, since both web browsers and email clients execute files in the %temp% dir; block .exe files etc. from running there and you stop them running in the first place, rather than allowing everything to run and then operating a blacklist of files for your antivirus scanner to search for, which is a never-ending game of whack-a-mole. Limit the users to running things from %ProgramFiles%, %windir% and your authorised files on the network and you can basically write off another entire threat class if you are restrictive enough.

          And that's really only scratching the surface of what you can do with freely available tools built into every version of Windows installed on the planet. But you'd have had to sit down, look at the freely available options that you could configure, and figure out what you'd need to do in your environment to protect the users without preventing them from working (because, after all, you can apply policies to groups or even individual users...).
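
          Going back to the unsigned-macro point, here is a minimal audit sketch in Rust of "is the policy actually set on this box" (this assumes the third-party winreg crate and Office 2016's per-user policy path; the key path and value names vary by Office version and are illustrative, not gospel):

          use winreg::enums::HKEY_CURRENT_USER;
          use winreg::RegKey;

          fn main() {
              // Path written when the Word "VBA macro notification settings" GPO is applied
              // (adjust the 16.0 for your Office build).
              let path = r"Software\Policies\Microsoft\Office\16.0\Word\Security";
              let hkcu = RegKey::predef(HKEY_CURRENT_USER);

              match hkcu.open_subkey(path) {
                  Ok(key) => {
                      // 3 = disable all macros except digitally signed, 4 = disable all
                      let vba: u32 = key.get_value("VBAWarnings").unwrap_or(0);
                      println!("VBAWarnings policy = {vba} (0 means not set)");
                  }
                  Err(_) => println!("No macro policy configured for Word on this machine"),
              }
          }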

          Doing none of this and then getting cryptolocked/hacked/etc and saying "oh, that was bad luck" suggests to me not "bad luck", but "bad planning".

    2. batfink Silver badge

      Unfortunately...

      While this would be an ideal result, unfortunately this isn't what will happen in practice.

      As we all know, proper security comes at a cost. Securing your perimeter is a cost, securing your code is a cost, etc etc.

      It's much cheaper to turn out unsecured crap. So, companies who take security seriously are at a financial disadvantage against those who don't - and so the latter are likely to be more commercially successful.

      The actual cost of bad code isn't as bad as we would all hope. Remember TalkTalk? Did they go out of business, as they deserved? Fuck no. So, they can just go on being insecure and making more of a profit than their secure competitors.

      We would all like to see this done properly. Unfortunately the chancers who actually run these firms have a different view. IMO it'll only be when the regulatory penalties outweigh the cost of security that we'll see any real change.

      1. Keith Oborn

        Re: Unfortunately...

        And here is the nub. I was "inside" TalkTalk just after that event, and have also been "inside" their major competitors. It was pure chance that they got hit and the others didn't, as all have similar skeletons in the cupboard. The same will apply to any company that has acquired a smaller one. Security audits on acquisitions are slow and expensive, and the combination of accountants and shareholders won't wear them.

        Then we have the overall problem of software (and hardware/firmware) quality. Not only is this expensive - which means doing it puts you at a huge disadvantage in a competitive market - but also getting your development team (*why* is it not called *engineering*, I wonder ;-) to care is very hard. In my last company the CEO gathered the entire team after a major release: "Right: there are some bugs in that one. You will fix them before we start on the next one." Unanimous response: "Oh, we just want to work on the new stuff." Guess what happened? And those were *obvious customer-facing* bugs.

        Until there is solid regulation of this industry - similar to aviation - we won't see any improvement. The likes of BCS with their "standards" are flies buzzing round a dinosaur. And yes, this sort of regulation will seriously slow things down. That is a *good* thing. The most execrable mantra in the industry is "move fast and break things". What if the thing that is moving fast is, say, a 737MAX?

  5. David Glasgow

    It's a double bluff?

    Cunning plan, in a nutshell:

    We're in, and we've got what we need

    Ok. In a stupid way, say we're going to do something stupidly

    Job for a Politician?

    Yup

    Should we actually do something stupid?

    It's always an option.

    Misdirection?

    Yup. And they'll carry on thinking we're idiots.

    Bonus!

  6. Pascal Monett Silver badge

    high qubit quantum computing and artificial general intelligence

    If we wait on those to solve our connectivity problems, we might as well unplug everything.

    I don't know what the final solution is, but a good start is to stop using other people's code with blind trust. Oh sure, take a module from GitHub, by all means, but don't link to it. Bring it in on your dev server, check the code, test it to see if it works. If it is suitable, then port that to your production environment.

    If there's an update on GitHub, start over.

    Yes, it is tedious and time-consuming. The alternative is SolarWinds123.
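
    One cheap way to enforce that "deploy exactly what you reviewed" step is to note a hash of the module when you audit it and refuse to build if the vendored copy ever differs. A minimal sketch, assuming the third-party sha2 and hex crates and a made-up vendor path and hash:

    use sha2::{Digest, Sha256};
    use std::fs;

    fn main() {
        // Hash recorded when the module was reviewed (placeholder value).
        let reviewed_sha256 = "0123456789abcdef...";

        // The vendored copy sitting in your own tree, not fetched at build time.
        let bytes = fs::read("vendor/module.js").expect("vendored file missing");
        let actual = hex::encode(Sha256::digest(&bytes));

        if actual != reviewed_sha256 {
            eprintln!("vendored module changed since review: {actual}");
            std::process::exit(1);
        }
        println!("vendored module matches the reviewed copy");
    }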

    Your choice.

    1. DJV Silver badge

      Re: high qubit quantum computing and artificial general intelligence

      Yes, this ^^^ absolutely!

      Recently, I've been looking into web-based page turning/flipping code. Most of the available options seem to come as black boxes that haul in God-knows-how-many extra libraries (and all the unknown cruft that comes with them) and each one looks like it probably only does 95% of what I want (but each does a DIFFERENT 95%) - and adding the extra necessary 5% is probably impossible without weeks of work.

      Two weeks ago I found a simple piece of code that was about 7 years old and, while it was buggy and partly relied on the abandoned CSS Regions proposal that only Chrome implemented for a short while, I managed to pick it apart myself, gained a full understanding of how it worked and fixed it up so that it now works properly in a modern browser (all in a couple of days). I now have a usable tool that stands alone, has a tiny footprint, and can be maintained and improved by myself as necessary.

      1. Michael Wojcik Silver badge

        Re: high qubit quantum computing and artificial general intelligence

        For years, the consensus among most of the regulars on comp.lang.javascript was "all Javascript libraries are rubbish". And they had the evidence to support that.

        What's changed since then, for the most part, is that now all Javascript libraries are horrifying tangles of many sorts of rubbish.

    2. Michael Wojcik Silver badge

      Re: high qubit quantum computing and artificial general intelligence

      Hey, QC and AGI worked great for the unicorns. I hear good things about securing systems using bee pollen and feng shui, too.

      Frankly, I don't think much of this entire column. On the one hand, yes, the state of software is deplorable and has been for decades. As others have pointed out, there are a number of contributing causes, but the economics of software rank well up there; it's not simply a case of lacking the will. On the other hand, many people are working hard to improve IT security. It's a big boat and it will take a while to turn it around.

      Sophomoric analogies that don't stand up under a moment's scrutiny and appeals to magical future technology aren't going to help. What does help is understanding security theory and practice, recognizing that the situation is complex and won't be resolved by any simple solution, analyzing the threats and ways to mitigate them, and doing the work.

  7. TVC

    I'm currently reading This is How They Tell Me the World Ends by Nicole Perlroth. I worked in IT for 40 years and some of that involved me in cyber security but this book is a real eye opener and a thrilling read to boot.

    1. hammarbtyp Silver badge

      I would also recommend Sandworm: A New Era of Cyberwar and the Hunt for the Kremlin's Most Dangerous Hackers by Andy Greenberg.

    2. Anonymous South African Coward Silver badge

      Anything and everything has its attack surface. You just have to find a way in.

      I would also recommend reading Masters of Deception.

      https://en.wikipedia.org/wiki/Masters_of_Deception

  8. Greybearded old scrote Silver badge
    Megaphone

    Simplify, simplify

    Bruce Schneier says that complexity increases your attack surface. (It's a long way down in a very long read, but worthwhile.) Yet we add more and more layers, until even a networked document viewer has pretensions of being an operating system. Well above its station, if you ask me. Then there's an imperial shitload of cruft in a processor that harks right back to the mid '70s. If we'd known we were headed here, I doubt that we'd have started there.

    Time for a do-over? RISC-V might be a good start, just on the grounds of minimal history. Then perhaps explore what else we could have done other than C and UNIX, knowing what we want our OS to do for us now. Not that I think UNIX is bad, just to ditch loads of compatibility layers. Personally I'd like to have no middle ground between assembler (bottom tier OS only) and high-level languages. With a message passing language VM, similar in concept to Erlang's BEAM, we might even get to use all our CPU cores properly. Bonus!
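
    To make the message-passing shape concrete, a tiny sketch (Rust std channels standing in here for a BEAM-style mailbox; it only gestures at the idea, it isn't the real thing):

    use std::sync::mpsc;
    use std::thread;

    fn main() {
        // Share-nothing workers: each owns its state and only communicates by message.
        let (tx, rx) = mpsc::channel();

        let workers: Vec<_> = (0..4)
            .map(|id| {
                let tx = tx.clone();
                thread::spawn(move || {
                    tx.send((id, id * id)).expect("receiver gone");
                })
            })
            .collect();

        drop(tx); // drop our copy so the receive loop ends when the workers finish

        for (id, result) in rx {
            println!("worker {id} sent {result}");
        }

        for w in workers {
            w.join().expect("worker panicked");
        }
    }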

    One thing I'm certain of: continuing down this path can't result in any improvement.

    1. Anonymous Coward
      Anonymous Coward

      Re: Simplify, simplify

      Well have a +1 for the BEAM VM, the closer to the metal the better. And OTP and Elixir at the programmer's level. It's all stood the test of time.

  9. Anonymous South African Coward Silver badge

    It is a good idea to redo the whole development process.

    Unfortunately it will take time and money to do so.

    The problem is that most are using off-the-shelf products like libraries, source code snippets and the like, most of which may have unknown vulnerabilities lurking.

    QA does cost money, and it takes time to do a proper QA test. And there will always be a method or way which nobody thought of applying to QA in order to test for vulnerabilities - but hackers do have a lot of time and patience, and will happily try a lot of combinations just to crack and enter a supposedly secure system.

    1. Anonymous Coward
      Anonymous Coward

      And why am I paying for it?

      I don't have to factor in buying my own fleet of F35s to protect my company - I pay taxes and the money goes on aircraft carriers.

      If these are state-level attacks, why am I the one that's responsible for the defence?

      Either cut the defence budget, cut my taxes and put the responsibility on me, or redirect some of those defence $Tn toward defending me from Russian hackers rather than adding a 19th carrier group.

      1. Claptrap314 Silver badge

        As the much-misunderstood general testified, "There are no civilians." If you want the government to protect you (ie: be a civilian), then you will need to have a government-provided and built CPU, smartphone, OS, and all apps. No sites available unless they have been approved by the government (and no changes on them without going through change management).

        I don't think you will be happy.

        1. Yet Another Anonymous coward Silver badge

          I would just like my government to be on the same side as me.

          Discover a zero-day and keep it to themselves in order to use it against 'domestic threats' - and announce that they knew about it only when the Chinese use it against me!

          1. Claptrap314 Silver badge

            Are you aware that in recent decades, a TLA would occasionally contact a company and say, "Hey, sign this NDA. We need to talk." followed by "We've observed foreign actors compromising your systems in the following fashion. You need to fix that. Quietly." To which the response is, "Huh. I can see that the attack would work, but we don't see any evidence of it being used."

            The general conclusion was that the intelligence services were doing exactly what a rational actor would want them to do in a hostile world.

            But yeah, not always.

  10. ForthIsNotDead

    Rust to the rescue?

    I do wonder.

    Are we on the precipice of being able to deliver reliable software on a large scale? It's in our hands I suppose. Regarding Rust specifically, I note that we've had languages such as Java that have already solved issues such as memory fragmentation and memory leaks, though those languages never solved the shared resources/threading/race-conditions issues that plague so many large-scale commercial software development projects.

    Rust in particular has made great strides in solving these problems, at the expense of some complexity, it could be argued. I do think that, metaphorically speaking, we're standing at the edge of some event horizon in classical computing paradigms. Rust (and it may not be Rust, but some derivative that has similar ideas and conventions, just expressed differently) may be the start of the next phase of software development technology. Where it's effectively not possible to write the software 'wrongly'.

    It's in our hands. Or, within reach. It's up to us to grab it and use it.
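
    As a tiny illustration of the race-condition point (a generic sketch, nothing to do with any particular project): the compiler simply refuses the unsynchronised version, so that whole class of shared-state bug is caught before the code ever runs rather than showing up in production.

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // let mut total = 0;
        // thread::spawn(|| total += 1); // won't compile: the closure borrows
        //                               // `total` across threads, unsynchronised
        let total = Arc::new(Mutex::new(0));

        let handles: Vec<_> = (0..8)
            .map(|_| {
                let total = Arc::clone(&total);
                thread::spawn(move || {
                    *total.lock().unwrap() += 1; // the only way in is through the lock
                })
            })
            .collect();

        for h in handles {
            h.join().unwrap();
        }
        println!("total = {}", *total.lock().unwrap()); // always 8
    }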

    1. ScissorHands

      Re: Rust to the rescue?

      Rust (or any similar language) can't help with errors in code logic. But it definitely helps with 70% of existing errors, which are of the memory/data-race/side-effect variety and that usually can't be caught in development, only in production. Rust ensures, at least, that if it compiles, it's mostly free from those.

      Cue everyone complaining that Rust is hard and the compiler is slow...

    2. hammarbtyp Silver badge

      Re: Rust to the rescue?

      Delivering reliable software on a large scale has been achievable for years. I was working on applications that delivered 9 9's reliability in the 90's and most of the internet architecture is based on similar concepts.

      The big problem was it required moving away from the standard software paradigm and using concepts like functional programming, which despite its benefits has been (and maybe always will be) niche. Generally companies found it easier to shoehorn on features rather than use languages and architectures which natively supported them. While I understand the attractiveness of Rust, I am not convinced it will do any better.

      1. ScissorHands

        Re: Rust to the rescue?

        Look around. You may think all that software was reliable but how stringently was it tested, really? Did you have fuzzers back in the day like we have now? If you need to recompile it today, how much of its behaviour was defined by the choices of the previous compiler about what to do in the several places where the language has "undefined behaviour"?
