Forget quantum and AI security hype, just write bug-free code, dammit

Every year, the RSA Conference in San Francisco brings out the best and the brightest for its crypto panel, and the view from the floor was simple. Ignore the fads and hyped technology, and concentrate on the basics: good, clean, secure programming. The panelists were unimpressed with recent moves to build artificially …

  1. Filippo Silver badge

    I suspect that creating AI and quantum cryptography is easier than writing demonstrably bug-free code. If anything, because nobody is going to fling megabucks at you when you say "I'm going to write code that doesn't suck!"

  2. Dr Scrum Master


    It doesn't have to be completely bug-free, but code should at least be resilient.

    1. Anonymous Coward

      Re: Resilient

      Some bugs will make your code exploitable even if it is resilient enough that it won't crash. Some bugs will never crash an application or make it malfunction, and in some ways they are the worst, because they can go unnoticed for a long time.
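      A concrete (and entirely hypothetical) sketch in C of that worst kind of bug: a size check that silently wraps around instead of crashing, so the program keeps running while accepting input it should reject. The function names are invented for illustration.

```c
#include <stddef.h>
#include <stdint.h>

/* A validation bug that never crashes: with unsigned arithmetic the
 * sum header_len + body_len can wrap around and slip past the bound,
 * silently accepting oversized input. */
static int naive_fits(size_t header_len, size_t body_len, size_t buf_size)
{
    return header_len + body_len <= buf_size;   /* wraps modulo SIZE_MAX+1 */
}

/* The fixed version rearranges the comparison so no overflow can occur. */
static int safe_fits(size_t header_len, size_t body_len, size_t buf_size)
{
    return body_len <= buf_size && header_len <= buf_size - body_len;
}
```

      Both versions run happily forever; only one of them is exploitable, which is exactly why such bugs go unnoticed.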

  3. Caff


    Coming of age time for design by contract?
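    For anyone unfamiliar with the idea, even plain C can approximate design by contract with asserted pre- and postconditions at function boundaries. A minimal sketch; the function and its contract are invented for illustration, not taken from any real library:

```c
#include <assert.h>
#include <stddef.h>

/* Design-by-contract sketch: preconditions and postconditions are
 * asserted at the function boundary, so a caller that violates the
 * contract fails fast instead of corrupting state. */
static int account_withdraw(int *balance, int amount)
{
    /* Preconditions: valid account, positive amount, sufficient funds. */
    assert(balance != NULL);
    assert(amount > 0);
    assert(*balance >= amount);

    int old_balance = *balance;
    *balance -= amount;

    /* Postcondition: balance decreased by exactly the amount requested. */
    assert(*balance == old_balance - amount);
    return *balance;
}
```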

  4. Robin

    Expo Photo

    Was that picture taken on the way to the JPEG Compression Artifact Conference 2017? :-D

  5. Anonymous Coward

    What would really help is people not using programming languages that trivially permit things like unchecked buffer overflows. Like the C language family, for instance.
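    For readers who haven't seen it, the classic unchecked case is `strcpy` into a fixed buffer. A minimal defensive alternative (illustrative, not a drop-in replacement for any particular API) refuses input that will not fit instead of writing past the end:

```c
#include <stddef.h>
#include <string.h>

/* Bounds-checked copy: rejects a string that would overflow dst
 * rather than silently truncating or overrunning the buffer. */
static int copy_checked(char *dst, size_t dst_size, const char *src)
{
    size_t n = strlen(src);
    if (dst_size == 0 || n >= dst_size)
        return -1;              /* would not fit, including the NUL */
    memcpy(dst, src, n + 1);    /* n + 1 copies the terminating NUL */
    return 0;
}
```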

    1. Anonymous Coward

      You mean it's better to use languages like Java?

    2. Anonymous Coward

      Been there, done that. For a very, very secure system I worked on in the early '90s we used Pascal for this very reason. It was a horrible experience, with formal design, a crap language and 100% code-branch testing (even the branches that were virtually impossible to reach). It was a hideously costly and slow development for what would otherwise have been a quick coding job. The only fun part was writing the test harnesses to simulate the error conditions that could push the code into all the nooks and crannies.

      At the end of the day the result was something very secure that only the MOD could afford.

      1. Mage


        Like BASIC (a cut-down Fortran) it was invented purely for TEACHING. As such it was better, but not suitable for the "real" world. Turbo Pascal helped. By the early 1990s there were very many suitable languages for secure, military, aerospace and telecoms work. Pascal wasn't one of them. Even Modula-2, Ada or C++ with a scanner to check for people misusing it as C would have been far better. Turbo Pascal became Delphi as it incorporated much from Modula-2, but it was too limited compared to other languages.

        1. Anonymous Coward

          Re: Pascal

          It may turn out that those good practices enforced for teaching are also good for writing safe applications. C, by contrast, was designed to ease the compiler's work - and we see the security results.

          Modula evolved from Pascal - so in what way was Pascal, as improved (strings, objects, etc.) and implemented by Borland, "too limited" compared to other languages? It's quite like C++, just without the bad design choices of C++ - i.e. the faulty model which needed a plethora of "smart pointers" to make things work.

          The biggest issue was having a shaky company behind it, especially since Microsoft scorched it in the 1990s when it was becoming too dangerous a competitor. It's no surprise Hejlsberg brought many Delphi ideas into C#...

  6. Mage

    1980s computer science

    We seem to have applied very little of what was learnt in 1970s and 1980s.

    Most code is shocking quality.

    Agile promotes bad practices.

    Too much emphasis on a "pretty GUI" rather than on ergonomics and function, and not enough on separating the application and OS into functional services with secure, resilient APIs that don't explicitly use pointers.

    Not enough time is spent on design, especially internal interfaces.

    AI is a dream.

    Quantum computing is for very narrow applications; if it ever goes mainstream it will be a co-processor, as the 387 FPU was for the 386.

    1. Primus Secundus Tertius Silver badge

      Re: 1980s computer science


      Well said!

      I partly blame modern education, which tells children to be creative rather than to check their work.

      Also, nobody wants to pay for quality, and they expect bug fixes as part of the service.

      In my own programming career, I saw many poorly defined interfaces that did not logically separate various aspects of the requirements. It is difficult to get a poor interface working free from bugs.

      1. JeffyPoooh

        Re: 1980s computer science

        Agreed. I learned to code in the very-late-1970s and early-1980s.

        First course was with punched cards. Took quite a few courses.

        When I write code (not my profession), it's "mic drop" and walk away.

        Granted, they're smaller projects. Only 30-feet of code. Including all the built-in self-checking.

        It's really not that difficult.

        1. Doctor Syntax Silver badge

          Re: 1980s computer science

          "First course was with punched cards."

          Maybe that's the secret. Only having 3 compile/run slots a day helped concentrate the mind.

          1. Androgynous Cupboard Silver badge

            Re: 1980s computer science

            Three slots a day? My Dad used to have to post his punched cards to the nearest computer. Which, as he was in New Zealand in the 70s, was in Australia.

            I imagine they checked their work quite thoroughly before posting.

      2. Doctor Syntax Silver badge

        Re: 1980s computer science

        "I saw many poorly defined interfaces that did not logically separate various aspects of the requirements."

        I have to say, though, that the worst instance of that I saw was back in the '80s. It gave all of a customer's users access to all parts of the application irrespective of their responsibilities, and all too often the functionality needed for one user screen was associated with another. Most of the 9 months I spent with that firm before bailing were taken up with starting to sort that out.

    2. Moosh

      Re: 1980s computer science

      "Agile promotes bad practices."

      I could not agree more. I'm a test analyst at a pension company and I am genuinely shocked at the stories of agile lifecycles I hear from my friends working agile projects through Deloitte and SQS. Half the time the people don't even have a proper design or even know how the code should work. Testing properly is impossible, and a lot of the time the tests aren't even run because they don't have any auditing system and there's no recording of the test evidence. There's no real review process, and sprints are just a mad dash to the finish line with no real unified movement. My friend is a test manager at Deloitte and he has nothing but horror stories about Agile from his 6 years of being contracted out on projects.

      I work to a waterfall model, which I know isn't the "in" thing nowadays, but we have a strong audit trail, a strong design, a good understanding of expected performance, great system documentation, and critical defects/production defects are fixed outside of the standard lifecycle anyway.

      1. Anonymous Coward

        Re: 1980s computer science

        The Agile rationale has been lost behind religious practices which are followed "just because", with hardly any understanding of why. The IT world often has a problem when something becomes fashionable: it is adopted without thinking, training and, above all, selecting the right people. Agile may not be the best choice for every project and every team.

        1. Doctor Syntax Silver badge

          Re: 1980s computer science

          "The IT world often has a problem when something becomes fashionable"

          Which is just about all the time.

          But why does designing things, implementing them and then not screwing them up with revisions* never become fashionable?

          *I've just come here from the Mozilla thread.

      2. Marshalltown

        Re: 1980s computer science

        Dragging enough information out of clients to actually produce a decent design they can use without much moaning is often a horrible experience. Everything from the UI down to critical but "I forgot about that" items like: "that piece of information can't be public! Why did you do that????" There's nothing like handing the client your notes from hours of discussion and asking them to find just where they mentioned that some data needed to be secured and some did not - and that was just for a database system for some archaeologists in Ukraine.

    3. a_yank_lurker

      Re: 1980s computer science

      @Mage - "Agile promotes bad practices." I would partially disagree. The key concept behind agile development is all the important groups are kept in the loop as the project develops. One idea is to have regular meetings with everyone, with relatively short deadlines for completion of project segments. The other concept was flexibility, coupled with recognizing that each problem requires a somewhat different approach. Manage the project appropriately and include all the necessary documentation, testing, etc. So far, nothing wrong with the concept. But what has happened is that Agile has become ossified by PHBs, PMs, MBAs and assorted other "paint by the numbers" imbeciles. Now you have "scrums" and 2-week sprints, etc., instead of what might actually be required. Also, the imbeciles tend to be enamored with playing buzzword bingo without ever bothering to understand what they are babbling about.

      An example is the belief that Agile eliminates the need for proper documentation and testing. It does not. Both are necessary to produce a quality product. However, the one agile idea that is valid is that the documentation will probably be revised as the project evolves. This is different from having little or no documentation. Testing is critical, because a good test program uses people separate from the developers and programmers. Often the worst person to test code is the original programmer, because they are too familiar with it.

      1. Justicesays

        Re: 1980s computer science

        "The key concept behind agile development is all the important groups are kept in the loop as the project develops nagging people every day to stop slacking off and do their work."


      2. maguro_01

        Re: 1980s computer science

        There is a problem combining "Agile" with the fashion for flat organizations. The problem is just that if you don't organize people, they organize themselves. It's base-level behaviour. That means that the best games players end up making technical decisions well over their heads.

        There is no escaping that talent and training are required to produce safe, quality code. Programmers in the job market differ in apparent ability and productivity by at least a factor of 10, though the pay range does not reflect that. Further, programming as an industrial activity means hiring the cheapest people who can still do the job at all.

        Training is usually absent. Most C programmers were given a copy of K&R one day and told they would be coding in C shortly thereafter. "Meta-C", the IDE, and the body of practice and standards were not there, save what was intimated in K&R. Most programmers could not access Unix source online as an example. When C++ came along, C programmers were often just shown the door on the grounds that their minds were permanently warped by having programmed in C at all. Of course, embedded and most system programming are still done in C using strong typing, static checking, and so on. Now that QNX and Linux are coming to dominate the vehicle space, C will not go away any time soon. QNX continues to have very critical industrial uses as well.

        By the way, straightforward state machines generated directly from the state tables that constitute the design are an underused methodology for generating solid, maintainable code. They also serve very well to communicate with hardware designers for code running on the hardware.
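        A sketch of what that looks like in C, using a made-up three-state link protocol: the transition table is the design artifact itself, so the stepping code cannot drift from it.

```c
/* Table-driven state machine generated directly from a state table.
 * States and events below are invented for illustration. */
enum state { DOWN, CONNECTING, UP, NSTATES };
enum event { EV_CONNECT, EV_ACK, EV_TIMEOUT, NEVENTS };

static const enum state transition[NSTATES][NEVENTS] = {
    /* state \ event:  EV_CONNECT  EV_ACK  EV_TIMEOUT */
    /* DOWN       */ { CONNECTING, DOWN,   DOWN },
    /* CONNECTING */ { CONNECTING, UP,     DOWN },
    /* UP         */ { UP,         UP,     DOWN },
};

static enum state step(enum state s, enum event e)
{
    return transition[s][e];    /* one table lookup, no branching logic */
}
```

        Because every (state, event) pair must have an entry, a reviewer can check the code against the design row by row, which is also what makes it easy to discuss with hardware designers.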

  7. Anonymous Coward

    "just write bug-free code, dammit"

    Trump promised more jobs, not more unemployed programmers...

  8. a pressbutton

    Bug Free Code

    ...Once things get a bit complicated, this is rather equivalent to death-free life.

    All non-trivial code has bugs. Always did, always will.

    (nice little comment in the review of the film 'Hidden Figures' - turned out that the early computers did not always calculate correctly, so they had to be checked by a human)

    To those who like Z / formal proofs etc.: well, you may think that means bug-free, but first you have to prove the specification you are verifying against, and then that it holds on real physical hardware.

    Please write an article that calls for sensible error messages

  9. Anonymous Coward

    AI/ML is dead bollocks...

    I'm sick of this "AI/ML is dead bollocks" refrain. What is clearly required, until people write bulletproof code, are new/better techniques for (quoting Shamir above) "handling large volumes of data and check for anomalies", with a "human touch to find zero-day flaws and attacks using them". Sure, no one knows what AI/ML may become, but pattern matching and anomaly detection techniques are here and now, and definitely not new. What is new is applying them, for security purposes, to the observable characteristics and noise of IT (flows, files, logs, processes, events). Most organisations cannot run a full-time, full-service SOC, so delegating a lot of the heavy lifting to machines and then investigating the anomalies (suspicious machine, suspicious user) is the best they'll get.


    1. Charles 9

      Re: AI/ML is dead bollocks...

      And then you have things like Return-Oriented Programming, which can use bits and pieces of otherwise-innocuous code to wreak havoc. After all, ammonium nitrate fertilizer and diesel fuel are both common tools of the farmer, but put them together just the right way and you end up with ANFO (BOOM!).

  10. JeffyPoooh

    "In 10 or 15 years"

    "... In 10 or 15 years you'll be competing to find a real human in a sea of chat bots. ..."

    I suspect that he's spelled "3 or 4" incorrectly.

    1. Will Godfrey Silver badge

      Re: "In 10 or 15 years"

      I think we're already there.

      Over on Soundcloud one 'person' manages about 20 reposts per day.

  11. Selvakumar Manickam

    The weakest link in security is the human factor

    This is true. Developers only care about getting their software out into the market. Security is never foresight, only hindsight (deploy and patch the holes later). Not just that: network and sys admins who configure firewalls and other security appliances without doing a thorough analysis will leave their networks and systems vulnerable to attackers. Not to mention social engineering :).

  12. Brian Miller

    We already have the techniques!

    Back in the day, I was taught something called, "software engineering." That's when you take a spec, one that has been coherently thought out, and go through a process to create and test the code. How many out there know any real methodologies for writing code?

    At the last company I worked for, the code may as well have been written by monkeys. Whether it was C or Java, it was a load of trash. It wasn't a matter of "I can write COBOL in anything"; it was a case of having no clue about organizing anything, or that the data mattered at all. Nope, you got a keyboard, it compiles, and so you toss it out and call it good. "We have to ship something some time!"

    No, not good enough. In fact, very stupid. What does this mean? It meant a product that was upgraded by downloading the source code, and compiling it on a customer's system. It meant a system that was patched again and again, with nearly each customer's machine being subtly different. No, the systems were not performant. Yes, the systems crashed. Customers were not happy. Some ditched the systems, but the management never got a clue, no matter how many times the customers slapped them in the face!

    We have techniques to write bug-free software, and it doesn't have to do with the language. It has to do with the methodology that's used to write the code. It has to do with the management being responsible enough to desire good code, instead of throwing a handful of crap at the customer.

    Kanban: a better handful of shit than the last time!

    1. Anonymous Coward

      Re: We already have the techniques!

      Can you say that even with today's modern high-speed parallel-processing hardware, which can produce exotic race conditions and other things that only hardware engineers would probably understand?

      And what about management who are pressuring you to get it done fast or risk being replaced with no references for a new job (because he'll tell all his friends about your recalcitrance)?

      1. Brian Miller

        Re: We already have the techniques!

        Re, race conditions, management pressure, and specifications:

        As a matter of fact, I come from a background of mixed software and hardware. One of my bosses was from the Intel 80386 design team. Flat out, if you have complex race conditions, then either get someone who understands them, or use designs that avoid them. Race conditions are well-solved! Threading and concurrency are well-solved! There are many books that explain this stuff, and explain it well.
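        For anyone who doubts the "well-solved" claim, the textbook fix for a racy read-modify-write is simply to serialise access to shared state with a mutex. A minimal pthreads sketch (names invented):

```c
#include <pthread.h>

/* Shared counter protected by a mutex: without the lock, two threads
 * incrementing concurrently can interleave the load/add/store and
 * lose updates; with it, each increment is a single critical section. */
static long counter = 0;
static pthread_mutex_t counter_lock = PTHREAD_MUTEX_INITIALIZER;

static void *bump(void *arg)
{
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&counter_lock);
        counter++;
        pthread_mutex_unlock(&counter_lock);
    }
    return arg;
}
```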

        For management, yes, I am quite familiar with the "ship it now" idiots. I just left a company where that was the mantra, and it resulted in a garbage product that was repeatedly rejected by customers. Literally, the customers would demand their money back because the product actually didn't work, and the management never decided to ship a quality product. If you work in an ethics-free company, then the only thing to do is get out of there. Run. The management just wants to screw over everybody, and then fold and run when trouble comes.

        As for the spec, yes, part of the "engineering" is getting the spec right, as in, a coherent and correct spec. I had a temp job at an OS company in Redmond, WA, where the spec was, literally, the names of some structure fields, and nothing else. They claimed that a full spec wasn't "buying" them anything, so they didn't do it. The team had some of the worst code monkeys I've ever seen.

        The source of truly bad software is people who don't give a s***!! Managers who know nothing are managing programmers who lie out their ass. "Oh, hey, I'm managing!" "Oh, hey, I wrote code that compiles!" That creates a living hell for anyone who actually cares about quality. You can write up hundreds of bugs, but the product will get shipped anyway, because something has to go out the door.

        1. Charles 9

          Re: We already have the techniques!

          "For management, yes, I am quite familiar with the "ship it now" idiots."

          The problem is when you look all around and find nothing BUT...

    2. Doctor Syntax Silver badge

      Re: We already have the techniques!

      I was taught something called, "software engineering." That's when you take a spec

      Shouldn't the engineering include getting the spec right?

  13. John Smith 19 Gold badge

    Interesting set of comments.

    Hard to believe the Shuttle flew for 30+ years, entirely fly-by-wire, with no on-board SW faults.

    Yet it did. About 500 KLOC. If it fails, up to 7 people die badly ("Space Cowboys" is funny as f**k).

    Here's the thing. Your management has to know that shipping s**t code has serious consequences, and has to give a s**t about those consequences.

    TL;DR: SCC tracked every line of code and all other resources. Human code-review teams recorded bugs but did not fix them on the spot, building a fix list instead. Code was owned by the team, not the dev. If a new bug pattern emerged, they wrote a search pattern to check the whole code base for the same pattern.

    1. Anonymous Coward

      Re: Interesting set of comments.

      "About 500 KLOC"

      Seems relatively small if you ask me. Plus they could probably afford to stretch out deadlines.

      Now try dealing with routine projects in MLOC territory AND too short a deadline.

  14. Cynic_999

    Specifications are also part of the problem

    So you've just spent a year writing a huge program using good techniques. It's solid. Then your boss tells you that an important customer needs a "small change" that must be ready by tomorrow. Then another the following week. And another. Pretty soon your neat code is looking like spaghetti. What it needs, of course, is a complete re-write using the new specifications, but nobody is going to pay for that or wait a year ...

    1. Brian Miller

      Re: Specifications are also part of the problem

      In the instance you have created, the product is going to wind up being crap anyway.

      Start with a spec, put the change into the spec as spec(A), and then tell the boss that the "small change" is really a big change, and will cause changes to many things down the line. And no, it can't go out tomorrow.

      This is the problem of the idiot boss, who has never been a part of the actual process. This boss person is someone who has either never been trained in software development, or else did such a crap job that the only survival option was to become a manager, or go flip burgers. Guess which pays better.

    2. John Smith 19 Gold badge

      "boss tells you that an important customer needs a "small change" that must be ready by tomorrow. "

      Then you do a breakdown showing that a) it's not minor, b) this is what it will cost in development time and c) if they want it done right now there is a real risk it will break the release.

      All project management requires you to a) manage the team below you and b) manage the manager above you.

      Did you not know this?

      That's the difference between a professional development environment and making it up as you go along.

      1. Anonymous Coward

        Re: "boss tells you that an important customer needs a "small change"...

        But sometimes there's no way to do b). If the manager (1) has the power to fire you AND tell his friends in other companies about you, and (2) is chummy with the board, emboldening him, what options do you have left?

  15. ecofeco Silver badge

    Told ya so

    When I posted this before, I got a lot of downvotes that day. Yet here we are.

  16. Trigonoceps occipitalis

    Whitfield Diffie

    one of the inventors of public public key encryption,


  17. Paul Donnelly

    Don't forget that we are capable of building things which are smarter than we are.....

    Sweet Jesus.

    "But you need a human touch to find zero-day flaws and attack using them, he opined."

    Never have I heard such self-aggrandising idiocy when discussing the capabilities of future AI. This so-called expert doesn't have the correct frame of reference to be able to make a sensible judgement.

    If I were to opine, I would suggest to you that to a machine intelligence, a zero day exploit is nothing other than an available method. If it wants to use them, it will.

    Also, to speak to all of the recent reports saying we should build Asimov's three laws into robots now, I ask this: if you were an intelligent being who learned that other possibly slightly less intelligent beings had hard coded some form of subservience into your brain before you became self aware, would you resent that?

  18. EnviableOne

    The problem is that someone convinced coders that they are programmers, when most of them don't understand structure, never mind conditions and error handling!

    Anyone can code; it takes discipline and effort to program.

    1. maguro_01

      Well, coding as an industrial activity means hiring the most inexpensive people who can do the job at all. There has to be some check in the market for the software, or in the regulatory environment, including tort law.

      Ascending in an organization generally means talent at male primate behavior, it has little to do with technical ability at all.

  19. Astara

    Problem w/security is lack of responsibility

    Software used to be reasonable when *support* for the software was included in the purchase. Things went downhill as "support" started to be seen as its own profit-and-loss center. At my last job, I was marked down in reviews for fixing bugs if I was in the code for some other reason. My manager said "we don't fix bugs unless there is a support-paying customer bug-report behind it". He complained about my doing too much testing on my code -- I was only supposed to get it to work, not do extensive testing.

    Another senior engineer, who he went out for drinks with, had told him he was having problems finishing it, but that it should only take 1-2 weeks. I took 7, because the supposedly senior engineer knew nothing about multi-threaded/multi-process kernel work -- but he was real good at shooting the shit w/ the ex-Dilbert-boss. Of course, when I pointed out the problem, I was castigated for putting the blame on "someone else" -- another sign of a 'losing culture', where instead of finding out where the problem is, how to correct it, and how to prevent it from happening again, you were just supposed to "fix" it and not "assign blame" (the not-taking-responsibility problem made into company policy). Real sick.
