Don't blame Willy the Mailboy for software security flaws

There's a low rasp of a noise being made in the software world. Customers want software vendors to hold programmers responsible if they release code containing security flaws. Actually, that's not strictly true. Security vendors want customers to start wanting software vendors to hold the programmers responsible. As we …


This topic is closed for new posts.
  1. Pete 2 Silver badge

    Puts the kibosh on 3rd party libraries?

    So I develop some applications. Being an efficient kinda guy, I don't write all the software (down to the interrupt routines that handle the video controller) myself. I rely to a large extent either on code that already exists in the O/S, or on code that exists in high-level libraries from third parties: possibly supplied with the development suite I use. Further, I don't inspect every piece of source from these libraries - or even just the version I developed on - for any weakness. Where does that leave me in terms of being liable for any security holes? Maybe holes in the libraries, or ones that arise because I use them in ways that are unorthodox (although still in compliance with the API)?

    Unless you turn the whole world of software development into a chartered profession, with, errrr, professionals who carry the whip (as opposed to their managers doing the lashing) and have the authority to demand certain development practices, this sort of professional liability can never be made to stick. Imagine the situation where a product design meeting takes place. The marketing peeps say they "need" a new product to compete with their rivals. The softies say: "fine, but unless YOU accept liability for any and all security flaws, it will take 4 years to develop and cost 8 times as much, in professional fees, accredited methodologies, external audits, testing, re-testing and final certification." Under these circumstances, can you see any new software ever being developed - or will we have time-travelled back to the 60's, where software was held in awe - and not just because of its price?

    1. heyrick Silver badge

      Not to mention...

      ...who gets kicked due to compiler bugs, OS bugs, hardware bugs, unaccounted glitches, and all the other quirks of nature that crop up.

      But yes, library code is a big 'un. Several moons ago I recompiled several open source programs to a newer call format for more recent hardware. Oh, the chaos. Somewhere along the way, a number of bugs were fixed, which totally broke programs coded to rely on, or work around, said bugs. Not to mention new bugs introduced. Oh, and seemingly arbitrary changes to what is exported in public scope and what isn't... Gah. I didn't have the time or inclination to sort it out, so "some features don't work". It was a toss-up between attempting to fix a library with a fixation for single character variables (i.e. "r.w.c[4] = whatever") or, the most likely case, ripping it all out and writing my own replacement. Which, you know, will probably introduce a whole different set of quirks.


      I remember looking to update an email proggy. When I realised the code was copying items from the email header into a byte array at 40-byte offsets, I abandoned the project. Hell, I cobbled in some code to dump the thing to the screen every time it was touched, and most stuff ended up overwriting other stuff (look at your email headers, see what's over 40 bytes). After the assignment was a loop to poke nulls into the data so C would recognise it as a set of strings. The last item? Subject. Subject longer than 40 bytes? Well... I didn't look. I knew what would happen, and if the message header parsing was THAT screwed up, I didn't want to know about the rest. At least it explains the propensity for mangled email addresses in that new-fangled "name" <email> format... <sigh!>
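      [Ed: the failure mode described above - fixed 40-byte slots with no length check, so a long value spills into the next field - can be sketched as follows. This is a Java sketch rather than the original C, and all names and header values here are hypothetical.]

```java
// Sketch of the bug described above: header values copied into fixed
// 40-byte slots of one array, with no bounds check on the value length.
public class HeaderSlots {
    static final int SLOT = 40;
    // Three adjacent slots, standing in for e.g. From, To, Subject.
    static byte[] table = new byte[SLOT * 3];

    // Buggy copy: writes the whole value starting at the slot offset,
    // spilling into the NEXT slot whenever value.length > SLOT.
    static void buggyStore(int slotIndex, byte[] value) {
        System.arraycopy(value, 0, table, slotIndex * SLOT, value.length);
    }

    public static void main(String[] args) {
        // 51 bytes: 11 bytes too long for its slot.
        buggyStore(1, "A display name much longer than forty bytes, easily".getBytes());
        buggyStore(0, "short@example.com".getBytes());
        // Slot 2 now contains the tail of slot 1's oversized value:
        System.out.println(new String(table, 2 * SLOT, SLOT).trim());
    }
}
```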

  2. Anonymous Coward
    Anonymous Coward

    I read it like this

    That the phrase is meant to apply when vendors contract out work to developers other than their in-house team, as much as to emphasise the role that developers have in creating or avoiding dodgy code.

    The mailboy link-up is wrong though - outside of losing important pieces of paper on his rounds - the mailboy is a minor cog in the system. Developers are more important and have a greater effect for good or ill.

    Consider house-building - in the UK there are builders and there are subcontractors (both might be one-man operations). If you have an extension built and the wall starts cracking, you'd want to have a word with the builder you'd paid for it, and at the same time hold some strong opinions on Dave the brickie, whom you'd been fuelling with tea and custard creams out of your own pocket.

  3. some vaguely opinionated bloke


    "Developer warrants that the software shall not contain any code that does not support a software requirement and weakens the security of the application..."

    So the developer warrants that:

    - all of the code is there only to support the specific [lol] requirements, and if there is any code which does not support a requirement, then that non-supporting code does not weaken the security of the application.

    Rather than:

    - whether or not any of the code actually fulfils the requirements, absolutely none of the code weakens the security of the application

    Two very different things...

    1. Oninoshiko

      The problem with your verbiage...

      I would never warrant the second; I might warrant the first (but you would have to pay me one hell of a lot to do it). The reason is that I have seen requirements handed to me on projects that introduce security problems. As a developer I can point out that it's bad, I can recommend it be removed, but I am not the one who can remove it.

      Knowingly not fulfilling the requirements is not an option.

    2. Ian Michael Gumby

      You have a couple of problems...

      The strongest language should be that the developer adheres to industry best practices.

      Then you'd have to show that the programmer was sloppy and failed to adhere to 'best practices' before the developer is on the hook. (And then you have to limit the damages to the amount of the contract...)

      Of course since many companies outsource to lowest cost centers, you end up getting what you pay for. ;-)

      And then there's the argument: if the developer is in India, is he measured against the same best practices you'd expect of someone working in the US?

      Then you also have to wonder if level of education matters. I mean, unless the developer is a classically trained software engineer, are they more liable because all they have is a 12-week developer's course at a local community college?

      All fun things that keep lawyers happy. :-)

      But hey! What do I know?


  4. Si 1

    It's been a while since I used Java...

    ... but does it seriously compile the variable names into its byte code? That sounds unlikely to me, it might be a virtual machine but it's still a form of machine code. Plus, why would it keep variable names but not function names?

    Seems more likely you'd want to do something about SOAP calls or SQL that might be embedded in your code (although you probably shouldn't be doing that anyway). If you really are worried about hackers reading the data in the String table, why not overwrite it with gibberish after you've finished with it?

    Does this mean going back to the bad old days of naming all your variables A, B or C and creating unmaintainable code that no-one save the author can read?

    1. pitagora

      The title is required, and must contain letters and/or digits.

      It's the other way around: internal variables in functions are not kept, but function names are. All class and member names are kept in the byte code. This byte code can be fully decompiled to a working Java program.

      As for your suggestion of naming variables A, B and C: that's called obfuscation, and it's a very common practice, except it's done directly on the binaries using some expensive software :) Basically, that software will rename all your classes and methods to things like A.A.A and A.A.B and overload methods to the absurd. You would end up with 20 unrelated methods A in class A. Good luck to anybody trying to reverse engineer it.
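      [Ed: the point about names surviving in the byte code is easy to check with reflection - the compiled .class file carries class and member names verbatim, which is exactly what decompilers read and obfuscators rewrite. A minimal sketch, with hypothetical names:]

```java
import java.lang.reflect.Method;

// Class and method names survive compilation into the .class file,
// so reflection (and any decompiler) can read them back verbatim.
public class NamesInBytecode {
    // A descriptively named method: its name is stored in the bytecode.
    public static int addTwoNumbers(int a, int b) {
        return a + b; // local names 'a' and 'b' are NOT kept unless built with -g
    }

    public static void main(String[] args) {
        for (Method m : NamesInBytecode.class.getDeclaredMethods()) {
            // Includes "addTwoNumbers" and "main" (order unspecified).
            System.out.println(m.getName());
        }
    }
}
```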

  5. Eponymous Cowherd

    You get what you pay for.

    Quotation for developing the specified "Hello World" application to the required "Application Security Procurement Contract"

    Specification: £0.10

    Design: £0.10

    Coding: £0.10

    Testing and validation: £250,000

  6. Phil A

    Developers as a personal adjective or a generic

    While of course "developer" can refer to an individual, it would quite regularly be read as "the company who is the developer of the software" which changes the argument somewhat?

    1. Ken Hagan Gold badge

      Who is the "Developer"?

      Ah no. The text that the article links to is quite careful to use "Vendor" in 99% of its text. The quote is the one place in the contract where they deviate from this. (The preceding paragraph is another, but refers to a different role.)

      The intent is quite clear: the individual devs must promise to be good. Whether it is enforceable is another matter, except in extremely obvious cases. Courts might well take the view that the developers didn't sign the contract (or had little option, being cajoled into it by a Vendor looking for a sale), or signed it in good faith and shouldn't be held liable just because several years later someone finds an exploit, or received nothing in return for their liability and so owe nothing in the event of a breach.

  7. Brett Brennan 1

    Certainly higher up the food chain than the programmer

    To this day I never cease to be amazed at the number of companies I consult with that have no credible methodology for managing the running of their organization, far less managing their development process. Total lack of actionable requirements - both in the software development process - AND, more importantly - in the development of core business process and practice.

    It's not the "little guys" as you say; indeed, in most organizations I visit, if a programmer or QA person raises the point that the design is untenable, they are usually frog-marched from the building in a matter of days or weeks.

    Certification is absolutely no use whatsoever: if the business users can't define what needs to be done in the first place, and complain about "unnecessary time spent" on things like requirements testing or design reviews, all the education in the world isn't worth a dingo's kidneys when faced with time and budget deadlines.

    Most companies would rather hope against having an issue than try to prevent one. Even standards like PCI are met only to the level that the audit requires - and problems are patched (if that) rather than solved.

    The "Aurora" attack that hit Google and so many other high-profile companies SHOULD have been a wake-up call: if these companies all have common failings in their management practices that allowed an attack to be broadly pervasive, it's not the TECHNOLOGY that's the problem. Until that issue is addressed, software security flaws are the LEAST of the worries.

  8. Anonymous Coward
    Thumb Up

    no released code is ever completely bug-free

    it's all about minimizing risk

    of course, if it has been unit tested, system tested, integration tested and user-acceptance tested, then signed off by the top brass, and there are still critical bugs, it's those who signed it off at the top who must carry the can...

  9. Rune Moberg

    I know who to blame

    Today's problem is that most security idi...excuse me: *experts* have decided it is a good idea to restrict communication to port 443 (and port 80 at a stretch).

    This resulted in all sorts of apps starting to use port 443 and 80 for non-http traffic.

    The response was to start scanning at least port 80 (and I'm sure some have opted to fiddle with certificates to facilitate some 443 scanning too), severely compromising non-http traffic.

    As a result, developers designing a classical client-server application are forced into using webservices for everything. The http protocol's overhead eats bandwidth, and it is strictly a pull protocol. The server cannot tell the client anything; the client has to ask. Repeatedly.

    So now a few hundred clients take their toll on one server, and this translates into a need for more servers. Then the developers are faced with a new choice: do they really need to maintain state on the server? If so, they will need an extra solution for that, so that a client that suddenly requests something from server #2 will have the same state.
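    [Ed: the shared-state problem described here is usually solved with a token-keyed session store that every server consults. A minimal sketch, in which a plain in-process map stands in for what would really be an external store such as a database or cache; all names are hypothetical.]

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

// Stateless front ends push session state into a shared store so that
// a client hitting server #2 sees the state it created on server #1.
// The map here is a stand-in for an external store (DB, memcached, etc.).
public class SharedSessions {
    static final Map<String, String> store = new ConcurrentHashMap<>();

    // "Server #1": create a session, return the token the client resends.
    static String login(String user) {
        String token = UUID.randomUUID().toString();
        store.put(token, user);
        return token;
    }

    // "Server #2": any server can resolve the token to the same state.
    static String whoAmI(String token) {
        return store.get(token);
    }

    public static void main(String[] args) {
        String token = login("alice");
        System.out.println(whoAmI(token)); // prints "alice"
    }
}
```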

    I could go on and on like this. In many cases, we developers use the wrong tools, and the reason is that some security idiot has decided that the old ways are unsafe.

    Well... Guess what... it happens. Only now we spend more time and resources solving problems that really needed no solving. We could have used this time to make our solutions more secure, but those precious hours are spent making the unscalable scalable. And we're told to use development tools like Java, because that apparently helps us write safer code... (code that is easier to reverse engineer, which now seems to have prompted a requirement for writing more convoluted code, which I'm sure will be a delight to maintain...)

    1. Anonymous Coward

      standard ports

      Back to school for you Rune.

      1) things like firewalls that block traffic on specific ports are necessary to prevent the far too common situation of people subverting systems listening on network ports. This isn't over-the-top security; it is security 101 and one of many security structures that are necessary to protect against developer stupidity.

      2) http is far from the only show in town for client/server protocols, but it is useful because it is well defined and applies restrictions on content and messaging that make them easier to audit and control

      3) http is stateless and largely client driven (but not entirely). That is not a fault of security, or of the way http was designed, but of the way it is MISused by developers.

      4) if you use a stateless protocol like HTTP you need to manage state elsewhere. Shocking!

      5) some developers fight the constraints that are put upon them because they cannot be trusted to do things securely themselves. It never ends well for them when the auditors come around, or when there is some major outage due to actively bypassing legitimate controls.

  10. Anonymous Coward
    Anonymous Coward

    Security by obscurity

    Claiming that calling the password variable "password" reduces security is just daft. Security through obscurity is the oldest security stupidity in the book.

    As to the string cache, well, a sensible programmer would overwrite the data as soon as they no longer have any need for it.
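    [Ed: in Java the standard form of this advice is to hold secrets in a char[] rather than a String - Strings are immutable and may be interned, so they cannot be wiped - and to zero the array the moment it is no longer needed. A minimal sketch; the password value and readPassword helper are hypothetical stand-ins for something like Console.readPassword:]

```java
import java.util.Arrays;

// Keep a secret in a mutable char[] so it can be wiped; a String
// cannot be overwritten and may linger in memory until collected.
public class WipeSecret {
    static char[] readPassword() {
        // Stand-in for e.g. Console.readPassword(); hypothetical value.
        return "s3cret".toCharArray();
    }

    public static void main(String[] args) {
        char[] password = readPassword();
        try {
            // ... authenticate with the password here ...
        } finally {
            Arrays.fill(password, '\0'); // wipe as soon as it's no longer needed
        }
        System.out.println(password[0] == '\0'); // true: contents gone
    }
}
```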

    While sloppy programming is at the root of most security holes, companies should be asking why their developers write sloppy code. There are probably 3 reasons they'd find.

    1) Because they can. The coding procedures are not strong enough. For a while I worked as a developer for one of the leading IT companies. When I worked there we had weekly code audits. You went through random samples of other people's code with a fine-toothed comb. Then you interrogated the programmer about it. After someone's second audit you noticed a dramatic difference in the quality of their output. Since then, this group got taken over by a commercially much more successful organisation bought by the parent company. Code reviews and most of the rest of the quality process were thrown out as being too expensive.

    2) Because you make your coders write crap code. Productivity goals don't include quality.

    "Never mind the quality - count the lines". Most programmers would actually like to write good high performance code. Often they're geeks their reward is in doing just that.

    Only then do you get to

    3) You employ crap programmers. Well, this is partly "pay peanuts, get monkeys". Cheap inexperienced programmers are often preferred to experienced old hands.

    But of course management is all about the art of not taking responsibility for any of the problems.

    Quick pass that buck

  11. Anonymous Coward

    I see...

    I've been writing software for longer than I care to think about. I think I'm pretty good. I SHOULD be "average", but I'm not - I'm pretty good. And the reason for this is that so many of my peers are so bloody bad. I have only worked with one or two other software engineers over the last 5 years whom I actually rate and would completely trust. It is a very sad state of affairs, and admittedly probably says as much about me as it does about the state of the software engineering industry. These "bad" softies do indeed produce some utter rubbish sometimes. They're not stupid people, though. They're quite often sloppy and lack attention to detail, but they're not thick. Lack of experience is usually the problem, and that's not their fault (but not always - some people never learn).

    Having said this, "bad" software is often not created by the engineers. One is often required to release code that one KNOWS isn't quite right. Commercial pressures, stupidly short timescales, and a general lack of appreciation and understanding by project management are usually the culprits. I don't defend releasing buggy software - to do so would be just another example of the "I was just following orders" excuse. But that is commercial reality, and I do make a point these days of refusing to release software that I know isn't right; a stand that has caused me some grief over the years.

    So, blame the project management? Well, they too are often under huge pressure from higher management to deliver. As an example, some years ago now, I (along with lots of other engineers) was working on quite a complex system. We spent ages doing all sorts of design and analysis, and UML stuff etc. etc. - i.e. we tried to do things the "proper" way. One morning, completely out of the blue, the project manager came back from some meeting he'd just had with upper management and declared that "they" didn't want fancy UML diagrams - they wanted code. They wanted something to work. So we (literally) abandoned all the design work there and then and, after a cup of tea, started hammering out code. In the end, the system was still pretty good I think (I left shortly before it was fully complete), but had it failed, had it been full of bugs, who would have been to blame?

  12. Pandy06269

    Who else is responsible?

    I'm a software developer myself, and I completely agree with this movement. If I bought a car that had a safety issue (think: Toyota) they're not going to blame me, are they? They'll recall it and fix the issue.

    Granted, if my car got stolen I couldn't hold the manufacturer responsible if I left the doors unlocked. But if I did lock the car, and the locks didn't work because of a manufacturing fault, then yes, they're responsible.

    It's no different for developers. If the users made the software insecure (e.g. by using a crap password) then that's not the developer's fault. But if the users are using strong passwords and following all other "good security practice" but information still leaks because of a flaw in the code, then that's the developer's fault.

    1. Eponymous Cowherd
      Thumb Down

      Extending the analogy.

      A car, like an application, is a composite article. A large proportion of the vehicle is of 3rd party origin. If the car breaks down because its 3rd party fuel injection system fails, or crashes because its 3rd party braking system fails should the person who bolted it into the vehicle be blamed? If the security problem is caused by a 3rd party encryption library is that the programmer's fault?

      A car doesn't operate in isolation. If it gets trashed by an 18-wheel "semi", is that the fault of the car or the people that built it? If an application is compromised by another application running on the same computer, is that the fault of the programmer?

      If the car breaks because it hits a large pot-hole (like that's going to happen in the UK) is that the fault of the bloke who built it? If an application's security is compromised by an unknown flaw in the underlying operating system, is that the fault of the developer?

      Developing secure software is not a major issue. Ensuring it remains secure under all circumstances is a huge problem, and costs a *lot* of time and, therefore, money.

  13. blackworx

    No responsibility

    Without autonomy

    ...or, failing that, a fuckton of cash.

  14. frymaster

    Missing the point?

    "Developer warrants that the software shall not contain any code that does not support a software requirement and weakens the security of the application..."

    Isn't that just saying "developer hasn't stuck in any easter eggs or back doors"? It says the software can't contain any code that DOES NOT SUPPORT A REQUIREMENT and weakens security. In your example, there's presumably a need to verify the password, so even if that code is going to bring down Western Civilisation With its Use of the String Type(!) it's supporting a software requirement, so that's OK.

    This is why MS policy is now "no easter eggs" (they were famous for them back in the day)

  15. frank ly
    Thumb Down

    Analyse This

    "...the software shall not contain any code that does not support a software requirement AND weakens the security of the application..."

    I think the 'AND' should be 'OR'. As originally written, a 'coder' could put an easter egg in the application as long as it didn't weaken the security.

    The sentence would also benefit from a few brackets to make things clearer. If they can't get the basic requirement written properly then what hope is there?
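    [Ed: the AND/OR distinction being argued here can be made concrete by writing the clause as a boolean predicate - a sketch; the variable and method names are ours, not the contract's:]

```java
// The contract clause as a predicate: is a piece of code forbidden?
public class ClauseLogic {
    // As written: forbidden only if it BOTH fails to support a
    // requirement AND weakens security.
    static boolean forbiddenAsWritten(boolean supportsRequirement, boolean weakensSecurity) {
        return !supportsRequirement && weakensSecurity;
    }

    // The stricter OR reading: forbidden if EITHER holds.
    static boolean forbiddenWithOr(boolean supportsRequirement, boolean weakensSecurity) {
        return !supportsRequirement || weakensSecurity;
    }

    public static void main(String[] args) {
        // A harmless easter egg: supports no requirement, weakens nothing.
        System.out.println(forbiddenAsWritten(false, false)); // false: allowed as written
        System.out.println(forbiddenWithOr(false, false));    // true: banned under OR
    }
}
```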

  16. John Smith 19 Gold badge

    IIRC Harlan Mills @ IBM Federal Systems* put it this way

    "Software development is far too important to be left in the hands of people who like coding"

    *Now part of Loral.

  17. AAWW

    Obfuscating variable names...

    ... is an unbelievably terrible idea. It hampers maintainability and provides absolutely no security benefit, as anyone who has ever reverse-engineered symbol-stripped code can tell you.

    1. Solomon Grundy

      You Said It!

      Obfuscating variables for "security" is one of the most impractical development concepts I've ever heard of. It not only slows down updates/upgrades, it is likely to cause more security issues and/or critical errors as the developer gets confused trying to decipher the bullshit code.

    2. TeeCee Gold badge


      Some years back I had the dubious pleasure of fixing some POS written by a helpful chap who'd called the first variable he'd used "ONE", the second "TWO", etc.

      If that sort of thing's supposed to be the way forward I think I'll give this shit up and take up something easy and safe. Like naked alligator wrestling.

  18. Robert Carnegie Silver badge

    That clause sounds to me like -

    "Developer warrants that the software shall not contain any code that does not support a software requirement and weakens the security of the application"

    Surely that only means no intentional backdoor access routines (duh). Like, when I use MY ATM card, it isn't deducted from my bank balance. The programming that I did to make that happen would be in breach of the contract term.

    If I followed the latest Firefox vulnerability and update story (before the general hackfest just opened), they included new third-party code - new to them - something called WOFF, that DID support the software requirement but ALSO weakened the security of the application.

    Hmm... what if the SQL-type ATM code should say "IF ( @bankcardid IS NOT NULL ) EXEC DebitCustomerAccount" but my version says "IF ( @bankcardid <> 31415926 ) ..." ? That supports the software requirement for nearly every cardholder, only not me. That's five nines good, isn't it? ;-)

  19. The old man from scene 24

    Not to mention

    Another problem is that there's an idea, popularised by the more marketing-oriented part of the software engineering sector, that computers can do anything. They're magic! Whereas we engineers know that security and usability are generally incompatible. When you get down to it, most customers don't want a secure system, they want a usable system. In my experience, software engineers who plump for good design over usability are shouted down.

  20. Destroy All Monsters Silver badge
    Dead Vulture

    Wait what??

    >> [ ] Developer warrants that the software shall not contain any code that does not support a software requirement and weakens the security of the application...

    I am absolutely certain that the above clause, taken from the SANS page, is about the "Vendor" as opposed to The Intern that was just hired to do the Access Control List implementation. But then again...

    Well, great. You might as well sign a statement that every single line of your application is sure not to be interpreted by some Rabbi as being against the scripture in his cherished 12-volume Talmud. This includes the comments, of course.

    NATURALLY there will be code that does not support a software requirement; there MUST BE. Helper code, debugging code, code to make things efficient, you name it. And whether anything weakens the security of your application is of course decided WITH 20/20 HINDSIGHT.

    Now, I know of projects that DO SIGN contracts with talmudic requirements and in which the vendor asserts being in FULL COMPLIANCE with said requirement. But that doesn't help at all. It's just that the call to the insurance company is getting more complicated and the rates go up.

    >> In other words, when it comes to application security and QA, the buck stops with the developer.

    Really, Matt? How can that happen? Apart from the fact that the Ayn Rand shoulder shrug is not apposite at all to a "passing the buck" situation (but what do I know), the buck just WON'T stop with the developer, because the customer won't stand for it. I imagine the meeting in which 10 lawyers and representatives of companies A and B meet, with Willy the developer in the middle and not a single dude from the "SOFTWARE QUALITY ASSURANCE" floor in sight. That would be ridiculous.

    Note that the SWEBOK is NOT a set of "rigor-oriented qualifications for coders". SWEBOK is the "Software Engineering Body of Knowledge" - it is NOT about coding, it is about Software Engineering: how to manage the software lifecycle, how and which documents to generate when, and all that jazz. Willy the Developer won't profit from it ever; he is supposed to read the "JUnit in Action" books and apply the knowledge with friendly coaching by a co-responsible team.

  21. Peter Kay

    'Developer' is referring to the company, not a coder

    This is a storm in a teacup. This is only an outline for a contract between companies, not for purchasing a copy of Windows. As such, it's no different from other contracts : the more you ask for, the greater the cost and the longer it takes.

    I suspect the sentence should probably actually read 'The company supplying the application to the customer should not include code that is unnecessary to meet the requirements of the customer and weakens the security of the application'. The emphasis would be even clearer if the author revealed it's contained under the title 'No Malicious Code'.

    So, no, it's not putting blame on the individual coder.

    However, there are possibly grey areas. If you ship a website to the customer, you shouldn't also be including Bit_Of_PHP_From_Another_Project_That_Shouldnt_Be_Here.PHP

    There's also libraries. Unused functions in linked libraries may, or may not, be included nonetheless, and they probably will be in the case of DLLs (function stripping is rare these days, AFAIAA). That's over to the lawyers again to argue whether it is reasonable, or indeed whether there is a difference between including unused functions which are known to be insecure and unused functions that are later found to be insecure.

    If the customer insists, I'm sure there will be a suitably large charge for stripping out all unused functions, and for the trouble incurred in potentially maintaining a different source code branch.

  22. James 47

    And there we have it...

    The end of the Indian outsourcing economy!

    1. John Smith 19 Gold badge

      @James 47

      "The end of the Indian outsourcing economy!"

      You say that like it's a bad thing.

  23. JP19

    Read the whole line

    It says:

    No Malicious Code

    Developer warrants that the software shall not contain any code that does not support a software requirement and weakens the security of the application, including computer viruses, worms, time bombs, back doors, Trojan horses, Easter eggs, and all other forms of malicious code.

    Better luck next time with the sensationalism.

  24. Anonymous Coward
    Anonymous Coward

    being a developer

    is already an expensive exercise in arse-covering all round. This sort of thing will make it even worse. I already seem to spend most of my life chasing, and being chased, for sign-offs on documentation that I can wave in case anything ever tries to land on me. It's now a case of "It doesn't matter if it doesn't work, or if it's insecure, we built exactly what you asked us to build!"

    If your customers have signed off your designs, specs, testing, done their own testing and accepted the software, then some of the responsibility has to land on them if it ends up not working*.

    *by not working, I mean not matching the customer's requirements. If it does fit all the requirements then it works perfectly, however much common sense might suggest otherwise.

  25. Yet Another Anonymous coward Silver badge

    I'm all for it

    As soon as I can get the same assurance from the maker of the OS, the drivers, the dev tools and all the libraries I use.

  26. Rune Moberg

    @AC "standard ports"

    You missed my point by about a light year.

    1) True, you should block specific ports. However, today most organisations block ALL ports, which... leads to your third point: "but the way it is MISused by developers". Well, *duh*! If your network is only open for http, then that is what most application developers will use! As a developer, you cannot use a nice and shiny peer-to-peer protocol when only http is allowed.

    2, 4 and 5) I know. I've written several http servers from scratch myself.

    Re-read my first post. And THINK. If the security guys give you peanuts, then what will you eat?

    Do you want an application that is properly designed, or one that will actually work in most of the paranoid networks out there? My personal preference won't put much food on the table, so I close my eyes and use the tools at hand. From what I can tell, that is what everyone else is doing too.

  27. Eponymous Cowherd

    Only the lawyers will win.

    Say a program I wrote goes tits up and costs company X £10,000.

    Company X sues me and wins. I pay £10,000 + £2,000 costs.

    The problem was actually caused by a 3rd party encryption library. I sue 3rd party dev for £50,000 to cover lost revenue due to the bad rap the previous case has given the product.

    I win and the 3rd party dev has to cough up £50,000 + £5,000 costs.

    He discovers the problem was down to a bug in a 3rd party library *he* was using.......

    And so it goes on, and the lawyers get richer.

  28. Giles Jones Gold badge


    People want features, even if they're not all that useful. But the more features, the more code there is, and that's where problems occur. Features take time to test, and they take time to abuse to see if they create security problems.

    Microsoft likes to boast about lines of code in Windows, it is probably the worst metric to quote when it comes to software.

  29. Neil Cooper

    the problem is....

    After 30 years as a developer I know for a fact that programming is a complex creative process, and to do it well requires skill, discipline and experience. I also know that nearly all managers at the level that sets corporate policy have zero technical understanding and usually don't actually understand the tech in their own products. Even though they work in a high-tech industry they usually only have a business degree or MBA.

    Consequently most managers all make the same mistakes. It's like their MBA school removed their brains and replaced them with a single rule: reduce everything to a single simple process, then hire cheap labor to perform it. They are completely oblivious to the fact that programming is a skilled and creative art producing one-offs, not a production line banging out multiple exact copies of the same thing. They've only ever learnt one management approach and they're hell-bent on applying it, regardless of its fundamental unsuitability in this environment. When all you have is a hammer, everything looks like a nail.

    Consequently, because they don't understand programming they also have no clue about the skills it requires, so they repeatedly employ underskilled people to perform complex programming work because they are cheaper up front, regardless of the actual damage and extra rework these people cause. Management always totally ignore money they can't directly account for, so the damage one incompetent programmer can do to a code baseline is never an issue: it's invisible to the bean-counters because they can't calculate it up-front.

    Unfortunately it seems nothing will ever change, because the incompetent managers will never fire themselves. If anything, I've noticed over the 30 years I've been working as a developer that managers are more blinkered than ever, so the trend is getting worse.

    Beer, because it's the only way out for developers.

  30. Unlimited
    Dead Vulture


    "There's a low rasp of a noise being made in the software world..."

    " security certification vendor SANS"

    By going off on irrelevant programmer certification and management responsibility tangents, the article blows right past the very point that it started off with:

    "Security vendors want customers".

    A) Customers want their business requirements met in the cheapest, quickest way possible.

    B) The low/mid cost developers get the work.

    C) Security certification is not included in the winning bids.

    D) Security certification vendors are unhappy about this and want customers to stipulate security certification as a requirement in the tender.

    E) D is not happening because of A.

  31. Kurt Guntheroth
    Thumb Down

    pay us like doctors

    If developers are to be board-certified like doctors, and are to be liable for malpractice like doctors, then they will have to be educated like doctors (8 years of college, 4 years of supervised apprenticeship), and they will have to be paid like doctors. No outsourcing to india. No ragamuffin non-college-grads.

    Raise your hand if you're naive enough to think this is ever going to happen.

  32. Anonymous Coward
    Thumb Down

    Not always the developers fault

    I've been working on some code for the last 4 months now, and a couple of days ago the customer decided to let the PM know that they actually needed some other functionality, because the powers that be have said so. It's supposed to be delivered to QA in a week or so.

    Ok, been here many times before, another week or two and I could get the new code integrated.

    Started working on it, and got a *very* basic version working as a proof of concept for myself. The PM saw this today, started playing, then said "well it works pretty well, let's give them that. It's only a little bit that doesn't work".


    After much laughing on my part I let him know (politely, of course) that if he thought I was going to put my name anywhere near that code then he had another thing coming.

    Sometimes the developers just get pushed out, whilst the management decide to do their thing and *please* the customer. I was lucky (and feeling argumentative!)

  33. Anonymous Coward

    Developers, Managers, Capitalism

    Many developers know very well about the security issues and other kinds of bugs in their code. The only problem is the feature treadmill: as soon as something is barely working, your manager tells you to "focus" on the next 25 features he urgently needs implemented.

    Good engineers want to deliver high quality, but for the vast majority of the software industry and users this is something of priority #5. Developers are normally on the lowest level of company hierarchy, so they just "get used" to the fact that quick'n'dirty work is expected from them. Managers are also under pressure to deliver a ton of features yesterday. That is the expectation of the market and so the customers get it. I have never, ever heard of BP telling MS it would dump Office if they did not step up their security. Or Daimler telling the same to SAP.

    All these "leading" software companies use highly failure-prone tools like C/C++, have dodgy "quality assurance" processes and are basically run on feature lists and deadlines.

    It is notable that some industries actually have quite good software development processes. Avionics and Medical Electronics, namely. Boeing simply can't afford a safety/security issue in their flight control systems, so they have the right tools (Ada) and good QA processes in place.

    As long as a company mandates the use of C/C++, it is absolutely uncommitted to security, and management must be fixed first before you start talking to engineers. Even if management demanded some kind of "QA statement" from engineers, this would be totally useless if the current tools and processes did not change.

    Honestly, we simply have to live with crappy IT security for the time being, as long as the CEO of Daimler does not kick the arse of the CEO of Microsoft. Let's hope tools like SE Linux, Virtual Machines and Firewalls will contain and detect malware. May the gods of SW be with us :-)

    1. James 47


      That's a fine display of fanny prattle, jlocke.

      C/C++ programs will do what you tell them to do. They're no different from any other language.

      Just because *you* cannot trust yourself to write secure C/C++ programs doesn't mean it's a bad language. I use it everyday and am quite confident that my binaries are no less secure than if I had written them in Java or Ada.

      It's like saying that F1 cars are crap because not everyone is skilled enough to drive them.

      Stick to what you know. If someone asks you for a C/C++ program, just refuse and stick to web pages.

  34. heyrick Silver badge

    I call bullshit on Fortify...

    The biggest change that has to happen is nothing to do with the developers. It is entirely management. It is those who want a profit-led product out the door before the competition. Who announce the release of a product long before R&D is complete, giving a release date somewhere BEFORE the projected end of R&D cycle. So before that is due to finish, somehow the product has to be sorted, put together, boxed, and shipped out the door. And then, when all hell breaks loose, they look for somebody ELSE to point the finger at.

    And just to show how far off the mark Fortify can be... We can't call a password variable "password"? WTF? Would you rather I called it 'q' ('p' would be too obvious). Shall my loops be 'i', 'j', and 'k' while we're at it? And what sorta COMPILED language retains variable names? Or, oh, wait, maybe their own project management was rushing them so much they never figured out how to turn off the embedded symbol table. Hey - guys - you do know all that debugger support rubbish isn't supposed to be included in a release project? :-)

    Coders have to share some of the blame for bad code. But not all of it. If your boss tells you 'x' has to be done and dusted Friday night when you expected NEXT Friday (if not later), and there's no option for argument... there's two choices. Quit, or hack together some code that "works" hoping it works well enough, while feeling disillusioned by the whole thing. And, well, you know, making it ACTUALLY work can be pushed off until the next (chargeable) upgrade... Friggin' management never miss a trick...

  35. adrian sietsma

    Enough C++ bashing, FFS

    I can write insecure code in _any_ language. And, jlocke, a load of avionics and medical code _is_ written in C/C++. As the original article pointed out, Java has its own security issues as well, as do all languages. What language is SELinux written in again? And your firewall? And your VM?

  36. Anonymous Coward


    I do program C++ for a living and I like its efficiency, but I hate the security risks.

    Just a few examples:


    char buffer[6];

    int x;

    for(int i=0;i<=6;i++)buffer[i]=otherString[i]; //writes buffer[6], one past the end

    //have fun with x now


    struct x{int x; int y;};

    struct y{int x; int y; char z[6];};

    x a;

    x b;

    y c;


    vector<int> v;

    v[77]=7;//no problem here, just somewhere else


    int i=77782;

    printf("%s",i); //print some nice garbage


    char* ptr;

    printf(ptr); //great fun probably


    char* ptr=new char[100];

    delete ptr;//have fun with your heap (delete[] was needed)


    All sorts of totally nondeterministic multithreading fun


    really, really slow programs after you instrument them with Valgrind or Purify to find the memory bugs (1/100th of original speed)

  37. Anonymous Coward
    Thumb Down

    Real World Security Issues

    ...are, 90% of the time, some stupid stuff like buffer/integer overflows or uninitialized pointers. Eliminating those issues simply by using a safe programming language would kill off the vast majority of security issues. Companies sticking with C/C++ are simply not serious about security, and holding developers responsible for those bugs is simply hilarious.

    If you have worked for nine hours you easily forget the terminating zero or type "<=" instead of "<". That's not negligence, that is normal human error, because humans are not machines.

This topic is closed for new posts.