‘What are the odds someone will find and exploit this?’ Nice one — you just released an insecure app

According to a recently published Osterman Research white paper, 81 per cent of developers admit to knowingly releasing vulnerable apps. If it were a single piece of research, we might have passed it by, but the 2021 Verizon Mobile Security Index reinforced the point by concluding that some 76 per cent of devs experienced …

  1. Jc (the real one)

    LinkedIn on my Android device just wanted to add "let me know the phone number this device is currently talking to".... Time to start double-checking and removing apps that are clearly asking for too much. If I lose "free apps", so be it

    Jc

  2. Anonymous Coward
    Anonymous Coward

    WhiteHat Security

    Sounds like WhiteHat Security aren't to be trusted, if that's how their VP of Strategy truly thinks.

    Total and complete disconnect from the world of the boots on the ground.

    1. steviebuk Silver badge

      Re: WhiteHat Security

      Also sounds like he talks business speak bollocks instead of just speaking in plain fucking English so we can all understand.

  3. McCovican

    > “I disagree that developers ‘knowingly’ release vulnerable applications,” Setu Kulkarni, VP of Strategy at WhiteHat Security, told The Register. He argued that if development teams knew better, they would take appropriate security measures.

    Dude, it's not really something you get to "disagree" with. It's a fact of reality. We *do* know better. But shitty project managers like you lead with their boots and their heads up their arses, and we're forced to release in a state we're not happy with.

    1. Anonymous Coward
      Anonymous Coward

      I am happy to release whatever, whenever: Bug-Bounty time and "management made me do it"! Not bad!!

  4. heyrick Silver badge

    aren’t fully confident that code isn’t free of vulns before going live in production

    If you're fully confident that your code has no vulnerabilities whatsoever, you've either performed a deep and expensive audit of your code and every library it calls...or you're delusional.

    We can try for best effort and hope there aren't any vulnerabilities (management permitting), but that's absolutely not the same thing as "fully confident".

    1. Anonymous Coward
      Boffin

      Re: aren’t fully confident that code isn’t free of vulns before going live in production

      True. Except that if you perform an audit, no matter how deep and expensive, and think that your code now has no vulnerabilities, you're still delusional.

      1. Anonymous Coward
        Anonymous Coward

        Re: aren’t fully confident that code isn’t free of vulns before going live in production

        Oh? What happened to formal proofing?

        1. Tomato42

          Re: aren’t fully confident that code isn’t free of vulns before going live in production

          you can do formal proofs for single functions/methods, not for a whole application
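          To illustrate at the function level, here is a toy sketch in Lean 4 (my own example, not from the thread; needs a recent toolchain). Proving one three-line function is easy; doing the equivalent for every function, driver and syscall in a real application is the part that never happens:

            -- Toy spec: absDiff treats its arguments symmetrically.
            def absDiff (a b : Nat) : Nat := if a ≥ b then a - b else b - a

            -- A machine-checked proof of that spec for this one function.
            theorem absDiff_comm (a b : Nat) : absDiff a b = absDiff b a := by
              unfold absDiff
              split <;> split <;> omega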

          1. Charles 9

            Re: aren’t fully confident that code isn’t free of vulns before going live in production

            seL4 is a whole dang OS, and it has a formal proof.

            1. Anonymous Coward
              Anonymous Coward

              Re: aren’t fully confident that code isn’t free of vulns before going live in production

              So where's the CPU with a formal proof to run it on?

              The last one I heard of, someone had found a hole in their proof.

              1. SecretSonOfHG

                Re: aren’t fully confident that code isn’t free of vulns before going live in production

                Oh yes, and... where is the formal proof that proves that your formal proof is correct? And so on. I thought this debate was settled a few decades ago... yet there are still people who think that something above the simplest of systems can be proved "secure"...

            2. Tomato42

              Re: aren’t fully confident that code isn’t free of vulns before going live in production

              Oh, I'd love to see the formal proofs of the device drivers. And I mean a real device, like a SAS HBA, not an RS232.

          2. Dave 15

            Re: aren’t fully confident that code isn’t free of vulns before going live in production

            And even then it gets tricky as soon as you use any library functions or are involved with classes. Even worse, of course: how many people actually check what the compiler outputs? After all, your C++, C#, Java etc. is NOT what is actually run on the machine. Amusingly, even that isn't quite enough; chips themselves have mistakes, often around cache, so when you think you have proved something on one piece of silicon you haven't on another.

            Now we are still at a level where such things don't matter, as we are too sloppy even before that, mainly due to the incompetent buffoons that 'manage' these things. The same idiots that think it is going to be better if you have just fought your way through a 3-hour commute (to be with your colleagues), are worrying about your gas bill (because they don't pay enough) and are throwing up in the corner from the foul coffee or Jimbo's daughter's bout of sickness imported from school.

    2. Anonymous Coward
      Anonymous Coward

      Re: aren’t fully confident that code isn’t free of vulns before going live in production

      Not all vulns are equal. People don't actually want a clean pen test report; they wonder why they paid so much for it. When doing that sort of work, I regularly had to make a big deal out of something trivial, like the Server HTTP header (which indicates nginx, Apache, IIS, etc.), if there was nothing really wrong with the code, just to make the customer feel they got something from the report.
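      For the curious, that "finding" is literally one request. A minimal sketch (the URL is just a placeholder):

        # Banner grab: read the Server header that pads out a thin
        # pen-test report. The URL is a placeholder, not a real target.
        import urllib.request

        req = urllib.request.Request("https://example.com", method="HEAD")
        with urllib.request.urlopen(req) as resp:
            # typically something like "nginx/1.18.0" or "Apache/2.4.41"
            print(resp.headers.get("Server", "<not disclosed>"))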

    3. NightFox

      Re: aren’t fully confident that code isn’t free of vulns before going live in production

      I'd be more concerned about the 29% of CISOs who were confident their code wasn't vulnerable.

    4. Michael Wojcik Silver badge

      Re: aren’t fully confident that code isn’t free of vulns before going live in production

      Right. 81% of developers knew their applications had vulnerabilities. 19% didn't know that.

      I'd rather work with people in the former group.

  5. Potemkine! Silver badge

    how do we cure application vulnerability epidemic

    Replace all devices with paper + pen... and I'm not sure that will be sufficient.

    The most we can do is limit vulnerabilities as far as possible. But eradicating all of them? Even the Matrix is reloaded from time to time.

    1. bombastic bob Silver badge
      Devil

      I can think of a few examples where shipping something "with a vulnerability" might be just fine. Here's one that I think epitomizes what I mean:

      Let's say you ship embedded devices with Raspberry Pi and RPi OS on them, running your software and a touch screen interface. This is pretty popular from my observation.

      You don't allow ssh or listening network services (except via a special maintenance feature in the UI that end-users normally do not know about), but since the device DOES need to phone home to a server on the network, it has access to one. And the user/password is well known (say the default RPi user/password) because the screen logs in automatically on boot. Is THIS insecure? Or, because there are no listening services (and no way to add them via the UI), is it "secure enough"?

      Then you think about whether the latest polkit zero-day is worth LOTS OF EFFORT to patch right now because oh-my-god-it-can-be-exploited, or whether it can wait for the next round of maintenance updates several months from now... because there are NO listening ports! And, you have thousands of pre-imaged SD cards on order, and you're shipping right now, and you STILL have SD cards that have not yet been delivered (in the pipeline), and you JUST heard of this vulnerability LAST WEEK.

      and because of the nature of the devices, you don't want them using automatic package updates and ending up with the kinds of problems you see when a Windows system fails to boot after doing this sort of thing... and the screen displays the error instead of the user interface. And so on. [controlled updates with a USB or download or by replacing the SD are acceptable because they've been tested]

      Anyway, that throws a wrench into the works ("shipping with a vulnerability"), but in the case I outline, the system config already mitigates the vulnerabilities you need to patch at some point. If you can't ssh in, and can only view a UI that consists of pages written by the manufacturer [no external links], and it's not listening to incoming network traffic, nearly every possible vulnerability has been "dealt with". And exploiting a zero day (like with polkit) would require physical access. Might as well take the SD card out and muck with it from a laptop.
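      (If you want to sanity-check the "no listening ports" claim on the device itself, here's a quick Linux-only sketch; purely illustrative, not part of any product:)

        # List TCP ports in LISTEN state by parsing /proc/net/tcp.
        # State code 0A means LISTEN; the local address is hex "ip:port".
        # Ignores IPv6 (/proc/net/tcp6) for brevity.
        def listening_tcp_ports(path="/proc/net/tcp"):
            ports = set()
            with open(path) as f:
                next(f)  # skip the header row
                for line in f:
                    fields = line.split()
                    local_address, state = fields[1], fields[3]
                    if state == "0A":
                        ports.add(int(local_address.split(":")[1], 16))
            return sorted(ports)

        print(listening_tcp_ports() or "no TCP listeners")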

      (yeah I'm kinda working on something similar to this)

      1. doublelayer Silver badge

        That really depends on what the user or buyer is worried people might do. For example, if this is public facing, could a member of the public do something to interfere with it? Can a user connect a USB device, or activate (and thereby attack) your management system? If they can, that's an attack surface you have to deal with.

        Then we get this: "And the user/password is well known (say the default RPi user/password) because the screen logs in automatically on boot."

        That's insecure. Here's why. First, don't have something log in automatically unless you need to. Have an account run the UI on boot but don't give that a logged-in desktop session if the user can manage to close your UI. If this is an appliance, you likely don't even need to give them any way out of it, so don't let a desktop environment circumvent that. Also, change the passwords. Yes, SSH is off for now, but you still have other security to worry about. If they get a login window somehow, you don't want anyone who guesses pi/raspberry to have root access. Similarly, disable the pi user's no-password sudo rights. This is a potential issue and you can fix it very quickly.

  6. Anonymous Coward
    Anonymous Coward

    This security feature is annoying, disable it

    As in a recent argument I had after correctly implementing authentication:

    "This is annoying, disable it."

    It is up to the latest standard.

    "Similar app X doesn't do it."

    Similar app X isn't up to the latest security standards, then.

    "Similar apps Y, Z, 1, 2 don't do it."

    It doesn't look like any of those apps have been updated in years.

    "Customers won't like this very small inconvenience, remove it."

    1. Anonymous Coward
      Anonymous Coward

      Re: This security feature is annoying, disable it

      That is a perfect example of where some legally enforceable industry standards could be useful. It hardly encompasses the whole spectrum of risk - but it's a start. Example - companies can get a certificate for some level of security, which makes them more attractive to customers and enables a premium. But there are checklists they have to swear to follow, or be liable (civilly and possibly criminally) in case of failure.

      Civil engineers and architects already do that - lots of regulations.

      (Of course the software rules would be changing much faster and be so full of holes - it would be much, much harder.)

      "Customers won't like this very small inconvenience, remove it"

      then show him the standards and he might change his tune.

      1. Charles 9

        Re: This security feature is annoying, disable it

        And if the guys up top simply decide to move offshore and NOT take you with them...?

        1. Tomato42

          Re: This security feature is annoying, disable it

          how do they plan to move their customers offshore?

          1. Charles 9

            Re: This security feature is annoying, disable it

            Shell companies. Let them take the fall...

      2. Frank Thynne

        Re: This security feature is annoying, disable it

        I've been looking for an opinion like this for years and have felt that I am a voice in the wilderness.

        Software Development should be recognised as an Engineering discipline like the others you mention. As you say, regulating software products will be difficult, partly because the horse has already bolted.

        But there are International standards of Quality Assurance and Certification that could and should be required by Governmental and similar Public bodies when buying, installing or upgrading software products. Those purchasers have sufficient weight to enforce the adoption of Good Practice.

        The problem at the moment is the complexity of large-scale products, especially Operating Systems upon which so much software relies. It will take many years before they can all be scrutinised and fully certified, but the current practice of developing new or changed features before correcting errors in products in current use makes the task more difficult.

        It's noteworthy that large software companies have neither sought nor been granted ISO certification of their activities, and implementing it will be costly. But continuing with current policy could lead to the collapse of a software provider and possibly the world economy.

    2. JBowler

      Re: This security feature is annoying, disable it

      It happens in FOSS too:

      https://github.com/espeak-ng/espeak-ng/pull/955

      Whatever.

  7. Anonymous Coward
    Anonymous Coward

    Shift left? shift right?

    It's not clearly explained what these terms are being used to mean. I sort of get a general impression from article context, but can anyone make it clearer?

    1. uqrxur

      Re: Shift left? shift right?

      It refers to the systems development lifecycle, which is often represented as a timeline that starts with inception or analysis, and ends with deployment or release.

      Many managers tend to postpone security efforts towards the end of this lifecycle, where they typically hope to solve the security issue with just a cheap penetration test or, worse, with a magic-bullet security scanning tool. (The situation is even getting worse at the moment, with many companies increasingly relying on bug bounties.) This is what is referred to as "pushing right".

      On the other side you have some companies that try to address the security issue earlier in the cycle, sometimes even at the beginning of the lifecycle. This typically translates into identifying security requirements from the beginning of a project, identifying and addressing security or privacy threats directly during the design phase and setting up reliable tools/APIs/frameworks that prevent most vulnerabilities from even entering into the coding phase. All these are often referred to as "pushing left".
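      To make "pushing left" concrete, one small example: a security requirement written as a unit test that runs on every build, instead of being discovered by a pen test after release. A minimal sketch; every name in it is hypothetical:

        # A security requirement as a unit test: fail the build if a
        # response lacks basic security headers. The headers dict below
        # stands in for a response from your app's test client.
        REQUIRED = {"Content-Security-Policy", "X-Content-Type-Options"}

        def missing_security_headers(headers):
            """Return the required security headers absent from a response."""
            return REQUIRED - set(headers)

        def test_headers():
            headers = {
                "Content-Type": "text/html",
                "Content-Security-Policy": "default-src 'self'",
                "X-Content-Type-Options": "nosniff",
            }
            missing = missing_security_headers(headers)
            assert not missing, f"missing security headers: {missing}"

        test_headers()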

      Voila, hope it helps :)

      1. Notas Badoff

        Re: Shift left? shift right?

        Thank you! And you ↓ down there. This is a terrible article, written in the embedded fog of one tag-ridden segment of the industry, by a writer full to the brim with now! terms and now! assumptions. Which is strange, as we're talking about a timeless concern for software of any type.

        It did not help me when searching for "shift left" and "slide right" that everything came up as 'DevOps' and snake oil and pictures of cats and rugby. (some talk presenters just shouldn't)

        I was really wondering if all this was an in-joke re: Rocky Horror:

        It's just a jump to the left,

        and then a step to the right.

        Put your hands on your hips,

        and bring your knees in tight.

        But it's the pelvic thrust

        that really drives you insa-a-a-a-ne.

        Let's do the time warp again!

        ("It's so dreamy")

        1. bombastic bob Silver badge
          Coat

          Re: Shift left? shift right?

          Rocky Horror - and now you reminded me of the funniest thing the audience says:

          "Meatloaf again? We had Meatloaf LAST week!"

          (back in the 80's when midnight showings of RHPS were 'a thing')

      2. MachDiamond Silver badge

        Re: Shift left? shift right?

        "Voila, hope it helps :)"

        It would be easier to just say "proactive" or "reactive".

    2. Claptrap314 Silver badge

      Re: Shift left? shift right?

      These terms are relative to that straw man process about which the author said, "Such efforts are doomed to fail." Known as waterfall, it nevertheless took over the industry until challenged by the Agile Manifesto twenty years ago. Its persistence in the industry is easiest to understand by recognizing that 1) it respects organizational structure and 2) it gives power to the highly-disrespected teams (security & operations, aka "organizational blockers" aka "the adults in the room") whose concerns nevertheless must be respected.

      In waterfall, the process starts at the left and ends at the right.

      1. Ken Hagan Gold badge

        Re: Shift left? shift right?

        Fun fact: Waterfall was "invented" by an academic paper that needed a term for the worst possible methodology. Naturally no such term existed, because all methodologies that had actually been proposed, described or used had at least some redeeming features.

        It was a straw man on day one, has never been a thing, but is still The Reference against which everything is measured. This is probably because even snake oil looks good next to it.

      2. bombastic bob Silver badge
        Devil

        Re: Shift left? shift right?

        I actually thought of a project management tool, like the old "Microsoft Project", which may or may not still exist. Shift to the left, get it done earlier. To the right, it's delayed. Something like that.

        Managers like Gantt charts; they can slide 'em around like those birthday party puzzles we played with as kids, and announce stuff in meetings (like new deadlines) based on their sliding things around.

    3. Anonymous Coward
      Anonymous Coward

      Re: Shift left? shift right?

      Rocky Horror?

      It's astounding

      Time is fleeting

      Madness takes its toll

      But listen closely

      Not for very much longer

      I've got to keep control

      I remember doing the Time Warp

      Drinking those moments when

      The blackness would hit me

      And the void would be calling

      Let's do the Time Warp again

      Let's do the Time Warp again

      It's just a jump to the left <------------------ shift left

      And then a step to the right <------------------ shift right

      With your hands on your hips

      You bring your knees in tight

      But it's the pelvic thrust

      That really drives you insane

      Let's do the Time Warp again

      1. Roland6 Silver badge

        Re: Shift left? shift right?

        It's just a jump to the left <------------------ shift left

        And then a step to the right <------------------ shift right

        I think (given the left-to-right convention used in describing the development process/waterfall) the problem is more of a jump to the right and skipping steps, rather than having the analysts calling the tune....

      2. Anonymous Coward
        Anonymous Coward

        Re: Shift left? shift right?

        You guys have now put a Time Warp / development conference crossover in my head.

        I now need this to happen.

    4. IGotOut Silver badge

      Re: Shift left? shift right?

      It means you get a full house in Buzzword Bingo.

  8. iron Silver badge
    Facepalm

    The problem is not developers knowingly releasing insecure apps, the problem is the management who make them do so.

    I used to work for a small oil & gas consultancy. They had all their data on a SQL Server directly connected to the Internet: no VPN, no API, just an open port on a non-standard port number. This was 10 years after Code Red, so the risks should have been obvious. I brought this and other security issues up regularly, but management's answer was "why would anyone want to hack us?" Perhaps because you supply data and software to all the major oil companies in Europe? Everyone from Greenpeace to Fancy Bear would have been interested, ffs.

    The board only became interested in security when they sold the company. Then the sysadmin (who was in his first job) and I had to secure everything ASAP. I left shortly afterwards, leaving a dev team who had no idea how the security I wrote worked, and didn't care. I hear their stuff is all online now using React; I dread to think what kind of security Swiss cheese that is.

    1. ShadowSystems

      At Iron, re: management.

      Exactly. Make management personally, criminally, financially liable for releasing known-buggy code & they'll take care of the problem so fast it will leave scorch marks. If the top is going to have their ass in the fire should the software cause a scandal, you'd better believe that they'll refuse to release anything until it's been vetted up one side & down the other with an electron-microscope fine-toothed comb.

      1. Charles 9

        Re: At Iron, re: management.

        Unless they find it much easier to simply move offshore, add some scapegoats, or lobby to get the laws changed under threat of taking their tax revenues to a rival country...

        Think it through...

    2. Bruce Ordway

      why would anyone want to hack us

      Because there are an (almost) infinite number of monkeys out there hitting keys.

      The question should probably be "how bad will it be after I've been hacked"?

  9. Eclectic Man Silver badge
    Coat

    81 per cent of developers

    "according to a recently published Osterman Research white paper, 81 per cent of developers admit to knowingly releasing vulnerable apps.

    And the other 19% don't know.

    Honestly, there is no such thing as totally secure computer code. With literally millions of instructions in serious applications, there is no way anyone can guarantee that there is no exploitable vulnerability in delivered applications. In fact, the unending series of software updates, fixes and patches indicates as much.

    Managers may have to choose between deploying code with known insecurities or the project or company going bankrupt. Blaming the developers is unreasonable. You might just as well blame the clients for wanting their nice new application full of easy-to-use functionality with lots of power to do things.

    The only solution is not to insist on perfectly secure applications (as they would forever be 'in development'), but that they sit behind the most appropriate security for the data and users. Oh but then management would have to make some serious decisions about platform security issues and costs.

    1. Anonymous Coward
      Anonymous Coward

      Re: 81 per cent of developers

      > And the other 19% don't know.

      I remember when I worked at Oracle having to explain what a SQL injection attack was.

      I got "that's too hard for people to do!" until I pasted Bobby Tables into a form.
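      For anyone who still thinks it's too hard, a runnable sketch of the difference (sqlite3 standing in for the real database):

        # Bobby Tables, runnable: parameterized input vs spliced input.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE students (name TEXT)")

        payload = "Robert'); DROP TABLE students;--"

        # Safe: the driver treats the payload as data, never as SQL.
        conn.execute("INSERT INTO students (name) VALUES (?)", (payload,))
        print(conn.execute("SELECT count(*) FROM students").fetchone())  # (1,)

        # Vulnerable: the payload is spliced into the statement text, so
        # its embedded DROP statement executes.
        conn.executescript(f"INSERT INTO students (name) VALUES ('{payload}')")

        try:
            conn.execute("SELECT count(*) FROM students")
        except sqlite3.OperationalError as err:
            print("after injection:", err)  # no such table: students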

    2. MachDiamond Silver badge

      Re: 81 per cent of developers

      "The only solution is not to insist on perfectly secure applications (as they would forever be 'in development'),"

      Not really. There is a tendency to load applications with so many "features" that the code bloat causes them to float against the ceiling. Many companies see their competitors bolt on some functionality, so they have to jump on the bandwagon with their own, never mind that only 0.0001% of users would even need it, or need it more than once. Really big applications are onerous to use most of the time. A new computer is required with each new release of an office suite.

      A solution is for companies to develop applications that do a task well without making them all-singing, all-dancing. I do a lot of photography and use Lightroom and Photoshop to edit images. I do some work in Lightroom, and there is a feature that lets me easily export an image to Photoshop and reimport it back into Lightroom when I'm done. There are other third-party applications that take advantage of the framework. I don't need Lightroom to do everything internally, and it's better that it doesn't try. Not every image will be processed in Photoshop. Lightroom is the airport where all the trips originate and return.

  10. DS999 Silver badge

    If you've released an app of any size

    Either you have knowingly released an insecure app, or you vastly overrate your ability and unknowingly released an insecure app.

  11. Claptrap314 Silver badge
    Flame

    Blame the privates for losing the war

    There are so many layers to the problem that this "report" is either missing or deliberately ignoring that I have to question their competence in the subject.

    1) Developing bug-free code is d*** hard. As I continue to mention, just the claim that "this code does what it is supposed to do, and nothing else" is at least equivalent to a master's thesis most of the time. And very, very few people are capable of getting a master's in mathematics, let alone those who have one & are willing to work as programmers.

    But for an application to actually be secure, you need to secure not just the code, but the compiler. And the OS--including drivers and stacks. And the hardware. Thus, the processor, the management engine, the microcontrollers--inside the memory & drives. Recursively.

    If the rest of the report were honest, I would be willing to overlook that previous paragraph.

    2) The customer is king. There is a term for companies that value, well, anything more than their customer does: "bankrupt". And the customer DOES NOT CARE about security. You can prove me wrong by providing me a URL where I can buy a smartphone from RIM. (I am not in government. I am a plaintive retail customer wannabe.)

    3) Given 1 but especially 2, it is management's job to get applications shipped before they are bug free, let alone secure by any vaguely meaningful metric. And while I am certain that there are a bunch of junior programmers who are not really aware that their code is going out with "issues", it is beyond belief that any SWE does not understand that the code being shipped is generally nothing but an embarrassment.

    ---

    If some company wants to hawk some tool that can find a bunch of issues with code cheaply, that's great. But don't blame devs for the following fact:

    The customer is giving money to the first company that gives them stickers. No stickers today? No money today or tomorrow.

  12. Anonymous Coward
    Anonymous Coward

    Easy answer is to force a refund on any purchase of insecure software

    If you buy it and it turns out to be insecure, you can claim your money back and the code becomes public domain until it is fixed. This is the only thing that will stop companies releasing insecure code. It typically isn't down to coders; it is down to what coders are allowed to do by those who hold the purse strings.

    So making code security the difference between a company getting paid or going bust is the only way forward, because currently releasing insecure code is seen as both acceptable and the only way to do business.

    1. doublelayer Silver badge

      Re: Easy answer is to force a refund on any purchase of insecure software

      That's a very glib and simplistic response. A lot of security problems are due either to management/marketing refusing to let people patch insecure things or to programmers who don't know what they're doing. I don't mind some responsibility being forced on them. However, writing something without security problems is basically impossible, and if you actually want that you're going to be waiting a long time for everything and paying a lot of money. Since that's been tried and it doesn't work, there's only so much you can demand before it becomes unreasonable.

      Whenever security problems are mentioned, somebody comes along and suggests something like this. Usually it's along the lines of "don't allow anyone to sell something with security problems". You can have that now if you like. You just have to hand me all computing equipment you have and never buy any more. The result for you will be the same.

      1. Anonymous Coward
        Anonymous Coward

        Re: Easy answer is to force a refund on any purchase of insecure software

        The simplicity of the method is its virtue: release something that compromises machines and you lose control of your code. Easy to remember when anyone says "nah, let's just release it and sort the problems out later."

        I have given up waiting for developers to be held accountable for failures that in any other field would have resulted in big payouts.

        The only way to stop insecure code being shipped is to make it expensive to do so. Screw going to court, that is a pay-for-justice scam; instead make things so simple even the retarded management can understand it. Make them understand that half-doing the job means you lose your business.

        1. Charles 9

          Re: Easy answer is to force a refund on any purchase of insecure software

          But how do you ENFORCE it?

        2. Charles 9

          Re: Easy answer is to force a refund on any purchase of insecure software

          "The only way to stop insecure code being shipped is to make it expensive to do so,"

          To elaborate, capitalistic pressures actually work in the OPPOSITE direction. It's usually expensive to ship secure code because, more often than not, those that don't bother can cheap out and undercut the competition (the clients lack the mentality to prize security; they're in the same boat of cheap first and to hell with the rest). And of those that get caught, they either just vanish or find it cheaper to lawyer their way out of it. And because of competitive sovereignty, there's usually a way out of any problem if you have the resources. You'd practically need a Machiavellian Prince to force them to comply, and they could probably just pool together and arrange a coup anyway.

          The TL;DR version: Security will always be an immediate cost, meaning it will be first in line to be reduced (it's just physics: think door locks). And the human condition favors reducing immediate costs over considering long-term risks.

  13. MachDiamond Silver badge

    Why = Money

    There is very little downside to releasing insecure code or hardware. At the most, companies that get spanked wind up paying their customers back with a period of credit monitoring (I've got credit monitoring for life at this point), and the chances of them having to do even that much are pretty slim. It's one thing to check whether customers' money has been pilfered from their bank accounts, but an exploit may allow somebody to rummage through corporate docs or files from R&D. How might that loss be quantified if the information is used to gain an advantage over that company? Another case might be an insecure IoT device that gives somebody a way to use a person's internet connection to download some movies, after which the victim gets a settlement offer that's uneconomical to fight. It could also be worse if the person commits a crime. How much would you have to pay an attorney to defend yourself against charges for something you didn't do? At hundreds an hour, even showing you aren't involved can quickly cost thousands.

    There needs to be a bunch of liability for releasing poor software. If a company can show a court that they have their software evaluated by a qualified third party and a protocol in place internally to find and fix security issues, perhaps that can mitigate some of the fines. If companies are playing fast and loose, it should be an offense that might close the company down. It's awfully rare that there is a piece of software that has no competition, so the argument that the market will be harmed irreparably is likely false. I also expect that very narrowly focused software with a limited customer base and no competition is much less likely to be exploited.

    1. Jimmy2Cows Silver badge

      Re: Why = Money

      Where do you draw the line between poor software, and tried really hard to check as much as possible but still missing something?

      I've been doing this 30-odd years and while I'm always generally improving, I'd never claim my code is completely bug free, vuln free, 100% tested and audited.

      We have design reviews, unit tests, functional tests, smoke tests, security audits with hundreds of rules covering a wide range of possible vulnerabilities. Can that cover everything? Hell no. 100% coverage is impossible, and chasing it rapidly hits diminishing returns.

      Occasionally something still sneaks through, or a new problem is discovered that none of the checks and audits could find as it wasn't even a recognised problem.

      How much mitigation would that provide us under your "bunch of liability"?

      Some places are happy to bang out shitty code. A lot of places are not. Quality is important as a driving factor for customer adoption and retention. Other places sit somewhere in between. It's a continuum, and painting the situation as black and white feels deliberately disingenuous. Or should that be ingenuous...? I guess either can work. This wholly black-and-white perspective is certainly naive.

    2. Missing Semicolon Silver badge
      Mushroom

      Re: Why = Money

      Bang on. Why spend money on security when the cost of a breach is effectively zero? TalkTalk? Increased profits. Experian? What hack, I don't remember a hack.

      Both of these companies should be smoking holes in the ground.

  14. FlamingDeath Silver badge

    Managers

    Managers are, for the most part but not always, sycophants eager to please higher-ups and to serve shareholders' desire for a get-rich-quick scheme

  15. rwbthatisme

    I think there are multiple reasons for insecurity, but one of the biggest problems is that developers aren't trained in, or just don't know about or understand, basic security. I interviewed a computer science graduate this week who was doing a master's in some computer vision cobblers, and he had no knowledge of any basic security when I asked him about SQL injection. We really need to sort out the basics, bake in security from the ground up, and get courses to properly include it in the curriculum.

  16. HAL-9000
    Childcatcher

    Life is full of surprises

    Who knew the public that consumes software would be the new alpha testers?

    1. Jimmy2Cows Silver badge
      Coat

      Re: Life is full of surprises

      Microsoft...?

  17. aerogems Silver badge

    I blame management

    A few years ago I worked for a medical devices manufacturer who shall remain anonymous to protect the guilty. I'd love to name and shame, but the company is known for retaliation and I don't want to cost anyone their jobs. They made IVD (clinical: used with actual patient data) and RUO (research use only) devices. During the time I was there they appointed a new president of the business unit who decided that they were going to launch new products on a given date come hell or high water.

    Shortly after this person became president the company was set to release some new flagship product and I remember a bunch of us literally scrambling to process all the work needed to officially launch the product while they were starting the launch party just down the hall. I found out later that a customer was threatening to cancel a large order if they didn't get the units by a specific date, which is why they decided to "launch" the device that day. Probably not even six months later the product was a total flop and quietly killed off maybe 18 months after launch. This was an IVD instrument.

    Then there was another case where probably the best engineer in the business unit was going around to anyone they could track down in a hallway to tell them that the product they were working on wasn't ready to launch. They completely ignored the person and launched it anyway. This was also an IVD instrument. It was also a flop in the market.

    I also saw the company selling "pilot" build instruments to customers as if they were finalized units on multiple occasions just so they could meet ship dates for sales contracts. But the company leadership didn't want to hear anything about any of that, preferring to turn a blind eye to things, and would retaliate against anyone who would rock the boat too much. They kept pushing the engineers to go faster and faster and you could see it taking a toll in the number of shortcuts they would try to take. It was part of my job to not allow some of those shortcuts, but I felt for the engineers and the pressure they were under.

    I've seen too many cases where the people "on the ground" who are actually building the product say it isn't ready, and we're not simply talking perfectionists who think nothing is ever ready, but are then overruled by someone higher up who just brushes aside all of their concerns. So unless the dev is intentionally putting remote exploit bugs into the code, I'm inclined to deem them largely innocent.

    1. Charles 9

      Re: I blame management

      Perhaps it all becomes a case of staring down the avalanche. If inaction means your last big client pulls out and you're essentially done, desperation sets in. Expand that, and you may find towns or even larger may hinge on deadlines, meaning livelihoods, even outright lives are at stake...

  18. Dave 15

    Well well well

    Seems senior management are happy to take the pay, the credit and the promotions when nothing goes wrong, and want to shift the blame to others when things go wrong. This is a surprise, is it not?

    The 'deadlines' don't come from developers, they come from project managers, from senior managers. They are so so so so so important that the slaves (sorry, developers) are forced into eating stale takeaway pizza at 9 in the evening while they struggle to do what they have said since the very beginning is impossible. Is there any surprise at all that if someone else hasn't spotted a problem they will keep quiet after doing an 80-hour week for the last 2 months?

    If only people would listen to the people 'on the front line' and use the knowledge available, a lot wouldn't be so bad.

    BTW, more adverts in the UK this week wanting hugely experienced devs for less than you would pay a Tesco checkout operator. Tell me, why do you think they should give a hoot about your product when they are too worried about how to pay the rent, which eats up 90% of their income?

    1. Charles 9

      Re: Well well well

      How is that going to happen when everyone wants everything yesterday or they'll go to the competition? What do you do when your livelihood pretty much depends on chasing unicorns?

      1. martinusher Silver badge

        Re: Well well well

        At the management level the game becomes a version of "pass the parcel". It's the illusion of success that gets you your next job, so the goal is to not be the person in charge when the project goes belly up.

        People who become project managers are often those who had a relatively short career as a developer. They may have been good at the job, but they invariably change track at the point in their development where they think they can do anything, typically in their mid-to-late 20s. Since they get the job because of their 'people skills' they have a tendency to want to flatter their superiors, to make them look good, which sets the stage for wildly over-optimistic project schedules. Project management tools that were originally used for efficient airfield building during WW2 don't help** -- warning signs that you're in trouble are schedules that are fine-grained to the developer hour (the 'staircase look') and the constant search for 'magic programmers' rather than making the investment in team building.

        Occasionally you find a project manager that knows what they're doing and knows how to work with a team. They're like gold. Unfortunately I've never found a way of propagating them.

        (**Where did you think all these Gantt-chart-type tools originated?)

  19. Dave 15

    While we are having a rant

    "Oh no, you're thinking, yet another cookie pop-up. Well, sorry, it's the law. We measure how many people read us, and ensure you see relevant ads, by storing cookies on your device. If you're cool with that, hit “Accept all Cookies”. For more info and to customise your settings, hit “Customise Settings”. "

    Is it just me or is it bloody annoying that I have to either accept everything El Reg wants to vomit on my computer or laboriously change it... it is NOT against the law to allow me to say NO to all your damned cookies, nor actually against the law to get rid of the damned things altogether. This is like security to me: doing the minimum, or even going against the spirit in order to avoid doing more than you absolutely totally must do, is the problem with a LOT of management and developers.

  20. arachnoid2

    Door Locks

    Software security is like a door lock. Some are purely functional on a visual level and some are designed to be more secure (but may not be); as ever, ways round them are eventually found. The very fabric that holds them in place may itself be insecure, so the type of lock used may actually be irrelevant. Some people even leave their doors unlocked at night because "they live in a safe area".

    Double glazed doors have been fitted with 3-point locking systems for years and still are, yet they rely on an easily defeated Euro barrel, and the foam door panels or the glass panels are easily removed.

    The point is you can keep supplementing your door locks with push locks, dead bolts or door stops/bars, but at the end of the day the level of security has to be a compromise with the actual "perceived threat level".

    It's more a case of what level of security is needed in your case:

    A: a fudge (a "security camera in operation" label)

    B: adequate

    C: overboard (block the doorway up to such an extent people stop using it because of the hassle involved)

    Another example would be a GPS tracker "anti-theft device" on a vehicle:

    A: They do absolutely nothing to prevent theft

    B: They are easily defeated by career criminals

    C: They rarely result in return of the vehicle and its contents intact

    Yet sales of these devices are surging on the back of rising numbers of vehicle thefts in the UK.

    When all is said and done, how many on here have an easily defeated Yale or Euro lock on one of their access doors at home?

  21. hoola Silver badge

    Simplistic View

    The trend for a number of years has been Agile. All that matters is delivering stuff fast and, if you are lucky, fixing the worst of the bugs as you go. Nobody is interested in proper testing or security because it slows down the release cycle of the shite that is spewed out. Automated testing is seen as progress, but it never seems to occur to people that if you test to get results, and the people who wrote the tests also developed the software, it is a recipe for failure. This is overlooked because all the tests usually pass.

    Management are happy because they see lots of stuff happening and can honk on about how productive their developers are. Security and infrastructure teams get increasingly marginalised because they are seen as blockers to the business of releasing stuff quickly.

    Even where there is really high regulatory compliance there are still errors.

    Often the only time the back-end teams get involved is when it is too late and there has been a breach. Then it is all about closing the stable door; however, this only lasts in the short term, because very quickly we are back to square one and the cycle continues.

    Mostly companies get away with it because the resulting holes in security are not sufficiently bad to lose customers' money directly. Very occasionally something like the BA fiasco hits the news.

  22. Howard Sway Silver badge

    80 per cent appeared to be shifting the blame to developers for not doing their job correctly

    So who hired those developers? Who manages those developers? Whose job is it to identify and correct fundamental problems within the organisation? To provide the resources needed to produce the products and services that the company depends on?

    This figure really shows that 80 per cent of managers are clueless about modern IT-enabled business, and have failed to understand that what is required from them is a lot more than "Ooh, that looks really cool on my phone". But why bother learning what you need to learn for this century, when you can just chuck the blame around for mistakes caused by your own lack of involvement in the hard work, whinge about how useless these geeks are, and concentrate on the really important stuff: thumbing through "executive manager" magazines and feeling like you've really made it.

  23. Anonymous Coward
    Anonymous Coward

    "Then there’s the cloud applications angle, with Dynatrace research finding that 71 per cent of CISOs aren’t fully confident that code isn’t free of vulns before going live in production."

    WOW, WOW, here. Let me get this straight: 29% of CISOs think all is well with El Cheapo-developed apps that have zero security requirements?

    Like security is the norm, growing on trees?

    Geez.

    Mandatory XKCD is calling: https://xkcd.com/327/

  24. EnviableOne

    Blame to go round

    there is plenty of blame for the current situation, and it can be flung at all levels.

    The issue is how to fix it, and that needs a multi-threaded response:

    Education: Teach people secure coding, stop them from writing insecure stuff in the first place.

    (don't tell me this is done already; the OWASP Top 10 hasn't materially changed in a decade)

    Enablement: Allow time to properly write secure code, don't allow it past checkpoints if it isn't secure, tested and documented.

    Enforcement: Make companies criminally and financially responsible for anything that is lost by exploiting their insecure code.

    1. Charles 9

      Re: Blame to go round

      Trouble is, each thread already has roadblocks:

      Education: No one wants to learn. You can't fix Stupid.

      Enablement: The check-cutters don't care and will fire anyone who dares defy them at the drop of a hat.

      Enforcement: They also likely have the clout to evade or change any kind of enforcement you can think of.

      IOW, the only people with the actual power to make it happen either don't care or are actively against it.
