Complexity has broken computer security, says academic who helped spot Meltdown and Spectre flaws

Complexity has broken cybersecurity, but a reappraisal of computer science can keep us safe. So says Daniel Gruss, assistant professor in the Secure Systems group at Austria's Graz University of Technology. Gruss and his colleagues discovered some of the biggest recent security snafus, including the Meltdown and Spectre …

  1. Snake Silver badge

    Hmm...

    So it takes an "expert" to state the obvious before people will believe it, even though I've been saying it for quite a while now. It's not a surprise; the only "surprise" is that something so obvious wasn't an acknowledged truth throughout the tech world. The arguments of "this system is more secure than that!" would have been far less...astringent...because fundamentally no system is "secure": we simply can no longer examine all possible system interactions of hardware, firmware and software. The best that we, as failure-prone humans, can strive for is "best try".

    1. Anonymous Coward
      Anonymous Coward

      Re: Hmm...

      But as things get more and more complex, the risks involved will only increase, to the point that the overall risk (especially when compounded again and again) becomes too high to ignore. Plus, all these systems raise the potential damage should something get through, meaning even a low risk can't be ignored if the damage from that risk occurring is high enough to worry about.

    2. Version 1.0 Silver badge
      Meh

      Re: Hmm...

      While I'd agree that no system is "secure", there are systems that have virtually never been hacked, e.g. CP/M, RT-11, RSX-11M, etc. One factor the article mentions is that these were small systems; it's a lot easier to secure a system that's 10KB, 100KB, or 5MB than an operating system like Windows 10 needing 15GB to install. The bigger the operating system, the more opportunities for holes.

      Even if the operating system is 100% secure, you can't install an application, or access the internet, without risking completely compromising your security.

      El Reg - I'm still waiting for a new icon (a pair of wire cutters) to define security!

      1. doublelayer Silver badge

        Re: Hmm...

        There's another problem, which is that some of the old operating systems you mentioned, or could have mentioned, were never subjected to the attack landscape we now force our operating systems to undergo. Early operating systems basically ran themselves and one program, so they didn't have to care about multiple programs running together. Even when they did implement this, the major concern was making sure those programs didn't crash or modify each other's state, not securing one program's operations against another's. Memory protection came along and we took steps to insulate one process's memory from another's, but again this was more to make code correctness easier than a primarily security-focused decision. Even in the late 90s, Windows gave every user local admin, with the concept of user accounts only serving to help organize things.

        Now we expect a secure system to do many more things that make it harder. We expect a hypervisor to run VMs with total privacy between them (and sometimes between them and the hypervisor). We expect that a multi-user operating system will only allow a user and root to access any of that user's resources. We even expect that processes run by the same user will be sandboxed from some of the available resources unless a separate permission is granted, and it is the OS which has to provide for and ensure that. The operating systems we used to have were not only much smaller, they also were not expected to provide the kind of security we need today.

        1. Peter Gathercole Silver badge

          Re: Hmm... @doublelayer

          Your comment about single-task operating systems was correct about CP/M, mostly correct about RT-11, but completely wrong about RSX-11M. It was a multi-user, multi-tasking operating system, complete with memory address space isolation and a pre-emptive scheduler, together with real-time features.

          But it really does help when the people designing the OS also had quite a lot of input to the hardware as well.

          However, even RSX/11M was not immune from errors. It certainly was not 100% secure or bug free. The huge pile of patch bulletins that we had for the system I looked after definitely proved that.

        2. Paul Crawford Silver badge

          Re: Hmm...

          Even in the 70s the idea of multi-user systems with security between accounts was pretty normal in the UNIX and minicomputer/mainframe world. The growth of the PC using 16-bit DOS/Windows was a major step back in that they had no real security, but that returned with the Windows NT/2000 series that took over.

          The biggest change and threat has been the web. In the "old model" of computer administration the superuser would install trusted programs and the OS ensured they were separated per-user when run. Now we have web browsers running arbitrary code from $DEITY-knows-where, and we are scrambling to stop it exploiting the holes in the browser & OS to do bad things.

          And don't get me started on the whole IoT crap and every fsking product having a web server in it (routers, printers, web cameras, etc) that are never patched...

          1. doublelayer Silver badge

            Re: Hmm...

            You are correct about that, but even the mostly secure Unix and other multi-user operating systems weren't designed for situations where one user would need very strong isolation from others. Random examples of this still exist; it's still possible, for example, to read the command lines other users' processes were started with. This is usually not critical, but it's an example of an earlier part of the design where security really wasn't a factor. Other examples exist, such as when password files were stored world-readable, or the ability to have a file whose permissions are inconsistent with those of its containing directory. These are all small and relatively unimportant, but compared to now, when we're trying to limit processes' disk access inside a user account, they look a little anachronistic. The main reason we don't care much about the few of these that remain is that multi-user systems are used less frequently; our personal machines usually have only one user account logged in at any time (assuming they even have more than one), and most other systems use VMs for small sets of people rather than one big system with open login for the whole institution.
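
            That file-versus-directory quirk is concrete enough to demonstrate. A minimal sketch (plain Python, all paths hypothetical): create an owner-only directory containing a file whose own mode bits claim it is world-readable. In practice the directory blocks other users, but the file's metadata disagrees with its surroundings.

```python
# Toy demonstration of the inconsistency mentioned above: on Unix,
# a file's mode bits can be more permissive than those of the
# directory that contains it.
import os
import stat
import tempfile

root = tempfile.mkdtemp()
private_dir = os.path.join(root, "private")
os.mkdir(private_dir)
os.chmod(private_dir, 0o700)   # directory: owner-only access

open_file = os.path.join(private_dir, "open.txt")
with open(open_file, "w") as f:
    f.write("secret\n")
os.chmod(open_file, 0o644)     # file: claims to be world-readable

dir_mode = stat.S_IMODE(os.stat(private_dir).st_mode)
file_mode = stat.S_IMODE(os.stat(open_file).st_mode)
# The directory denies other users in practice, yet the file's own
# metadata says anyone may read it.
print(oct(dir_mode), oct(file_mode))
```

            Neither setting is "wrong" on its own; the point is that the security outcome depends on the interaction of the two, which is exactly the sort of thing early designs didn't treat as a security property.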

            1. A.P. Veening Silver badge

              Re: Hmm...

              > most other systems use VMs for small sets of people rather than one big system with open login for the whole institution.

              I am afraid you are overlooking the midrange and mainframe computers, but there that part of security is usually pretty solid.

              1. doublelayer Silver badge

                Re: Hmm...

                They're also used for different tasks nowadays. In decades past, normal users might do most of their work by logging into the mainframe from basic terminals or terminal programs running on relatively weak computers. They'd do their standard work on that system, which was also running the important software, since the mainframe was the only large system available. Now, mainframes are still used to run the large software projects (some of the time), but users usually don't have to log into the mainframe to read their corporate email or access intranet-type services. This means a lot of users probably don't get accounts on the mainframe, and therefore the worries that an unprivileged user will find a way to attack it, or let someone else on to do the same, are reduced. It doesn't make it perfect, but it does make it easier.

          2. Brewster's Angle Grinder Silver badge

            Re: Hmm...

            "Now we have web browsers running arbitrary code from $DIETY-knows where and scrambling to stop them exploiting the holes in the browser & OS to do bad things."

            All told, browsers do a bloody good job. When was the last time a web site could steal files on your hard disk or download and launch an EXE? (And I don't think we can blame browser manufacturers for architectural flaws - like Spectre and Rowhammer - which anyway they've taken steps to block once they became apparent.)

            Most bad things can happen because web sites necessarily have access to the network, and so can exploit security flaws in other websites. There are privacy issues, which are a hard problem in any environment. And there is the social engineering of tricking people. But given how much unverified software is downloaded and run every minute of every day, they are ridiculously successful at keeping everyone safe.

            Don't believe me? Ask the crims. Tricking people into opening malware attachments remains their mainstay.

      2. Steve Davies 3 Silver badge

        Re: Have an upvote

        for mentioning RT-11 and RSX-11M. Back in the day, I spent many an hour/day writing device drivers for both those OSes.

        The old model of 'security through obscurity' seems to have morphed into 'security through complexity', and the downside is that so many developers really don't grok writing safe, secure and unbloated code.

        'Hello World' should not be a 5MB (or more) executable.

      3. ThatOne Silver badge

        Re: Hmm...

        > there are systems that have virtually never been hacked

        Don't confuse correlation and causation. Computers are, historically speaking, a very recent thing, and in the beginning there were few if any real threats: somebody might have wanted to prank you, but there wasn't yet a whole, well-organized criminal industry making a living from hacking.

        Of course you're right that complexity increases the attack surface, but the chances of somebody trying have increased way more dramatically in the last decades.

        1. Doctor Syntax Silver badge

          Re: Hmm...

          "Don't confuse correlation and causation"

          But don't ignore it; ask how it comes about.

        2. Doctor Syntax Silver badge

          Re: Hmm...

          "too high to ignore"

          You have to wonder just how high that might be for some people.

      4. Dan 55 Silver badge

        Re: Hmm...

        In a single-user operating system with no memory protection (i.e. a fancy bootstrapper and executable program launcher) there's no such thing as an attack anyway; the OS lets you run any program, which can do anything it wants, and that's by design.

        1. doublelayer Silver badge

          Re: Hmm...

          Mostly correct, but this is not always the case. The perfect example of this is a hypervisor. The program it runs is specifically another operating system, but the purpose of the hypervisor is to provide resources for the program and restrict it from affecting things that aren't in its virtual environment. Or, at a different level, operating systems for embedded devices often run one program but still restrict it from doing certain things. For example, an OS I'm using for a small device handles Bluetooth for the application running on it, and so as much as the program that is loaded may want to modify the memory reserved for Bluetooth protocol operations, the OS will not permit it to do so (at the moment, it alerts the developer on its own, kills the program, and loads another one allowing a debugger to be attached).

          1. Caver_Dave Silver badge
            Boffin

            Re: Hmm...

            And there are certifiable hypervisors that allow the integration of all the old separate boxes in a system into one, with differing levels of certification for each guest!

            For instance VxWorks - Helix™ Virtualization Platform

    3. a_yank_lurker

      Re: Hmm...

      No system that has connections with the outside world is truly secure; some are just more secure than others. This is even true in biology and the paper-based world. At best you have defense in depth that is sufficiently robust to give you time to react and stop the attack.

    4. Smartypantz

      Re: Hmm...

      Of course this is the case. Anybody with half a brain can see that the digital world's mechanics are merging with the well-known mechanics of biology (essentially the same at a certain degree of complexity). Both systems, due to the sheer amount of random inputs, conform to the ancient laws of ecology. Eventually cyber threats to our digital infrastructure will be treated in exactly the same way biological threats to our organisms are treated.

      1. ThatOne Silver badge
        Devil

        Re: Hmm...

        > Eventually cyber threats to our digital infrastructure will be treated in exactly the same way biological threats to our organisms are treated

        By denial?

    5. fidodogbreath

      Re: Hmm...

      > So it takes an "expert" to state the obvious before people will believe in it

      That's the basis of the consulting industry, after all.

      1. Doctor Syntax Silver badge

        Re: Hmm...

        Of course. Experts charge more so that those who know the price of everything and the value of nothing pay more attention to them.

    6. Blackjack Silver badge

      Re: Hmm...

      Software bloat has been a complaint almost as old as software itself.

      KISS is unfortunately almost always ignored.

    7. runt row raggy

      Re: Hmm...

      by today's standards, hacking cp/m is trivial. all you need is to swipe the floppies. there was no encryption.

  2. Doctor Syntax Silver badge

    "the complexity of computers and networks now approaches that of structures, organisms, and populations seen in biology"

    For some value of approaches.

    1. DoctorNine

      Well, some approaches have no value at all. At least according to certain female friends of mine.

      I'm sure there is some biological reason for this datum.

    2. Blackjack Silver badge

      When people used to talk about surfing the Web they didn't think it was like the sea.

  3. Captain Kephart

    As has been said for some time ...

    In 400BC Herodotus said "If one is sufficiently lavish with time, everything possible happens" ...

    Which somehow feels relevant? Anyway, great quote!

    Ciao, Captain K

    PS: On those formal systems, anyone read Gödel's incompleteness theorems recently? ... they prove, rigorously, that absolute security was NEVER achievable ... see: https://en.wikipedia.org/wiki/G%C3%B6del%27s_incompleteness_theorems

    1. Paul Crawford Silver badge

      Re: As has been said for some time ...

      I don't worry about absolute security being unachievable.

      I worry about pi55-poor security being the norm.

      1. ThatOne Silver badge

        Re: As has been said for some time ...

        > I worry about pi55-poor security being the norm.

        Actually "no security" is the norm. Because security is not an investment, just good money thrown out of the window for no reason at all. Ask any bean counter.

        Of course you risk losing money without security, but as any gambler will tell you, people "feel lucky". So good luck convincing the bean counters to spend money preventing an incident they know won't happen to them (to others maybe, but certainly not to them).

    2. Brewster's Angle Grinder Silver badge
      Holmes

      Re: As has been said for some time ...

      Incompleteness doesn't say perfect security is impossible. Incompleteness is just *waves hand* the halting problem.

      There are some programs we can prove will terminate, some programs we can prove won't terminate, and some programs we'll never be able to prove one way or the other. Writing a provably secure program means limiting yourself to programs that we can prove will halt (or using only statements that we can prove are true).
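
      That trichotomy is exactly Turing's diagonal argument. A toy sketch (the names are mine, and real non-termination is replaced by a returned string so the thing can actually run) of why no single halting oracle can be right about every program:

```python
# Toy version of Turing's diagonalization: any claimed halting
# oracle can be defeated by a program built from the oracle itself.
def build_contrarian(halts):
    """Return a program that does the opposite of whatever `halts`
    predicts for it. ("loops" stands in for running forever.)"""
    def g():
        if halts(g):
            return "loops"   # oracle said we halt -> we "loop"
        return "halts"       # oracle said we loop -> we halt
    return g

def optimist(fn):    # oracle claiming every program halts
    return True

def pessimist(fn):   # oracle claiming every program loops
    return False

for oracle in (optimist, pessimist):
    g = build_contrarian(oracle)
    predicted = "halts" if oracle(g) else "loops"
    assert predicted != g()   # the oracle is always wrong about g
```

      Any total, always-answering `halts` falls to the same construction, which is why termination (and hence full security) proofs only work for restricted classes of programs, e.g. those built from bounded loops.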

  4. Anonymous Coward
    Anonymous Coward

    Seems that this article prizes hindsight.....just what we need!!!!

    Quote: ".....possible to make a system provably secure – with great effort ...."

    *

    No mention of the relationship between risk and effort? As in -- the higher the risk, the greater the need for security resources.

    *

    Which begs the question: Is anyone doing comprehensive risk analysis? Say at Intel? Or at Amazon? Or at Equifax?

    *

    Just saying!

  5. John Smith 19 Gold badge
    Unhappy

    Echoes Harlan Mills, "Software Productivity" 1988.

    Page 84

    "OS/360, as a multilanguage processor, seems better regarded as a 'natural system' than a rational system at this point in time."

    "If we regard OS/360 as a natural system, like a cow, we are in a much healthier mental condition"

    Saying something truly new in the IT business is hard.

  6. jvf

    mmf mmf

    Wow! Someone managed to write another complete article using the bullshit generator.

  7. Mark192

    KISS

    As hardware becomes cheaper and faster, could we tailor the parts in a system to make each less capable, in order to reduce complexity?

    The aim would be to reduce attack vectors by making many things simply absent from a particular part of the system (they're either not needed, or the system can do its job sufficiently well without them).

    1. Charles 9

      Re: KISS

      No, because then you run into necessary complexity. Performance demand is macroeconomic, as the article notes, and people won't be willing to go back. Even if you cite something like Turing's halting-problem proof, they'll just fire back, "Then build a HYPERcomputer!"

  8. Julz

    Complexity

    Hasn't broken computer security; computer systems have been very complex for a very long time. The uncontrolled interaction of components has. Building the software/hardware tower has always relied upon the behavior of sub-components being deterministic, predictable and to spec. That way the components further up the stack can be designed and tested on solid foundations. What has happened is that these foundations are no longer as secure as you might hope, not to mention the woeful lack of testing. There are many reasons for this, but the main drivers would appear to be economic ones, closely followed by competency and a lack of appreciation of how hard the task is.

    You get what you are willing to pay for and we don't seem to be willing to pay what would be needed for secure systems. Or to put it another way, we are not willing to pay the cost of systems that would actually do what they say they should do and plump for the good enough option, hoping that the compromise doesn't bite us in the bum later.

    1. Man inna barrel

      Re: Complexity

      > You get what you are willing to pay for

      Security, for most people, is like insurance. You do not really want to pay for it, as it does not do anything useful. Well, until something bad happens, that is. Backups have a similar status as far as most users are concerned.

    2. Anonymous Coward
      Boffin

      Re: Complexity

      I think you are making the mistake of saying that because a number is big it can't get bigger.

      Saying 'systems have been very complex for a long time' is like saying 'hydrogen bombs are very bright'. Yes, they are, but a hydrogen bomb going off pressed against your eyeball is less bright than a supernova observed from as far away as we are from the Sun ... by a factor of a billion. (xkcd).

      So yes, systems have been very complex for a long time, but during this whole time the complexity has been increasing, probably approximately exponentially, if not even more aggressively than that.

      1. Julz

        Re: Complexity

        A long time ago they reached the point of complexity at which one or even a small number of people could not fully grok the interactions of all of the components. Adding more complexity after that point makes no difference to the problem.

        1. Charles 9

          Re: Complexity

          But at the same time, you can't simplify it, either, as a lot of that complexity has become necessary. The biology angle seems appropriate here, as it's a lot like our current virus issue. We can't stop it, and even containing it is proving problematic, and in the meantime people are dropping and livelihoods are getting shredded. Technological security may well have become a dilemma for which there is no satisfactory solution, yet people are too hooked on it to go back barring an utter catastrophe (the "typo kills ten million people" kind of catastrophe).

  9. Anonymous Coward
    Anonymous Coward

    Security investment

    Or maybe "investment in security has been relatively low when compared to investment in other features"?

    1. Charles 9

      Re: Security investment

      Probably because RoI is so low in most places. Unless you're in a secure-or-die environment, it's cheaper to just lawyer and BS around any faults.

      1. Doctor Syntax Silver badge

        Re: Security investment

        Unfortunately the costs of failure are still insufficiently high. Some of the breaches we've seen ought to have brought down the companies. It's not hard to think of a few who ought to be remembered only in MBA courses as case studies in failure. I can only think that C-suite members simply think "There, but for the grace of God, go I" and continue doing business with them.

  10. John Smith 19 Gold badge
    Unhappy

    And using Turing-complete protocols doesn't help either.

    Turing-complete protocol --> full Turing machine needed to run it.

    So it's a computer regardless of what it's rendering/moving/etc

    So it's vulnerable to a)Security flaws in the security model of the protocol (if it has such a model to begin with) b)Security flaws in the protocol generally c)Security flaws in the processor security model d)Security flaws in the implementation.

    And once they are exploited you have an arbitrary process running loose inside your machine.

    And then there are those f**king "management" processors that the mfg trusts but which cannot be audited.

  11. Binraider Silver badge

    Admitting that your systems are so complex you cannot possibly hope to control every risk is the first step. Getting the supply chain to do what you need is the second. Old, simple systems have their advantages, and in critical functions we should be looking to those examples.

    1. Charles 9

      Until you find out that complexity is necessary: possibly because of a need for agility or so on. Simplicity doesn't always work: otherwise, we'd all be driving cars with rotary engines.

      1. Doctor Syntax Silver badge

        "Until you find out that complexity is necessary: possibly because of a need for agility or so on"

        The simpler things are the easier they are to change, let alone change safely. If "agility" makes something more complicated by adding another layer it's likely it was already too complicated.

        1. Charles 9

          Point is, it may not be possible to simplify things to the degree you desire because something turns out to be there for a damn good reason. Just like how reciprocating engines seem to have beaten out rotary engines in cars (mainly because they're easier to tune to the complexities of the real world: a necessary evil when you're required by law to eke out the last drop of oomph from your fuel tank).

  12. John Deeb
    Holmes

    subject is considered a formal science?

    "computer science needs to rethink itself. Today, he said, the subject is considered a formal science. Gruss said that needs to change, for two reasons."

    This only makes sense to me when it would read "considered NOT a formal science". You don't want to downgrade it from science, right?

    Complexity might have broken this article too, then. A simple proofread could help? The whole thing reads a bit like a class at the University of Obviousness. Security is just a complex word, like freedom; it affects psychology, technology, policy, equipment and politics alike. This way it might forever fall outside pure science. Like criminals fall outside laws, or at least find a way to try.

    1. Anonymous Coward
      Boffin

      Re: subject is considered a formal science?

      By 'a formal science' he means something like mathematics, where there are axioms and proofs, and experiment is not needed. You don't need to do experiments to show that there is no algorithm by which a Turing machine can determine whether an arbitrary Turing machine program will halt, say: you can prove that starting from a bunch of axioms, the way Turing did.

      Compare this, say, to physics, which is an empirical science: in order to do physics you start by doing experiments, and build a mathematical model which purports to describe those experiments, and makes predictions about other experiments. Then you keep doing experiments and comparing their results with what the model predicts. At the point that the model disagrees with experiment, it's wrong and you need a new model.

      He wants CS to become more like an empirical science, I think.

  13. Anonymous Coward
    Anonymous Coward

    This.

    Putting systems on the cloud is like taking all bathrooms and loos out of houses: people will crap on the pavements and wonder why they can smell each other.

  14. Doctor Syntax Silver badge

    Security is apt to be traded off for something else and not necessarily cost, at least not direct cost. In the case of Meltdown, etc. it was traded off for performance. Another trade-off is often convenience.

  15. Brad16800

    If you want to go down the biology thinking, no system is secure. Current world situation as an example.

    It's all just best effort.

  16. Brewster's Angle Grinder Silver badge

    I've always thought the first truly sentient computer program will be a piece of malware - or perhaps a network of malware. Its creation will be entirely accidental.

  17. SNaidamast

    The article brings up some rather salient points regarding the security of systems. The problem however, is you cannot make such systems secure by adding ever more complex technologies to the endeavor. The answer to this issue is to return to more secure processes, which are far more difficult to breach than current systems are.

    For example, the rush to do banking online via mobile devices has become a goldmine for cyber-criminals wanting to breach people's accounts. Many people use such services over public WiFi networks, where anyone with a sniffer can literally scoop up data over the public airwaves and then simply use brute-force techniques to crack the security at their leisure. Instead of waiting to use such online services from a more secure connection at home, people simply ignore such safeguards to do their banking at their convenience. But whose convenience?

    Another aspect of this overall situation is companies' insistence that all applications must go on the Internet/intranet. Instead of prioritizing low-risk applications as those that can go on the Internet, while high-risk applications are developed using binary protocols within client-server infrastructures on localized LANs, companies simply ignore such thinking for convenience.

    In the end, the problem is not the technologies but how they are used combined with the lazy proclivities of people who simply want convenience over minor impediments to their daily lives. If people were to make some minor changes in their lifestyles, you would probably see a lowered number of breaches taking place. However, in a world gone mad over the latest technological refinements made available, this will most likely never happen leaving security researchers to continue chasing after a silver-bullet to the world's woes...

    1. Charles 9

      So what do you do when your job depends on containing Dave? Given how many Swords of Damocles are lying around, it may be reaching a point where tackling Dave may be the only way to keep things not reasonably secure, but simply secure enough to JGSD, especially in a world where spare time for many people is dipping into the negative.

  18. Anonymous Coward
    Anonymous Coward

    Termination

    The problem isn't the systems; it's impossible to secure everything. The problem is the crooks. All crooks should be sent to suffer in a Gulag in perpetuity, with no more access to IT ever.

  19. Claptrap314 Silver badge

    So tiresome

    Hearing people yammering about using biological systems as a model for computer security.

    Evolution does a reasonable job in developing DNA which is successful in perpetuating itself. But if you look at the larger picture, most species average less than one in two producing offspring. One in a hundred is not unusual. I would call that a bad model for security.

    But it gets worse. Much, much worse.

    Take the deer. With excellent hearing, speed, and eyesight, they represent a significant difficulty for predators. Humans work in teams to exhaust them.

    Or, you know, to develop weapons. Spears, bows, guns, .... rifles. Which we can produce by the millions.

    Random processes do a lousy job when faced with custom-designed countermeasures. Just don't.

    1. Charles 9

      Re: So tiresome

      "One in a hundred is not unusual. I would call that a bad model for security."

      But the point is, evolution tends to take everything into consideration to get the odds right. 1-in-100 odds? Use strength in numbers. But you notice that doesn't happen for larger animals like cows: ruminants tend to have to stay on the hoof, thus their young are born live and tend to be able to get up and move quickly afterward, for that reason.

      "Take the deer. With excellent hearing, speed, and eyesight, they represent a significant difficulty for predators. Humans work in teams to exhaust them."

      Not just humans. There's a reason wolves, coyotes, and related predator species tend to work in packs. Even lions tend to gang up on their prey. Thing is, they don't tend to carry their food around with them, and one maybe two tends to keep them full, so it's not like they herd huge numbers off cliffs; doesn't fit their style.

      "Random processes do a lousy job when faced with custom-designed countermeasures. Just don't."

      But they have the advantage of time and numbers. As long as they stay ahead of the curve, things tend to work themselves out. Deer aren't that endangered yet, are they?

      1. Claptrap314 Silver badge

        Re: So tiresome

        We're talking about an individual deer trying to keep his business from collapsing. He does not care if small business continues to survive as an industry. He wants to be able to feed his own personal family on the basis of his own personal business not being burned to the ground by a hacking attack.
