Security by obscurity not so bad after all, argues prof

Security by obscurity may not be so bad after all, according to a provocative new research paper that questions long-held security maxims. Kerckhoffs' principle holds that withholding information on how a system works is no security defence. A second accepted principle is that a defender has to defend against all possible …


This topic is closed for new posts.
  1. trarch

    You missed a bit

    Let me just fix this part:

    "Security by obscurity may not be so bad after all ***when used as an additional layer of defense***".

    That's better. Surely this is obvious to anyone, though, since any extra layer of security is a good thing.

  2. Anonymous Coward



    I don't know of anyone who has said *using* security by obscurity is bad.

    Everyone I know says that RELYING ONLY on security by obscurity without any other security layers is bad.

  3. Dr Who

    If a VIP goes on a trip to a dangerous place do you :

    1) Publish detailed information about precisely when she will be where alongside details of your precise security strategy for each location, transit routes in between and security details during transit. Have all of this information scrutinised by an ad hoc network of independent security experts from around the world to see if they can find a vulnerability. At the same time of course, your adversaries get to scrutinise your plan too.

    or do you :

    2) Not announce the visit until it's already underway. Withhold details of the trip from all but those who absolutely need to know, and even then tell them only the details they need in order to do their job.

    There is certainly a debate to be had. Option 2 could allow the security detail to get sloppy. However, I think when looked at this way, the answer isn't quite as obvious as it first seems.

    1. Kevin (Just Kevin)

      The VIP...

      The point previous comments have made is that in both your cases, there is strong security for the VIP visit. Obscurity makes it more secure. Security PLUS Obscurity.

      The general maxim about Security BY Obscurity is a warning not to believe something is secure just because information about it is obscure. The comparative metaphor would be: the VIP is coming to visit, so give her an airline ticket, cab fare and a map of the town and send her on her way. And don't tell anybody. She'll be safe - nobody knows she's coming. THAT'S BAD.

      Your Option 2 is clearly better than Option 1 but both rely on the existence of good security.

    2. John Miles

      re: a VIP goes on a trip to a dangerous place do you

      doesn't that depend on the VIP ;-)

    3. BristolBachelor Gold badge

      Tell me, when you have an operation, do you:

      a) see a person who has been accredited by an open place of learning (a university), has certificates on their wall, and has people checking that what they do is safe.


      b) see a person who says "I know what I'm doing, honest guv'nor", but who has never been to university, has no degree, no certificates on their wall, and no-one checking them, etc.

      Now how obvious is the answer? Yes, it is possible that, just by himself, guy (b) is better than guy (a), and you would be much better off with him, but how do you know?

      1. Peter H. Coffin

        But that is selection for competency, not selection for security. This goal is not that goal.

  4. Torben Mogensen

    Secrets only work as long as they are ...

    ... secret.

    That is the basic tenet of the idea that security by obscurity doesn't work. But one of the purposes of security is to keep secrets, so you can argue that if you can't keep your security measures secret, you can't keep any data secret. So in some sense security _is_ obscurity.

    What you don't want is for the leakage of one secret to reveal them all. If your sole defence is a secret (weak) encryption method, leakage of this _will_ reveal all your secrets. In the same way, leaking one encryption key (to a strong encryption scheme) should not reveal all your secrets. So you need to make a diagram of dependencies: what information will reveal what other information. And if a small set of items will reveal a large fraction of the remaining information, your security just isn't good enough.
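    The dependency diagram described here is, in effect, a directed graph plus a reachability check. A minimal Python sketch (the secret names and edges below are invented purely for illustration):

```python
# Model "knowing X reveals Y" as a directed graph, then ask what an
# attacker learns from a single leak via transitive closure.
# All secret names and dependencies here are hypothetical examples.

def reachable(deps, leaked):
    """Return every secret exposed once `leaked` is known."""
    exposed, stack = set(), [leaked]
    while stack:
        secret = stack.pop()
        if secret not in exposed:
            exposed.add(secret)
            stack.extend(deps.get(secret, []))
    return exposed

deps = {
    "master_key": ["db_password", "backup_key"],
    "db_password": ["customer_records"],
    "weak_cipher_design": ["all_traffic"],
}

# A single leak of the master key exposes most of the rest -- by the
# argument above, a sign the security "just isn't good enough".
print(sorted(reachable(deps, "master_key")))
```

    If any single node reaches a large fraction of the graph, that node is exactly the weakness the dependency diagram is meant to expose.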

    1. Gideon 1

      Obscurity is security when it is a

      one-time pad. Other methods depend on the effort the attacker has to put into cracking them. Obscuring the algorithm doesn't put much of an impediment in the attacker's way.
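      For reference, a one-time pad is nothing more than XOR against a truly random pad as long as the message, used once and then destroyed. A minimal Python sketch:

```python
import os

def otp_xor(data: bytes, pad: bytes) -> bytes:
    """XOR each byte with the pad; the identical call also decrypts."""
    assert len(pad) >= len(data), "pad must be at least message-length"
    return bytes(m ^ p for m, p in zip(data, pad))

message = b"attack at dawn"
pad = os.urandom(len(message))  # truly random, never reused

ciphertext = otp_xor(message, pad)
recovered = otp_xor(ciphertext, pad)  # XOR is its own inverse
assert recovered == message
```

      Without the pad, every plaintext of the same length is equally likely, which is why this is the one case where the "obscurity" (the pad) really is the security.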

      1. NomNomNom

        the fatal flaw with one time pads is you can only use them once

      2. PyLETS

        One time pad rarely useful

        For this to work you need an alternative communications channel for communicating the pad which is more secure than the channel you are trying to protect. This works for the diplomatic services, which can send trusted couriers to embassies abroad on a regular basis. I can't think of many other contexts where it is useful, and can think of far more where it isn't.

      3. Francis Vaughan

        One time pad

        Which of course brings us back to the key exchange problem - and essentially full circle. Perhaps back to the good old days of a courier with a briefcase handcuffed to his arm.

  5. Gideon 1

    Not a game

    Security is not a game, so game theories aren't relevant. The consequences of a code being cracked are normally far worse than those of a game, and relying on the likelihood of an attacker working it out, rather than having to use brute force, is a poor bet given the potential consequences of a successful crack.

    Obscurity will only delay an attacker; they will use better tools to unpick the algorithm, just as researchers used a scanning electron microscope on MIFARE cards.

    1. Tim Brown 1

      Just to clarify

      When academics talk about 'game theory' they don't mean games as in recreational activities. It just means any scenario in which there are two (or more) parties with differing objectives.

    2. NomNomNom

      I totally agree. As soon as he said security was a game I stopped reading.

      I'd like to see him try and explain to the families of terrorist victims that security is just a "game"

    3. Anonymous Coward

      That's really funny! Security is not a game, it's serious, therefore game theory does not apply...

    4. Anonymous Coward


      I think you misunderstand what is meant by "game theory" - I recommend you do some research and then you will find it is far more than just how to win at Connect4.

      Wikipedia is, as always, a good starting point.

      1. Gideon 1

        Try reading Pavlovic's "Final Comments". He blithely assumes you can obscure your own algorithms. This is what makes the paper silly junk science.

    5. Eddie Edwards


      It can be modelled as a game, so game theories are relevant. Worse consequences are modelled as higher costs in the game. You multiply those costs by probability and hey presto you have expected cost.
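      The expected-cost arithmetic described here is a one-liner once the outcomes are written down. A toy Python sketch (the scenarios and figures are invented):

```python
# Each outcome: (label, probability per year, cost if it happens).
# All figures below are made up purely to illustrate the arithmetic.
scenarios = [
    ("minor breach",        0.1,   10_000),
    ("major breach",        0.01,  1_000_000),
    ("catastrophic breach", 0.001, 50_000_000),
]

# Expected cost = sum over outcomes of probability * cost.
expected_cost = sum(p * cost for _, p, cost in scenarios)
print(f"expected annual cost: {expected_cost:,.0f}")
```

      Modelling worse consequences simply as bigger numbers in the cost column is exactly what lets game theory handle "serious" stakes.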

    6. phuzz Silver badge

      The word "Game" in 'Game Theory' is misleading you. While it can apply to games, it's more generally a study of any interactions where there is some competition or contest between the participants. The idea of mutually assured destruction comes from game theory (and the consequences are a bit worse than a code being cracked).

      Go read up on it a bit, and look past the name.

    7. Anonymous Coward

      someone doesn't understand game theory.

    8. teebie

      Misunderstanding of terms

      Game theory is not the study of how to win at tiddlywinks.

    9. BoldMan

      and here we have someone who doesn't understand that "Game Theory" isn't actually a game!

    10. Drem

      but games can be important

      Game Theory is not just about playing games.

      From Wikipedia: "In mathematics, game theory models strategic situations (games) in which an individual's success depends upon the choices of others (Myerson, 1991). It is used mainly in economics, political science, and psychology, as well as in other formal sciences such as logic, and in biology. While initially developed to analyse competitions in which one individual does better at another's expense (zero-sum games), it has been expanded to treat a wide class of interactions, and has developed into an umbrella term for the logical side of science, covering both human and non-human actors, such as computers. A classic use is the description of equilibria in games, in which each player has adopted a strategy that cannot improve his outcome, given the other players' strategies."

    11. DryBones

      The Game

      You just lost. ;)

  6. Dodgy Geezer Silver badge

    As a security professional...

    ...I have been saying this to my clients for many years.

    But I haven't been publishing that advice, or talking about it at conferences, because I wanted it to remain secret...

  7. Daniel 1

    Security by being obscure, more like

    The argument holds true in a target-rich environment, where the only thing to be had from a given computer is a collection of photographs of naked people doing athletic - but strangely unemotional - things to one another, plus the login details for the likes of Facebook or Twitter.

    If, on the other hand, your computer is full of blueprints for stealth helicopters, sonarless submarines, and death-ray satellites, and is housed in a concrete bunker armed by the kinds of people that only Gordon Freeman can take down, then it is no longer obscure, and all the old arguments apply.

    The professor's argument boils down to a wildebeest defence: stay in the herd and don't look weak enough to count as dinner. However, some of us aren't wildebeest.

  8. Exit Stage Right

    Nothing here to see folks, move along, move along....

    Defense in depth...

    ...make it hard(er) for someone to breach your defenses and hope they go and find easier pickings elsewhere.

  9. Mad Mike

    Quite Correct

    The prof is quite correct in what he says. Encryption per se is useless. As technology gets better, you have to use longer and longer encryption keys and more computationally intensive methods to ensure brute force can't work. This is a battle the defender will always lose, especially when dealing with items that need a long service life, such as smart meters.

    People keep missing something very important. What one piece of information does an attacker need to brute force encryption, no matter how complex? He needs a way of determining when he has cracked the encryption. If he can't work out that he's cracked it, he can't know to stop and will simply move on to the next key. So the secret with encryption is not to make the key longer, but to create data packets where it is almost impossible to determine when they have been decrypted. This, in essence, is security through obscurity, and it will work regardless of technological advances.

    The big mistake companies make all the time is to encrypt too much information in one go, which gives people the chance to tell they've succeeded by looking for words etc. If you encrypt shorter packets of information, this becomes harder. Additionally, using XML or any other standard that is primarily clear text is an issue, as it removes large numbers of permutations. For instance, if a number is held as digits, the vast majority of options are removed, since a correct decryption must be digits only. However, if it's held as binary, all options are in play.

    Too many security professionals these days use simple, thoughtless processes rather than putting themselves in the hacker's shoes and seeing it from their point of view. Stronger and stronger encryption algorithms with longer and longer keys are not the way to go. Security needs to get smarter, not simply longer and more complex.
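    The recognisability point can be demonstrated with a deliberately trivial single-byte XOR "cipher" (a toy for illustration, not a real algorithm): brute force only terminates because the attacker can score candidate plaintexts.

```python
def xor1(data: bytes, key: int) -> bytes:
    """'Encrypt' (or decrypt) by XORing every byte with one key byte."""
    return bytes(b ^ key for b in data)

def looks_like_text(candidate: bytes) -> bool:
    """Crude stop condition: entirely printable ASCII containing a space."""
    return all(32 <= b < 127 for b in candidate) and b" " in candidate

secret = b"meter reading 42"
ciphertext = xor1(secret, 0x5A)

# Try all 256 keys; the attacker only knows they have "won" when a
# candidate plaintext is recognisable. Encrypt unstructured binary
# instead and this stop condition never fires, however small the
# keyspace.
hits = [k for k in range(256) if looks_like_text(xor1(ciphertext, k))]
assert 0x5A in hits
```

    The toy keyspace is irrelevant to the point: strip the recognisable structure from the plaintext and the loop has no way to know which candidate is correct.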

  10. PyLETS

    If a system has many instances, the attacker has access

    A security system with few instances is more likely to benefit from obscurity than one with many. Kerckhoffs' principle assumes that, with many instances, it becomes inevitable the attacker will be able to acquire and reverse engineer one of them.

    So security then has to rely on key management, and the ability to preserve the security property of the system by rekeying it.

    It also makes sense for the good guys to be able to peer review the design and discover weaknesses before it is finalised, given that the bad guys can do so later anyway if a failed attempt is made to keep the system obscure. But obscurity prevents widespread peer review.

  11. Anonymous Coward

    Password Prompts

    The perfect system would then present the attacker with a password prompt and take every step necessary to present the attacker with what looks like a brilliantly designed system that uses what seems to be a 2056-bit key.

    Little does the attacker know that the password prompt is a fake. Password? There is no password.

    Now _that's_ security.

    1. DryBones


      Next day's headline, "No Password Exposes All Classified Data"

      1. Anonymous Coward

        Obviously sarcasm needs to be clearly identified as such... Noted.

  12. Anonymous Coward

    Re: As a security professional...

    But, of course, this paper is saying security by obscurity and game theory approaches are good ideas precisely to deflect attention from the real security techniques which remain obscured by the publicity over this ... then again, maybe that's what they want us to think!

  13. Chemist

    Obscurity hinders

    I rely on a complex 20 digit password for access to my SSH account.

    BUT I also use a non-standard port and only allow access to a single, very unusual username. Every bit helps, even if it just hinders most of the automated probes.
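    That layering can be expressed in a few sshd_config directives. An illustrative fragment (the port number and username are invented examples, not recommendations):

```
# /etc/ssh/sshd_config -- illustrative fragment only
Port 50022                  # non-standard port: sheds most automated probes
AllowUsers q7_maintenance   # only one deliberately unusual account may log in
MaxAuthTries 3              # and the password still has to be strong anyway
```

    None of these directives replaces the strong password; they just thin out the noise before it ever reaches the real defence.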

  14. NomNomNom

    "Pavlovic compares security to a game in which each side has incomplete information."

    Well he's certainly being OBSCURE, so that much is true. Why doesn't he tell us what game he is talking about rather than giving us useless clues?? It could be anything - an MMORPG or an FPS. In fact all games have incomplete information.

    And is it really right that these so-called "academics" are allowed to play any computer games at work on the taxpayer's dime?

    This would NEVER happen in the private sector. Imagine it - "What's that? hackers have just broken into our database and stolen all the customer passwords?! Well hang on guys let me play minecraft for a bit and develop a game theory about it."

    would NOT happen

    1. Hand1e

      If you're not trolling then you should probably get over this whole game theory thing - did you not read all the posts above?

      And not all games have incomplete information. Poker, yes (you don't know what cards your opponent has); chess and noughts and crosses, no (you know all the pieces your opponent has, what moves he can make, etc.).

      1. NomNomNom

        that's even worse. if he means poker then he's gambling too, gambling with taxpayer money to boot and all the games you mention require opponents - especially poker which might mean the whole department was at this. at least if he was playing quake no-one else is necessarily involved.

        chess is considered more acceptable because its meant to be more work than fun although its still a game and it still means at least two academics are involved now in this scandal.

        as for incomplete information you are wrong. In chess you do have incomplete information because you don't know where the other player will move or whether they will just resign. same with noughts and crosses, you don't know where the other player will move. Not even on the last turn - sometimes I've seen players just give up before the end so you can never tell

    2. J.G.Harston Silver badge

      This is either a very good joke, or you don't realise how many millions of $currency companies like Tesco, Morrisons, etc. spend on Game Theory calculations of what the other one is going to do. "If we drop prices, if they also drop prices, we'll lose money, but if they don't drop prices we'll make money, but only if we drop prices by a certain amount, which depends on whether and by how much they drop prices, and we might actually make more money by /raising/ prices, as long as they don't.... etc....."
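      The pricing stand-off described here is the textbook two-player payoff matrix. A toy Python sketch (the profit figures are invented):

```python
# Our profit (in made-up units) for each combination of moves in the
# supermarket pricing game: (our move, their move) -> our payoff.
payoff_us = {
    ("drop", "drop"): -2,   # price war: everybody loses
    ("drop", "hold"):  5,   # we undercut them and win share
    ("hold", "drop"): -5,   # they undercut us
    ("hold", "hold"):  2,   # comfortable status quo
}

def best_response(their_move: str) -> str:
    """Our profit-maximising move given the rival's move."""
    return max(("drop", "hold"), key=lambda ours: payoff_us[(ours, their_move)])

# With these numbers "drop" is the best reply either way -- a dominant
# strategy, even though ("hold", "hold") would leave both better off.
assert best_response("drop") == "drop"
assert best_response("hold") == "drop"
```

      That tension between the dominant strategy and the mutually preferable outcome is precisely what the supermarkets pay game theorists to analyse.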

      1. NomNomNom

        "you don't realise how many millions of $currency companies like Tesco, Morrisons, etc. spend on Game Theory calculations of what the other one is going to do"

        I don't understand what you mean. They buy and sell products, they aren't gaming companies. Sure they sell games is that what you mean? I don't know why people insist that by simply prefixing the word "game" with the word "theory" then gaming becomes a matter of work rather than entertainment.

        ""If we drop prices, if they also drop prices, we'll lose money, but if they don't drop prices we'll make money, but only if we drop prices by a certain amount, which depends on whether and by how much they drop prices, and we might actually make more money by /raising/ prices, as long as they don't.... etc.....""

        That's not a game. That's business. If they really treated business as a game they'd have gone bust long ago.

        1. Anonymous Coward


          Was it you who wrote the rotting dog blog?

          If yes, then you are a master of Irony, and an unspeakable troll.

          If not, then you need to read more and write less.

    3. Anonymous Coward

      I can't work it out - are you a troll or an idiot?

      1. Jean-Luc

        >troll or idiot?


      2. Anonymous Coward

        Does it


  15. Anonymous Coward

    Obscurity has a more respectable stablemate

    It's called steganography, and it is part of the security toolkit. Basically one can hide the important information within a morass of irrelevant data - very hard to crack unless you can reverse-engineer a sample. So it is used, for instance, for spy communications, where, if you get hold of a system to crack, you've already got your spy. Granted, not so good for a mass-produced item. However, consider the case of DECT phones, which managed very well, through NDAs and custom chip implementations, to keep their algorithm secret; in fact it stood for over 20 years - far longer than it would have done had its details been published. That said, the value of the protected information was also quite low, though an undetectable wire-tap would have been useful to, for instance, the NoTW.
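    The hide-in-a-morass idea can be sketched as least-significant-bit steganography over an arbitrary byte buffer. A toy Python scheme for illustration (nothing to do with the actual DECT algorithm):

```python
import os

def hide(cover: bytes, secret: bytes) -> bytearray:
    """Store each bit of `secret` in the LSB of one cover byte."""
    assert len(cover) >= 8 * len(secret), "cover too small"
    out = bytearray(cover)
    for i, byte in enumerate(secret):
        for bit in range(8):
            out[8 * i + bit] = (out[8 * i + bit] & 0xFE) | ((byte >> bit) & 1)
    return out

def extract(cover: bytes, n: int) -> bytes:
    """Recover `n` hidden bytes from the LSBs."""
    return bytes(
        sum((cover[8 * i + bit] & 1) << bit for bit in range(8))
        for i in range(n)
    )

cover = os.urandom(256)              # the "morass of irrelevant data"
stego = hide(cover, b"meet at noon")
assert extract(stego, 12) == b"meet at noon"
```

    With a random cover like this, the stego buffer is indistinguishable from the original noise, which is the point: without a sample to reverse-engineer, there is nothing obvious to attack.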

  16. Forget It

    A Jolly read

    I recommend anyone to read the article - its informal, conversational style is quite funny/alarming at times - witness this footnote:

    "This is, of course, a blatant oversimplification, as are many other statements I make. In a sense, every statement is an oversimplification of reality, abstracting away the matters deemed irrelevant. The gentle reader is invited to squint whenever any of the details that I omit do seem relevant, and add them to the picture. The shape of a forest should not change when some trees are enhanced."

  17. brakepad


    Trolling, or seriously just completely misunderstanding the concept of Game Theory?

    1. NomNomNom

      capitalizing it won't help, that's just an argument from authority.

      Explain to me in simple terms exactly what game the academic in this story was playing to devise his theory and we'll go from there.

      1. Francis Boyle

        The post is required, and must contain letters.

        -> letters

      2. Turtle_Fan


        As respectfully as possible: please get out and go look up the term on your fav search engine.

        Or better still, stick to playing games....

        (So much for not feeding the troll....)

        1. Tom 13

          @Turtle_Fan: Please no! For the love of God No!

          Those of us who both play games AND understand what is meant by Game Theory really don't want him around. His type have been cast out even by the outcasts.

  18. Hand1e

    There was no mention of whether the prof considered the advantages of transparency, namely that your algorithm can be peer-reviewed before using it in anger.

    And I can't be arsed to read the paper myself, so there.

  19. Anonymous Coward

    Different measures for different things

    Kerckhoffs' principle applies to ciphers, and makes perfect sense: the assumption that the adversary knows everything but the key allows us to build stronger ciphers than otherwise. Case in point: DVD's CSS. Of course, the NSA can, through their sheer size, provide enough knowledgeable eyeballs to debug and de-flaw their ciphers, and so don't necessarily gain from wider dissemination of their algorithms; never mind that the rest of the machinery using their ciphers does the keeping-of-secrets thing well enough that even a weak cipher isn't that much of a problem. This wouldn't work out so well for most other crypto users.

    That doesn't mean this applies elsewhere, universally. Keeping the internal phone list internal can make it a little harder for an adversary to /social engineer/ an organisation that's aware of such threats. Though most organisations by far are oblivious, and then obtaining copies of whatever you like is not that difficult.

    Or, you know, not talking AXFR with just anybody is one of those basic prudent things plenty of zone admins do, as it does basically no harm and might possibly do some good. On the other hand, just blocking ALL EBIL ICMP breaks a few important things and paints the security admin as indiscriminate and not knowledgeable. There are a couple of types you shouldn't allow, and that's been known since September 1993 or so, but there are a couple that turn out to be important as well as plain useful. I'm not afraid to have public systems reply to echo requests (within limits, of course). For the few times it enables useful traceroutes (do not shorten the name, you silly person you) it is worth the negligible risk of someone learning something useful from the reply.

    As to things like vulnerabilities, well, there's the simple "due diligence team vs. lone exploit searcher" calculation. It turns out that, with both fishing the same large-enough pool of as-yet-undiscovered exploits, the due diligence team will be fighting a losing battle, simply because they have to find at least all the holes the lone searcher does.

    Disclosing may or may not help there, depending on what your goals are, but what's most important is that plenty of companies have mishandled their reactions to reports, both public and private, so badly that even responsible parties are suffering, in that some "researchers" aren't even bothering with prior notice any longer, regardless of the vendor's or software project's track record.

    I haven't bothered to read more than the first few lines of the abstract, as I didn't like the tone, but the most important thing in the whole debate is that you have to understand just where each principle comes from and when to apply which. Otherwise the discussion will sink into yet another holy-war flame fest. I don't really need to read any reports to know it probably will anyway, which is a sad but not unexpected state of affairs in the IT security industry.

  20. Anonymous Coward

    The professor

    Is obviously an idiot, publishing just for the 'credit'. He says nothing meaningful at all if you read his BS.

    People like this in academia are a major reason for FAIL.

  21. Jacqui


    "Pavlovic argues that an attacker's logic or programming capabilities, as well as the computing resources at their disposal, might also be limited, suggesting that potential shortcomings in this area can be turned to the advantage of system defenders."

    IMHO business vs crooks is a no-win scenario, because there are far too many crooks who will gain commercial advantage for what is, to them, a relatively small outlay in cracking hardware or software security. The cost to business of trying to protect against these crooks results in product delays and added costs - the cost-benefit ratio means the crooks tend to have the upper hand for high-volume products.

  22. Graham Marsden

    "make it harder to reverse engineer"

    Harder, yes; impossible, no.

    A little story: many years ago my friends and I used to crack the copy protection on games for the BBC Micro so we could hack the code for infinite lives etc. (not to actually *copy* the games of course, because that's *theft* doncherknow...!)

    As time went on the protection got harder and more intricate, culminating in a version which Exclusive-Or-ed a bit-stream from the cassette (yes, games came on cassette tapes years ago, boys and girls!) against the timer such that any attempt to break into it would change the reading on the timer and thus render the code garbage and pressing the "Break" key would just wipe the memory.

    Of course as soon as we realised this, we figured there was a simple bypass by taking out the chip with the OS on it (yes, a chip with the *whole* OS!) copying it and re-blowing it onto an EPROM but without the code that wiped the memory.

    So we could then load the game, press Break and save the memory giving us full access to the code which we could hack to our hearts' content.

    In other words we found a flaw in Security by Obscurity which, once breached, made all the Security completely redundant.

    The moral of this story is that Security by Obscurity will make life harder for those who want to get their hands on the code, but unless you have something else in there as well, once it's breached, your code is wide open.

  23. Anonymous Coward


    Firstly, re: NomNomNom. He/she is quite clearly being silly, and does not really believe that Tesco models its business using Quake or Baldur's Gate, etc. OK, now that that is out of the way:

    Consider the case of quantum cryptography. The public knowledge about how it works, and the public demonstrations of implementing it and then cracking it, have led unequivocally to it becoming more secure, by being better understood and better implemented. When it reaches mainstream use it will be all the more secure for its public testing now.

    A research firm working in isolation could not achieve, with an 'obscure'/private security system, the robust field testing that is possible when the knowledge is public.

    I submit that security through obscurity may, in some cases, actually weaken security. Though if an obscure technique is added on top of an already existing public security system, then it may strengthen it.

  24. Anonymous Coward
    Anonymous Coward

    The argument that increased processing power can beat all forms of encryption is not necessarily correct. Quantum computing will allow many kinds of encryption to be broken, but there is also, for example, the McEliece system, which is believed to resist that kind of attack.

    Security through obscurity helps, yes. But the point of that is to make it too costly for the opponent to attack in the first place. Consider that for a second.

    Then consider the Stuxnet worm, which attacked software on one single machine, on one side of an airgap, in Iran.

    Game theory CAN be applied to computer security, but it's not necessarily efficient, as the concept of resources over networks is quite different from its real-life counterpart. That's part of the reason the concept of cyberwar being bandied about is ridiculous.

  25. Thomas_Kent

    Art of War

    Looks like someone has read Sun Tzu's treatise.
