Security is an architectural issue: Why the principles of zero trust and least privilege matter so much right now

I’ve been interested in architecture – of the physical building variety, as distinct from computer or network architecture – for as long as I can remember. So I was pretty excited when I got to work in a Frank-Gehry-designed building at MIT in the late 2000s. As it turns out, the building is something of a case study in the …

  1. Anonymous Coward

    Excellent...

    ...but do people (like the NSA, GCHQ) actually support this approach?

    Central policy, distributed implementation... it all sounds fine until you start to wonder about the powerful organisations out there who will want to subvert both the policy and the implementation... you know: bad actors, pornography, "our right to snoop" and so on.

    "Backdoors", legal or not, implemented in "standard security software" as well as in encryption... all in our future!

    1. Robert Carnegie Silver badge

      So

      You have open specifications and implementations of security to address this.

      ...once you get past having government-issued security WITH a government backdoor (Clipper), and the treatment of effective security products as restricted munitions whose export was a crime. The thing with PGP, for instance. And also, knowing about any of this.

  2. Cliffwilliams44 Silver badge

    The concepts of Zero Trust and least-privilege access (LPA) have been around for a long time. They are not the purview of the network hardware or software per se, but really the purview of people! When people, i.e. users, have access to things they really should not, that is when you get into serious security issues. Companies have their entire company-wide data set encrypted by malware because one user got compromised! When access is requested, the question should always be "Why do you need it?" If the answer is "I need it because I'm important!", it should be squashed with the biggest hammer you can find!

    Also, the beginning of this article made me cringe! It sounded like he was advocating that some over-arching security system be built into the fabric of the internet! I was glad to see that was not the focus of the article. The last thing we need is some faceless organization or government holding the keys to internet security. And also holding the crypto keys to all our communications. Big Brother, ya know!

  3. Anonymous Coward

    Gehry-designed Stata Center

    That building pictured in the article looks like someone folded the blueprints in half instead of rolling them up.

  4. Claptrap314 Silver badge

    Zero trust was new in name only

    First of all, if you did not have deny all as the first line in your firewall in 2008, you were an idiot.

    Secondly, while the US military formally reserved Eyes Only for TS material, it was well known that obtaining even Confidential information that you did not need was going to result in unpleasantries.

    But after-the-fact & bolt-on security is really poor practice. See u$ if you want a really...colorful demonstration.

    For instance, the use of DNS in intranets was horrible practice before the SolarWinds breach. Now it is inexcusable. But how many companies have even started the conversation to fix that?

    Security at every layer, and every stage, by architecture, is the only way to even hope to get this right.

    It's not cheap, though. Guess what that means.

    1. Potemkine! Silver badge

      Re: Zero trust was new in name only

      the use of DNS in intranets was horrible practice

      Question: how do you resolve names on intranet without a DNS?

      1. Claptrap314 Silver badge

        Re: Zero trust was new in name only

        DNS is a really good solution for two independent organizations to cooperate. Providers point their data towards the root, and consumers start their search from the root. But for an intranet? That's a completely different problem.

        You don't need DNS. You need to know where to go to fulfill your dependencies. When that data is pushed to you, there is no traffic on the network spent checking that it is still there, and you are not limited in how fast updates can take effect.

        You control the entire environment. There are many ways to do it. Here's what comes to my mind (a rough code sketch follows the list):

        1) When an app is deployed, it gets two command line arguments: a) the list of IP addresses & ports to hit for its config information and b) its identifier. It looks for a working address and does a GET on /config_me?identifier. /status returns REQUEST_CONFIG.

        2) The config server looks up the identifier and confirms that the request is coming from the expected IP address. It then does a POST to /config on the app with the entire config object, including all of the address/port combinations for the various dependencies. /status on the app now returns CONFIG.

        3) When local config is complete, /status returns SEARCHING. When enough dependency connections report OKAY, /status returns OKAY.

        4) When the controller observes OKAY on the app, it does a PUT to /config/dependencies on every app that has this one as a dependency, adding the new server & port to the list.

        Any changes to the network affecting the dependencies of an app are communicated to the app via PUTs to /config/dependencies with the relevant changes.
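
        For concreteness, here is a rough, runnable sketch of that handshake, with the HTTP endpoints (/config_me, /config, /status, /config/dependencies) modelled as in-process Python method calls so it fits in one file. Every name, address and port is made up for illustration; it is not taken from any real system, and a real deployment would obviously put these calls behind authenticated HTTP.

        # Sketch of the push-configuration flow described above. Endpoints are
        # modelled as methods; all identifiers and addresses are illustrative.

        class App:
            """A deployed service: starts unconfigured, gets its config pushed to it."""

            def __init__(self, identifier: str):
                self.identifier = identifier
                self.config = None
                self.dependencies = []          # (host, port) pairs this app talks to
                self.state = "REQUEST_CONFIG"   # what GET /status would return

            def status(self) -> str:                               # GET /status
                return self.state

            def receive_config(self, config: dict) -> None:        # POST /config
                self.config = config
                self.dependencies = list(config.get("dependencies", []))
                self.state = "CONFIG"           # config received, local setup running
                self._finish_local_setup()

            def _finish_local_setup(self) -> None:
                self.state = "SEARCHING"        # local config done, probing dependencies
                # A real app would now try each dependency; assume they all answer.
                self.state = "OKAY"

            def update_dependencies(self, changes: list) -> None:  # PUT /config/dependencies
                self.dependencies.extend(changes)


        class ConfigServer:
            """The controller: knows every app's config and who depends on whom."""

            def __init__(self):
                self.registry = {}   # identifier -> (expected address, config dict)
                self.apps = {}       # identifier -> App handle, for pushing updates

            def register(self, identifier, expected_addr, config):
                self.registry[identifier] = (expected_addr, config)

            def handle_config_me(self, app, caller_addr):           # GET /config_me?id
                expected_addr, config = self.registry[app.identifier]
                if caller_addr != expected_addr:
                    raise PermissionError("request not from the expected address")
                self.apps[app.identifier] = app
                app.receive_config(config)       # step 2: controller pushes the config

            def announce(self, new_identifier, new_endpoint):
                # Step 4: update every app that lists the newcomer as a dependency.
                for ident, (_, config) in self.registry.items():
                    if new_identifier in config.get("depends_on", []) and ident in self.apps:
                        self.apps[ident].update_dependencies([new_endpoint])


        controller = ConfigServer()
        controller.register("billing", "10.0.0.7",
                            {"dependencies": [("10.0.0.9", 5432)], "depends_on": ["ledger"]})

        billing = App("billing")
        print(billing.status())                           # REQUEST_CONFIG
        controller.handle_config_me(billing, "10.0.0.7")  # steps 1 and 2
        print(billing.status())                           # OKAY
        controller.announce("ledger", ("10.0.0.12", 7000))
        print(billing.dependencies)                       # now includes the ledger endpoint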

        1. Paul Hovnanian Silver badge

          Re: Zero trust was new in name only

          "Any changes to the network affecting the dependencies of an app are communicated to the app via PUTs to /config/dependencies with the relevant changes."

          This assumes that you want the people moving network cables/DHCP services around in the network closet to have that sort of access to your app configuration server. Sometimes we were lucky to get them to make the DNS entries correctly.

  5. swm

    ARPANET Security

    When Xerox in Rochester, NY added a node to the ARPANET, I logged into the bridge machine and noticed several daemons running (accessible from the ARPANET): telnet, who, etc. I asked if these were really necessary and, if not, to stop running them. "They" thought about this and stopped the daemons.

  6. Yes Me Silver badge

    Buzzword-based networking

    Oops, I meant to refer to "intent-based networking" but it seems to have been auto-corrected. Please point us to the standard for "intent".

    That said, yes, starting out by trusting everybody to be well-behaved isn't a good idea, but requiring everybody to jump through security hoops every 5 minutes is such a bad idea that no money-making service provider will ever do it.

    If I'm not mistaken, zero trust and least privilege really originated in MULTICS, although described differently in those days. You have to ask why they didn't catch on 50 years ago. I think the answer is the same as why most people don't use 2FA most of the time. It's just too much hassle. I'm not optimistic.

    1. Chris Miller

      Re: Buzzword-based networking

      It's not simply the hassle, it's the time and money costs of security, too. "Security", for most people/businesses, is "that which prevents me from doing my job/prevents us making money". There are organisations where absolute maximum security is a valid goal - mostly in government, where the inevitable inefficiencies matter less - but for commercial organisations security is always (and should always be) a trade-off. The first question is always "how much security do we need (or can we afford)?"

      The argument of the security professional is: "if you think security is expensive, try having a breach". Our role is to help organisations identify threats and the appropriate mitigation measures (which in some cases may be "do nothing").

  7. IceC0ld

    from a personal perspective, it seems to me that the future of internet access WILL be a two-tier beast, with corporate / enterprise / govt / military actually implementing a full-on SDN security model, and the basic home user being left to play catch-up. and this is not necessarily a bad thing; the worst user is the one who thinks they know something, and will not listen to thoughts on 'improving' their persec.

    the best users will most likely be people like our good selves: we work in an environment where security is THE basis for all actions, and we will take that to heart, and hopefully there will be software / hardware available to allow us to implement some sort of security by default at home too

    it will take time, but I feel that, as we have already had almost 40 years of the old way, maybe, just maybe, the appetite to make things safer is here, and hopefully the C-level people will loosen the purse strings as they finally come to realise that IT is no longer a necessary evil, but is in fact the cornerstone of ALL any company can ever hope to be

    as I said, from a personal perspective, I have only been in IT since 2003, had 30 years as an electrician prior to that, so I AM old, and I fear I will not actually be around to witness the great change, I CAN, however, hope :o)

  8. Anonymous Coward

    "approaches such as Google’s BeyondCorp in which there is no concept of a perimeter, but every system access is controlled by strict authentication and authorization procedures."

    I think you'll find that once you break into their SAML server, you will be able to become anyone and access everything, everywhere.
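
    To make the worry concrete: in any single-sign-on scheme of that shape, whoever holds the identity provider's signing key can mint an assertion naming any user, and every relying service that trusts the matching public key will accept it. The sketch below is a deliberately simplified stand-in (a JSON blob plus RSA signature rather than real SAML XML), with made-up names, just to show the shape of the problem.

    # Minimal illustration (not real SAML) of why the IdP signing key is the
    # crown jewels: holding it lets you assert any identity you like.
    import json, time
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    idp_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    def mint_assertion(subject: str, audience: str):
        """Sign a claim that `subject` authenticated; any subject will do."""
        claim = json.dumps({"sub": subject, "aud": audience,
                            "iat": int(time.time())}).encode()
        sig = idp_key.sign(claim, padding.PKCS1v15(), hashes.SHA256())
        return claim, sig

    def relying_party_accepts(claim, sig, idp_public_key) -> bool:
        """A relying service only checks the signature against the IdP key."""
        try:
            idp_public_key.verify(sig, claim, padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False

    claim, sig = mint_assertion("anyone-at-all@example.com", "https://internal.app")
    print(relying_party_accepts(claim, sig, idp_key.public_key()))  # True: "become anyone"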

  9. Duncan Macdonald

    Nice idea but...

    A common requirement for many jobs is to be able to look up something on the internet. (Examples - where can I get widget X, where is the parts diagram for item Y, where is the recipe for food Z.) By their very nature these queries have no predefined list of nodes with the required information - a query might start with Google then branch to a list of suppliers (quite possibly including eBay and Amazon). Some larger firms might be able to afford the costs of staff members having two computers (one (secure) on the internal network and one (insecure) on a separate network) - smaller firms have to allow some staff members general internet access at work - this requires that those computers have the best affordable protection software.

  10. Paul Crawford Silver badge

    Oddly, with the limited number of IPv4 addresses we ended up with NAT as the default for home routers and most small businesses, which automatically made "default deny" the standard for incoming connections. Of course that only lasted until we had UPnP breaking it for any dodgy software running on the user's PC, or the design goal of IPv6 offering access by default to every device in existence.

    And this highlights one flaw in the idea of authenticated access to the network: as soon as someone's PC (or other device) is compromised it gets their access credentials - and often that compromise now arrives via things the user pulls in (email or web-site malware) - so it can do the same to everything they have access to. So while such network rules might help reduce a free-for-all on the LAN, it really is not dealing with your typical ransomware attack on small businesses or home users. For that they need an immutable copy of important files, and some means to wipe and re-install the machine(s) impacted. The cloud-based accounts on offer promise this, but at what cost in ongoing expense and in privacy?

  11. Anonymous Coward

    And here I was thinking that Zero Trust...

    ...was just a marketing buzzword.

    And I thought that Firewalls were just another approach to authentication. And Security Groups was just another word for Firewalls.

    The problem isn't that the Internet doesn't have lots of Security pixie-dust built in. The problem is that few people take the time to secure things because it's hard work. Zero Trust (and anything else) won't fix the human laziness problem.

  12. JBowler

    100% agreement, 100% doubt

    Yes. Oft repeated, never learned. It's the same as the message in object-oriented programming: objects have their own accessors, which limit what can be accessed - slightly modified with capabilities, to use the term I learned years ago. To access a method you have to have the appropriate credentials.

    In real human interaction this is the ultimate bureaucracy, yet in the control of machines it is simply a reasonable approach to ensuring they don't stamp our own fingers with the word "pass" - or do, depending on your point of view.
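
    A tiny sketch of that capability idea, with entirely made-up names and no particular framework in mind: the caller can only invoke an operation by presenting an unguessable token granted for that specific operation, rather than merely knowing the object exists.

    # Capability-style access in miniature: holding a token for one operation
    # does not grant any other operation (least privilege at the object level).
    import secrets

    class Document:
        def __init__(self, text: str):
            self._text = text                 # reachable only via the accessors below
            self._grants = {}                 # token -> operation it permits

        def grant(self, operation: str) -> str:
            """Mint a capability for a single operation ('read' or 'append')."""
            token = secrets.token_hex(16)     # unguessable, hence effectively unforgeable
            self._grants[token] = operation
            return token

        def read(self, token: str) -> str:
            if self._grants.get(token) != "read":
                raise PermissionError("no read capability presented")
            return self._text

        def append(self, token: str, more: str) -> None:
            if self._grants.get(token) != "append":
                raise PermissionError("no append capability presented")
            self._text += more

    doc = Document("quarterly numbers")
    read_cap = doc.grant("read")
    print(doc.read(read_cap))                 # allowed: the holder has a read capability
    try:
        doc.append(read_cap, " plus edits")   # a read capability does not confer append
    except PermissionError as err:
        print(err)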
