Apple, Microsoft, PayPal among 35 organizations compromised by evil twin dependencies attack

Bug hunter Alex Birsan last year managed to compromise the software supply chain of 35 companies by exploiting packaging mechanisms used by JavaScript, Python, and Ruby developers. In a write-up posted on Tuesday, Birsan recounted how he managed to distribute proof-of-concept malicious code through the npm Registry, PyPI ( …

  1. RM Myers
    FAIL

    What a massive fail

    And running a script seems like a bandage on a bandage. The dependency on outside packages, libraries, and blindly copied code has really gotten out of hand in recent years.

    1. Aitor 1

      Re: What a massive fail

      It is absurd to allow external repositories on dev machines.

      Point the devs to an internal repo.

      1. Anonymous Coward
        Joke

        Re: What a massive fail

        > Point the devs to an internal repo.

        That's no use! You need to point the devs to a StackOverflow article that tells them how to point to an internal repo...

      2. Stuart Castle Silver badge

        Re: What a massive fail

        But, but, Cloud! The beancounters absolutely don't want us running on internal systems because it costs.

        1. Anonymous Coward
          Anonymous Coward

          Re: What a massive fail

          Coming from the SysAdmin world before I started Dev, I can say hosting the repos internally makes no difference, for several reasons:

          1. You still have to trust the upstream repository at some point, so unless you're planning on writing every library yourself the risk still exists.

          2. The rate at which bug and security fixes come into public packages makes it just as much of a risk to host an internal mirror and have to wait for "approval" to update those packages, meaning we're back to the original issue - ship vulnerable code while waiting for package updates on the internal mirror, or write your own libraries..... which also isn't always possible, depending on the vendor you work with.

          3. In the case of Rubygems at least, ALWAYS EXPLICITLY specify where to download the private package from rather than assuming that Rubygems/NPM/PyPI/etc will figure it out for you.
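
          A Gemfile sketch of what I mean - URL and gem names hypothetical, this is the shape, not a drop-in config:

            # Gemfile
            source "https://rubygems.org"

            # private gems pinned to an explicit source block, so Bundler
            # never falls back to the public index for them
            source "https://gems.internal.example.com" do
              gem "mycorp-billing"
              gem "mycorp-auth"
            end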

          1. Claptrap314 Silver badge

            Re: What a massive fail

            Sure--if you are running a pure mirror, there is almost no real gain. See my reply below as to exactly what & how to get real gains.

            1. Anonymous Coward
              Anonymous Coward

              Re: What a massive fail

              What you suggest requires more time and effort than most companies have available. I know my team doesn't have time to maintain that, nor do our IT or (internal) Infrastructure teams.

              Additionally, the primary issue in the findings from these hacks was conflicting/duplicate names in private packages that match public packages, combined with teams lazily not defining explicitly where to get their private packages from. Your solution doesn't solve that problem; if anything it just leads to dev teams relying more on local dev environments than on the build system. There are hundreds of tools now that allow a developer to set up their own dev environment without EVER needing admin access to their local system.

              Finally, companies that currently operate by giving their developers business requirements and letting those teams make their own determinations about what tools, libraries, languages, etc to use would lose a ton of developers if they tried to lock that down, as we do not like being told HOW we have to do our job.

              1. Claptrap314 Silver badge

                Re: What a massive fail

                Okay, I'll bite. What proposal do you have to ensure that, say, a ruby shop is able to continue to do its builds the next time that rubygems gets taken down? When there is another semver failure?

                Devs must be able to explore. That requirement does not imply that the build process should be vulnerable to nonsense.

                Yes, there is a cost to everything. And I specifically called out that smaller shops might not be able to do a full review of every source that they use. But not even having the source? Irresponsible.

    2. Falmari Silver badge

      Re: What a massive fail

      Not my area, JavaScript, Node.js etc, but grabbing a new version of a lib from somewhere at build time all seems risky.

      I work in C# .NET, C++ and C, and yes, we sometimes use external open-source libs, nowadays most likely obtained through NuGet. If one is needed, the dev will download it and code with it; if it does what they want and is approved*, it will be added to source control. From then on, until it needs to be updated, that is the version used in the build process. What is shipped is every file that is needed and an installer of some sort.

      Because we control when a new version is used, we can test it; also, if any bugs are found, the changed lib is a place to investigate.

      This approach worked well for us in the past, but now that we have open-source audit software (Black Duck) we get constant nagging about not using the latest version.

      * No open-source code can be used unless it has been approved by the aptly named open-source committee.

      1. Natalie Gritpants Jr

        Re: What a massive fail

        You are getting constant nagging because those older versions you have copied and used internally may have security problems which are now known to the bad guys.

        1. Falmari Silver badge

          Re: What a massive fail

          Oh yes, some have, but it flags up everything that has a later version.

  2. Tom 38

    With python, it's dead simple - you push your private packages to a private pypi repository (or pay someone to host it for you, eg artifactory), and configure pip to install preferentially from that repository, with the private repository falling back to the public pypi only when the package is not found locally.

    Here is the stack overflow Q answering that :D
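
    Something like this in pip.conf, for instance - internal URL hypothetical, a sketch rather than a drop-in config:

      # ~/.pip/pip.conf (pip.ini on Windows)
      [global]
      # an Artifactory "virtual" PyPI repo: serves your private packages
      # and proxies anything it doesn't have from the public pypi
      index-url = https://artifactory.example.com/artifactory/api/pypi/pypi-virtual/simple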

    1. Kevin McMurtrie Silver badge
      FAIL

      That's exactly how the companies got hacked. Developers and/or build systems were tricked into using an impostor library from a public source. All it took was bumping up the version number in the impostor so the "better" one was chosen.

      I know it's a hassle to maintain a private repo of public code, but that's what's needed for security.
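
      Roughly what happened, with names and versions illustrative:

        # internal index:  mycorp-foo 1.2.0
        # public index:    mycorp-foo 99.99.99   <- attacker's impostor
        # a client that consults both indexes takes the highest version
        # it can see, so the impostor wins:
        pip install mycorp-foo    # resolves to 99.99.99 from the public index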

      1. Adrian 4

        And for reliability. Your tests were performed using some version of imported code. If you rebuild using another version, those tests are invalid. Just because you're using a later version with supposed bugfixes doesn't mean that there aren't new bugs, or that the fixes don't trigger latent bugs in your code.

        You need to know when you import a new version, and repeat your testing when it happens. Until that point your code is made less reliable by importing a bugfix, not more reliable.

        1. Tom 38

          > And for reliability. Your tests were performed using some version of imported code. If you rebuild using another version, those tests are invalid.

          Might want to read about lockfiles.
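
          For plain pip, one way to get the same effect (illustrative):

            pip freeze > requirements.lock              # record the exact versions you tested against
            pip install --no-deps -r requirements.lock  # rebuild with exactly those versions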

      2. Tom 38

        Mmm, I have to disagree, at least for python/pypi/artifactory.

        If I have published a package mycorp-foo to artifactory, and I ask artifactory for mycorp-foo, artifactory will respond with the versions that are published on artifactory. It won't say anything about any mycorp-foo packages that are published to pypi, since the package exists locally.

    2. Anonymous Coward
      Anonymous Coward

      Can pip not be explicitly configured to get a package from a specific location? That's what we do with our private Rubygems: pull public ones from Rubygems, while private ones are given an explicit URL, and it errors if it can't find the library at the specified, private location.
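
      For pip, it looks like a direct reference in a requirements file does much the same - URL hypothetical:

        # requirements.txt
        mycorp-foo @ https://pypi.internal.example.com/packages/mycorp_foo-1.2.0-py3-none-any.whl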

      1. DevOpsTimothyC

        There's still a bunch of issues involved with that - think NPM's left-pad. What a number of places do is proxy ALL requests through the internal archive. That way you keep a copy of public libraries, so you maintain business continuity and avoid left-pad type issues.

        The problem is that it does not protect from this sort of attack, as you're still querying one repo, and because most devs don't want to deal with security issues they will just set the deps to be the latest version, or the latest within a specific release, eg 2.x is fine.
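
        E.g. the kind of loose pin I mean, package name illustrative:

          # package.json - a caret range silently accepts any future 2.x release
          "dependencies": { "some-lib": "^2.0.0" }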

        1. Anonymous Coward
          Anonymous Coward

          "The problem is that it does not protect from thos sort of attack as you're still querrying one repo and because most dev's don't want to deal with security issues they will just set the deps to be the latest version, or the latest within a specific release eg 2.x is fine."

          I don't know what devs you work with, but in my experience in both FOSS and proprietary software, we don't ever set our libraries to use "latest" for any reason. Not even containers.

          As for using the "latest within a specific release, eg 2.x" - that's the entire point of SemVer. Major versions are breaking changes, minor are new features, patch are bug/security fixes. I can't speak to other environments, but the rate at which we receive security alerts from upstream libs, we fix them by updating or we are vulnerable. There's risk either way, unless you build everything yourself.
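
          To spell the convention out, version numbers illustrative:

            # SemVer: MAJOR.MINOR.PATCH
            2.4.1 -> 2.4.2   patch: bug/security fixes only
            2.4.1 -> 2.5.0   minor: new features, backwards compatible
            2.4.1 -> 3.0.0   major: breaking changes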

  3. Lorribot

    Most devs have local admin rights or their tools don't work

    It's easy: you do this, this, this and this, and twiddle this setting over there, and it's all good. Unfortunately most devs are too lazy or ignorant of consequences. This should all be default and the devs should have no option, because they can't change the settings because they are not a local admin.......

    Bad people rely on good people being lazy and ignorant and make a lot of money from that.

    1. amanfromMars 1 Silver badge

      Re: Most devs have local admin rights or their tools don't work

      Bad people rely on good people being lazy and ignorant and make a lot of money from that. .... Lorribot

      If that be true, Lorribot, I wouldn't like to be relying on good people remaining forever lazy and ignorant/apathetic and undereducated, whenever nowadays it is so easy to be kept busy and to learn so much more than was ever imagined possible before, about virtually anything and everything.... with the secrets and tricks that deliver something from nothing exposed for exploitation and employment/edutainment and enjoyment.

      And you have World Wide Webs to thank for that, both deep and dark, heavenly and enlightened.

      And some systems are driving themselves bonkers and to rapid catastrophic destruction doing what little they only can do, failing to stop the information and intelligence being more generally widely known and universally available.

    2. LateAgain

      Re: Most devs have local admin rights or their tools don't work

      And my personal pet hate - Windows programs that HAVE to run as an administrator for no reason at all.

      1. Anonymous Coward
        Anonymous Coward

        Re: Most devs have local admin rights or their tools don't work

        I'd like to know how anyone would stop devs setting up their local envs. They'd have to have no write capabilities on their systems. I don't install system-level packages for development anymore - most of them are too old for the business requirements. So user-level tools like asdf are used to pull down (in my team's case) multiple versions of python, ruby, golang, node, etc into our user space, not at the system level.

        Meaning there is no way to prevent me from doing it even on the most secure system without taking away my ability to do development at all.
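
        For example, all in user space, versions illustrative:

          asdf plugin add python
          asdf install python 3.11.4
          asdf local python 3.11.4   # pins this project to that interpreter - no admin rights involved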

  4. Anonymous Coward
    Anonymous Coward

    Lack of authentication....

    .... and here we are again.... if you don't know who the source is, you can't trust what comes from that source. People think encryption is all they need to be secure - but without authentication you are not.

    With proper authentication reliable sources can be pinned and accepted - and any attempt to spoof the source would become evident.

    1. needmorehare

      Exactly, fixing this is easy...

      Just make sure the data is digitally signed twice and the connection encrypted.

      Firstly, each individual file must be signed by the legitimate upstream developer and checked for the presence of that signature, or the files would be considered tampered with. Pinning could transparently occur upon the first installation of any given package and can be self-signed. This would also allow for tamper-proofing of the application itself post-install. For reference, Microsoft has encouraged this practice since Windows 98 with Authenticode. It's embarrassing that modern solutions don't incorporate the basics.

      Secondly, the package archive should be signed by the repository owner, so that files can be safely mirrored without risk. Certificates should be manually added to the keychain, just like the situation with RPM/DEB packages on Linux distributions. This has been common for well over a decade in the Linux community, so there's no excuse not to implement and enforce this.
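
      By way of example, that Linux-style flow with RPM - URL hypothetical:

        # pin the repository owner's key once, then verify each package archive
        rpm --import https://repo.example.com/RPM-GPG-KEY-example
        rpm --checksig some-package-1.2-3.noarch.rpm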

      Thirdly, all connections should be TLS-secured with a trusted certificate.

      At this point, the developer's legitimate source would have to be poisoned, as well as the trusted repository accepting the changes. Companies like Red Hat, Google and Microsoft could work together to provide a vetted set of releases and charge third parties a very tiny fee for the privilege of access... everyone would be happy.

  5. Anonymous Coward
    Anonymous Coward

    This sounds fairly illegal

    It's one thing to look for bugs, another thing to deliberately infect a company...

    Also - he was caught doing this, so technically his "attack" didn't work.

    1. John Robson Silver badge

      Re: This sounds fairly illegal

      Caught *once* out of how many attacks?

  6. Version 1.0 Silver badge
    Meh

    See curation

    This is just modern coding practice: convenience is a very high priority while security is something that just gets thought about later - the software management bosses tell the coders that the apps can always be updated later, so it's not necessary to check everything now.

    I just checked my phone - it's OK, only five apps need updating this morning, it was ten yesterday.

  7. Pascal Monett Silver badge

    "there's been a 430 per cent increase in upstream software supply chain attacks"

    Of course there has been - it works.

    And, on top of that, you can reap extensive rewards by getting automatically inserted into all of that company's customers' networks as well.

    It's a blackhat dream come true.

    All that because, for the past twenty years, developers have gotten into the habit of not bothering to check what they're putting on their production servers.

    I don't care how much you trust that Github repo, you do not put code on production servers that has not been vetted.

    Once again, lessons are going to be learned the hard way.

    1. ThatOne Silver badge
      Devil

      Re: "there's been a 430 per cent increase in upstream software supply chain attacks"

      Learned? You're sure about that?

    2. Anonymous Coward
      Anonymous Coward

      Re: "there's been a 430 per cent increase in upstream software supply chain attacks"

      Vetted by whom and how? The dependency chain in many of these cases is so large that the business would never get anything to market waiting for all the code to be vetted. And by the time it was vetted, there'd be a whole new slew of security patches and bug fixes to vet or risk putting............. wait for it.......... vulnerable code in production.

  8. Claptrap314 Silver badge

    As I was JUST saying...

    By the numbers, again...

    1) The build system never has access to the internet. It reads source from a file server completely under the control of your company. Anything else, and you are adding a needless availability risk to your build process. Never mind the security issue.

    2) The server hosting the source files used by the build server does not run a true mirror of external sources. What the build server sees when it asks for the list of versions of package foo is precisely what the business has decided it should see. Did someone have a hissy fit and delete a package from the public sources? Goody for them, but you keep your copy, thank you very much. Did someone break semver with their release? You've got that release, but you block it until you decide what to do about it. Has there been a point version update? You have it, but your build system cannot see it until you've done a review. Is there a version out there where an actively exploited CVE is out and the fix breaks your stuff? If the exploit actually cannot be realized against your installation, we can enable the "bad" version and keep running while making the changes to support the fix.
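
    If you can't do that at the mirror, a consumer-side approximation of the same "block the bad version" policy looks like this - names and versions illustrative:

      # constraints.txt, applied with: pip install -c constraints.txt -r requirements.txt
      foo >=1.4, !=1.6.2   # 1.6.2 broke semver; blocked until reviewed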

    EVEN IF your company is unwilling to spend the money to review all of the point versions, if your system has the capability, then at least you have the ability to retroactively block bad stuff and to keep what you need to stay in business.

    Notice that I've not mentioned anything about what the devs have access to. Indeed, it is the job of the dev to check out the New Shiny, even if it *gasp* involves Somebody Else's Code. Getting approval to put the New Shiny into the build process, however, should require sanity checking.

  9. Anonymous Coward
    Anonymous Coward

    Ahh, the typical human fucktardery I've come to expect on this rock full of stupid. I would say it is an "unforeseen" consequence, but one has to look before it can be declared "unforeseen".

    We’ll just call it a consequence of ignorance
