back to article Malicious xz backdoor reveals fragility of open source

The discovery last week of a backdoor in a widely used open source compression library called xz could have been a security disaster had it not been caught by luck and atypical curiosity about latency from a Microsoft engineer. Yet the fortunate find has led industry observers to conclude not much will change to prevent this …

  1. Anonymous Coward
    Boffin

    Evan Boehs: ‘Everything I Know About the XZ Backdoor’

    2021: “JiaT75 (Jia Tan) creates their GitHub account ..”

    2022: “In April 2022, Jia Tan submits a patch via a mailing list. The patch is irrelevant, but the events that follow are. A new persona – Jigar Kumar enters, and begins pressuring for this patch to be merged. Soon after, Jigar Kumar begins pressuring Lasse Collin to add another maintainer to XZ. In the fallout, we learn a little bit about mental health in open source.”

  2. Doctor Syntax Silver badge

    There are distinctive features in the social engineering - a couple of newly created sock puppets playing good-guy/bad-guy roles alongside "Jia Tan" - and it was these accounts pushing for incorporation into distros. Perhaps some automated trawl could be used to look for a similar pattern - it might even be an area where AI could actually do something useful.

    In the meantime something needs to be done to support lone developers and maybe scrutinise maintainer handovers like this as they happen.

    1. Blackjack Silver badge

      What can be done?

      Well, give them money? Organise them in a support Mastodon instance? Gather them all in an IRC chatroom?

      1. Doctor Syntax Silver badge

        Money would undoubtedly be a start. Another would be the availability of someone paid by a foundation (I think this would be the vehicle) to take on overload, discuss problems or whatever.

        Perhaps the entire S/W world also needs to adopt the attitude that something can be feature complete and, other than bug-fixes as needed, should be left alone. A bit of encouragement towards that approach wouldn't come amiss. At present a project that hasn't had recent updates is often regarded as dead and gets denigrated, whereas it should be celebrated as not needing updates. Handing out some sort of recognition to such projects could be another line of approach. Did liblzma need to reach a version 5.x?

      2. mark l 2 Silver badge

        I daily drive Linux on my PCs and I have given monetary donations to open source projects before, but there really needs to be an organised way for all developers to get their fair share, including ones who are maintaining things such as compression libraries. As it's the bigger projects that tend to get the most attention, and therefore money, such as Gnome, GIMP, Firefox, office suites etc.

        I am not a dev, so I wouldn't even necessarily know that libraries such as XZ existed until stories like this come about, and even if I did know of them it's hard to tell who is maintaining them to donate to in the first place. It could be a lone developer or part of a larger team such as those at Red Hat.

        And realistically it would be a huge admin task for end users to have to work out who to send a donation to for all the bits of software they might use as part of a Linux distro.

        1. The man with a spanner

          The only practical way that I can see having any chance of working would be if the distro producers took it upon themselves to equitably distribute donations to the various sub-parts of the system.

          Tricky I know, but how else do you reward Mr/Mrs Unknown-but-critical developer?

          This would at least maintain the open source ethos, whilst recognising the hard work of the unsung talent from which we all benefit.

          1. Doctor Syntax Silver badge

            The distros would certainly be a channel, but it probably needs actual help as well as money. Perhaps an organisation that can take over the role of the sort of manager who keeps users off the techie's back. An organisation with the authority to look at the pressure Collin was coming under, tell the offenders to back off, and with the clout to get them thrown off whatever platform they're using to communicate if they don't. An organisation that could provide somebody to discuss tech and non-tech problems with.

        2. John Brown (no body) Silver badge

          "As it's the bigger projects that tend to get the most attention, and therefore money, such as Gnome, GIMP, Firefox, office suites etc."

          Maybe those projects, as well as the commercial distributors, ought to be taking some time to look at some of the upstream code they incorporate and checking that it's doing what it says on the tin. Not everything and not all at once, but maybe put a bit of resource into a rolling programme of examining upstream code and project dependencies, especially the more unloved but vital "lone person" projects. We already know that some "tried and tested" code ended up having bugs leaving a gaping hole for over a decade, and that PolKit one wasn't the only instance. If I had the skills and time, I'd help, but I don't, so I donate where I can.

      3. Orv Silver badge

        Money is a big issue. A lot of open source is run as hobby projects. Burnout is common. It relies really heavily on people who either feel a sense of obligation to keep going, or who believe they need stuff to point to on their resume. It represents a huge amount of unpaid labor on the part of programmers.

  3. Roland6 Silver badge

    “… reveals fragility of open source”

    It also reveals the strength of Open Source and public development.

    Remember this was uncovered because someone outside the relevant project was able to investigate, rather than simply submit a bug report.

    The investigation was crowd-assisted, as others were able to look at the audit trail, albeit we discover that not all of it is in the public domain and it is not necessarily reliable, as the history logs can be tampered with.

    This has some serious ramifications for less open development processes, and specifically for closed source development.

    Yes it reveals fragility, but it is also a weakness that can become a strength if correctly and publicly handled.

    It is also clear that developers, such as Lasse Collin, of widely used packages, should be getting some meaningful support and remuneration. Obviously, this brings us back to people actually paying real money for the privilege of using Open Source…

    1. coalabi

      Re: “… reveals fragility of open source”

      Fully agree. The problem is not open source; on the contrary. The problem is the lack of true support and contribution to open source. We are all too happy to be able to use quality open-source software, but how many of us really contribute, one way or another (this applies to a large extent to me)? I'm convinced all the profiles required to significantly reinforce the security of open source are available among open source consumers ... We should all do a bit of soul searching instead of letting anti-open-source actors denigrate the concept ...

    2. StrangerHereMyself Silver badge

      Re: “… reveals fragility of open source”

      Yes, and there are many more eyeballs looking at the source code than you may think.

      There are many outside maintainers who rely on the software for their own Linux distributions and software packages. Those maintainers will sometimes peruse the code too. I'm pretty sure maintainers at Ubuntu, Debian or other distros would've raised red flags pretty quickly if that Microsoft engineer (yes, Microsoft uses Linux too!! And in a big way!) hadn't identified it.

    3. Michael Wojcik Silver badge

      Re: “… reveals fragility of open source”

      this was uncovered because someone outside the relevant project was able to investigate

      Yes, and that's good. But since we have no way of knowing how many similar attacks haven't been uncovered, we don't know whether the cost of the modern millions-of-dependencies-of-uncertain-provenance approach to software development is repaid by the benefits of being able to examine source. It's not valid to claim this as a win for open source because we don't have enough information.

      I like open source myself, but as it's presently deployed and used in the industry, it has become a moral hazard. Having a large proportion of the systems connected to the Internet relying on a component with a bus factor of 1 as part of a critical system service is madness.

      There are going to be a lot of these Nebraska attacks. There probably already have been a lot of them.

      1. StrangerHereMyself Silver badge

        Re: “… reveals fragility of open source”

        The biggest risk is having hundreds of dependencies in a security-critical application like SSH. Dependencies should be limited to only those which have a large developer community. Single-maintainer projects should be shunned.

        Like I mentioned earlier, complexity too is a killer: it increases the attack surface and makes takeovers possible by infiltrating some lonely developer's project dependency.

        1. Claptrap314 Silver badge

          Re: “… reveals fragility of open source”

          Oh, you mean like having the init process require a complex interface that leads to downstream organizations patching in otherwise unneeded libraries that might just happen to become malicious?

      2. Max Pyat

        Re: “… reveals fragility of open source”

        Moral hazard is a good term.

        Part of the moral-hazard, in my view, is the use of open source by large commercial undertakings without adequate contribution back to the developers and community.

        The commercial use of the tools vastly increases the motivation of malign actors to compromise the software, and therefore the appropriate defensive resources that should be applied. If the commercial users don't contribute to the securing of the tools, still via open-source development, they are in fact doing worse than free-riding, since they attract attacks to the entire community without the entire community getting appropriate additional benefit.

      3. midgepad

        A moral hazard

        Is something different.

        This is a hazard.

        Is it unique to Open Source? (No).

        Is its discovery specially Open Source? (Maybe).

    4. Stoic Skeptic

      Re: “… reveals fragility of open source”

      Hear hear!

      My thoughts go back to the SolarWinds and Kaseya incidents. Something this complex would have never been found in the closed environments there.

      1. collinsl Silver badge

        Re: “… reveals fragility of open source”

        > Something this complex would have never been found in the closed environments there.

        Do you mean "never been found" as in "it would never have existed" or as in "it would never have been discovered"?

        1. doublelayer Silver badge

          Re: “… reveals fragility of open source”

          They mean that it wouldn't be discovered, since both of those are instances where poisoned code was added to the binaries produced by those companies. I don't think they're correct about that, because the exploits in both of them were eventually discovered, though only after they were released and caused havoc. Proprietary software is no guarantee that poisoned code won't get in, and open source code is no guarantee that poisoned code will be noticed before it is released. Viewing either as certainly better almost guarantees that you're not thinking the way you need to in order to prevent it happening.

  4. Graham Cobb

    Some OSS development introspection needed

    This is a timely wake-up call and needs some careful thought and discussion about the lessons to be learnt for software development.

    Of course one major thing, and not new, is that too many widely used projects are understaffed. Maintainers are overworked, can't necessarily review contributions as well as they would like, fall behind on testing and project management as well as actual code development.

    But there are also some important operating system architecture lessons to be learnt. We need to find a way to reduce the attack surface of software, particularly security critical software. Software like SSH needs a simple way to trade performance for safety. In this case we can all see, with 20-20 hindsight, that there is no way a utility package like xz should have been able to affect the operation of a critical tool like SSH.

    We need some of the best OS architects to work on that issue. For example, maybe security-critical software could trade performance for security - maybe something like using RPC and co-processes for external library calls instead of loading libraries into its own memory space. I am sure today's OS architects can come up with better ideas than this one, but it is a task that the Linux process loader and kernel teams should be working on urgently.
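    The co-process idea can be illustrated with a toy example: run the (de)compression code in a separate process and exchange data only over a pipe, so the library code never shares the caller's address space. A minimal sketch (gzip stands in for xz purely because it is present on essentially every system; the isolation principle is the same):

```shell
# Toy demonstration of library-as-co-process isolation: the
# (de)compression code runs in its own process and can only talk
# to the caller through a pipe, never through shared memory.
# gzip stands in for xz here; the principle is identical.
printf 'hello\n' | gzip | gzip --decompress
# prints: hello
```

    A real design would keep a long-lived unprivileged worker process and a narrow RPC protocol rather than shelling out per call, but the trust boundary is the same.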

    1. Phil O'Sophical Silver badge

      Re: Some OSS development introspection needed

      It highlights the same old problem, no programs (and especially not security-critical ones like SSH) should link to other code unless the code linked to has had at least the same level of review and scrutiny as the main program. This was found because someone did the due diligence, but many others don't. In both the companies I worked for where open source was used we were required to prove that we'd done such review before we were allowed to ship a product.

      1. druck Silver badge
        FAIL

        Re: Some OSS development introspection needed

        ssh doesn't link to the infected library, it was injected into the process by systemd - and people wonder why it gets such hate.

    2. keithpeter Silver badge
      Windows

      Re: Some OSS development introspection needed

      Quote from OA

      "...on a machine that deploys a backdoored xz, the SSH daemon ends up loading the poisoned library during startup, via systemd, which alters the operation of the daemon"

      My understanding (which is limited) is that the link to SSH occurred through the use of libsystemd to provide notification of restarting the sshd process. libsystemd depends on a lot of things including xz. Hence the transitive dependence of SSH on a compression library.

      I also understand that it was the distribution packagers who decided to provide the notification function by introducing the libsystemd dependency.

      I'd welcome any corrections to my understanding of this complex situation.

      An argument for less modification at the packaging stage, I agree.

      1. Graham Dawson

        Re: Some OSS development introspection needed

        This is exactly the case. This vuln was only possible because of the gigantic, sprawling attack surface that systemd provides and the unthinking reliance on its core libs that it has promoted and created throughout the ecosystem. People have tried to deflect from this, by saying that the flawed version of xz was also available on non-systemd distros, but they deliberately ignore that it was targeted at the link created between systemd and sshd, by maintainers who should have known better.

        It was a sophisticated, long term attack that was only possible because of systemd. It's exactly the scenario people have warned about for years.

        1. Graham Cobb

          Re: Some OSS development introspection needed

          ...attack that was only possible because of systemd

          I am no fan of systemd, but you are mistaken. Systemd was not, in this case, the problem. No more than the compiler, linker, library loader or anything else. If they couldn't use libsystemd as the vector they could have used a similar approach on one of the other dependencies. Maybe it would have taken more effort, maybe less. ldd tells me that sshd is linked to 28 libraries on my system.
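          For anyone who wants to reproduce that count on their own box, a minimal sketch (note that ldd prints the fully resolved set of shared objects, direct and transitive, which is exactly how the number balloons; /usr/sbin/sshd is an assumed distro-typical path):

```shell
# Count the shared objects a binary pulls in at load time.
# ldd lists the resolved closure (direct and transitive
# dependencies), i.e. everything mapped into the process.
count_deps() {
    ldd "$1" | grep -c '=>'
}

# /bin/ls is used as a universally present example; substitute
# /usr/sbin/sshd to reproduce the figure above.
count_deps /bin/ls
```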

          The attack was only possible because of the lack of tight review of all the dependencies of security-critical software, combined with a prioritisation of performance over security in library loading even for the most critical security components.

          1. keithpeter Silver badge
            Windows

            Re: Some OSS development introspection needed

            @Graham

            "The attack was only possible because of the lack of tight review of all the dependencies of security-critical software[...]"

            That was the point I was fumbling for. Thanks for the clarity.

          2. Graham Dawson

            Re: Some OSS development introspection needed

            The difference is in who does the linking, and why. Libraries linked by the sshd developers are audited by them to ensure they're compliant with requirements. libsystemd was linked by people who wanted to tie sshd into a gigantic, sprawling mess of an "init" for the convenience of notifications.

            It's all good and well saying that the attack could have come at any point, but the fact is that this vulnerability was introduced by the long arm of systemd reaching into sshd's internals, where it had absolutely no place being. THAT is the problem. THAT is the thing that people have been warning about.

            1. Graham Cobb

              Re: Some OSS development introspection needed

              the fact is that this vulnerability was introduced by the long arm of systemd reaching into sshd's internals, where it had absolutely no place being

              Exactly. And that was the fault of no-one except the Debian sshd maintainers! They didn't need to do it. Nothing in systemd forced them to do it. Many other systemd-using distributions don't change sshd to use the library. It is obvious now, with hindsight, that it was a terrible decision to weaken sshd by linking with unnecessary libraries without a careful review of the risk/reward tradeoff.

              Systemd has many problems. I don't like it. But it is not to blame for this. And repeatedly saying so just delays fixing the real problems which are:

              1) Helping maintainers of widely used packages keep them safe.

              2) Reducing the risk surface of linking external libraries into security-critical components.

              1. collinsl Silver badge

                Re: Some OSS development introspection needed

                > Exactly. And that was the fault of no-one except the Debian sshd maintainers!

                Worth noting that Red Hat are the ones who mainly maintain systemd; also, this attack did make it into Fedora Rawhide, Fedora 40 (beta) and a couple of other related distros.

                So it's not just on the Debian team here.

          3. elip

            Re: Some OSS development introspection needed

            28 dependencies you say? Sounds like you're using a system guided by a fool's philosophy (as you almost allude to); the symptom of one is including systemd by default. You keep deflecting from this fact, but the truth is, people with years of security research experience have been sounding the alarm on systemd for the better part of a decade... and here we are.

            Here's what sshd is supposed to look like on any sane OS:

            $ sudo ldd $(which sshd)
            /usr/sbin/sshd:
            Start            End              Type  Open Ref GrpRef Name
            000007a716e25000 000007a716f21000 exe   1    0   0      /usr/sbin/sshd
            000007a9a3c39000 000007a9a3e73000 rlib  0    1   0      /usr/lib/libcrypto.so.52.0
            000007a99f989000 000007a99f9a1000 rlib  0    1   0      /usr/lib/libutil.so.17.0
            000007a928576000 000007a928595000 rlib  0    1   0      /usr/lib/libz.so.7.0
            000007a97b991000 000007a97ba8a000 rlib  0    1   0      /usr/lib/libc.so.97.1
            000007aa15be8000 000007aa15be8000 ld.so 0    1   0      /usr/libexec/ld.so

      2. Citizen99

        Re: Some OSS development introspection needed

        "loading the poisoned library during startup, via systemd"

        I wonder if they also (could?) implement(ed) it for non-systemd systems ?

        Non-specialist OAP Devuan user, never going for bleeding-edge releases.

        Edit: posted before I saw other replies...

        1. Michael Wojcik Silver badge

          Re: Some OSS development introspection needed

          The key here is a transitive dependency on the poisoned object.

          liblzma was poisoned. libsystemd has a dependency on liblzma. Debian and some other distributions modify sshd so it's linked against libsystemd.
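          That chain is checkable from userspace; a hedged sketch (the sshd path is a distro-typical assumption, and on an unpatched system the grep simply finds nothing):

```shell
# Look for libsystemd or liblzma among a binary's resolved
# shared-library dependencies. On a distro-patched sshd the grep
# shows libsystemd and, because ldd resolves transitive
# dependencies too, liblzma alongside it.
check_link() {
    ldd "$1" 2>/dev/null | grep -Ei 'systemd|lzma' \
        || echo "no systemd/lzma in $1's resolved dependencies"
}

check_link /usr/sbin/sshd
```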

          The exploit hinges on finding an exploitable system service, and then among its transitive dependencies a component with a low bus factor. That's it. Besides that, the main resource it requires is patience.

          (The actual exploit code is clever, but it's not that clever. People have been experimenting with these sorts of obfuscation tricks for decades. Everyone knows "Reflections on Trusting Trust".)

    3. Anonymous Coward
      Anonymous Coward

      Re: Some OSS development introspection needed

      ... find a way to reduce the attack surface of software ...

      Well ...

      One way would be to get rid of the systemd shitload once and for all.

      All of this would not have happened if systemd had not been involved.

      1. Graham Cobb

        Re: Some OSS development introspection needed

        All of this would not have happened if systemd had not been involved.

        I think you meant to say "All of this would have happened differently if systemd had not been involved". The perps created an extremely complex and sophisticated attack based on repurposing the library loading mechanism to cause an apparently innocuous but actually malicious library to take control of a security-critical component. Given the complexity of what they achieved, I am sure that if they were unable to use libsystemd they would have just found another library as a vector for their malware.

        What we need to do is to (i) fix the development process where important software is reliant on under-resourced developers, (ii) harden the operating system to better protect security-critical components from poisoned components such as libraries.

      2. Orv Silver badge

        Re: Some OSS development introspection needed

        I'm not a huge systemd fan but I actually doubt that's true. We've had other attacks in the past that relied on poisoning shared libraries. (LD_LIBRARY_PATH, in particular, has been the source of a number of problems.) You don't need systemd to do this.

  5. thames

    Almost certainly fake names

    According to the linked blog written by Evan Boehs, the following names are all associated with this backdoor:

    • Jia Tan
    • Jigar Kumar
    • Dennis Ens
    • Hans Jansen

    Almost certainly all are simply made up names which give us no clue as to the origin of this (as noted in the story).

    As for whether this is a problem unique to open source, the same thing can happen with proprietary software by the simple expedient of buying proprietary libraries off the original vendor and then the new owner adding the necessary code. There's little chance of being discovered either, because the source code is not open to inspection by third parties.

    I believe the US has also backdoored proprietary encryption systems several times by simply paying the vendor to do so or by becoming a major investor in the company and then putting their people in charge (this happened at least once, if not twice with companies headquartered in Switzerland).

    Given the extent of outsourcing used in the software industry and the world wide nature of software development, what we have seen in the case outlined in the story is probably the best that can be hoped for.

    What we need is awareness that problems like this can happen, so that when fishy-looking things do happen, suspicions can be raised. Long term involvement in a project isn't a guarantee of trustworthiness either, as you never really know how someone will react if someone else waves enough money in his face.

    1. Spazturtle Silver badge

      Re: Almost certainly fake names

      The actor has also been involved in code relating to China's Loongson CPU. But Jia Tan is not the sort of made up name a Chinese person would make up, so the actor is probably North Korean or Russian.

      1. Androgynous Cupboard Silver badge

        Re: Almost certainly fake names

        Jia Tan is most likely a made-up name, but your other statements are a huge overreach from what we currently know. The timezone analysis, if accurate, does not point to Russia or North Korea (neither uses DST), and the Loongson connection I saw theorised late on Friday, but it was circumstantial as I recall - a gitlab page, but I can't find the link now.

        1. Spazturtle Silver badge

          Re: Almost certainly fake names

          The person used a Singapore-based VPN, judging by the IP addresses the commits came from, and people can set their computer's time zone to anything.

      2. thames

        Re: Almost certainly fake names

        A bit of googling turns up multiple people named Jia Tan in Canada, Singapore, and the UK, including two different professors at the same university (Cambridge).

        Given the effort put into this project though, it's very unlikely that "Jia Tan" is his real name. If, as suspected, this is a professionally done job, then there are possibly multiple people involved in writing it and getting it accepted, and several different people could have been "Jia Tan" (and Jigar Kumar, and Dennis Ens, and Hans Jansen) at various times. Whoever was behind it isn't going to risk having a multi-year project go down the drain just because the original person pretending to be "Jia Tan" changed jobs.

        Also, the person who wrote the malicious code likely has a professional background in writing malware, and his real name in that field may be known, or become known, putting the xz backdoor at risk if someone recognized it. It's much safer just to use a fake name that is difficult to trace.

        As for where the name came from, a possible way of getting fake names is to just copy lists of staff names from a variety of major universities in the UK, US, Canada, etc., and pick some names at random. Then google each of those names to see if other hits come up so you know that you didn't pick a unique name.

    2. heyrick Silver badge

      Re: Almost certainly fake names

      "the same thing can happen with proprietary software by the simple expedient of buying proprietary libraries off the original vendor and then the new owner adding the necessary code"

      As has happened a few times with Android apps (maybe iPhone too, I have no reference).

      Anyway, little app gets popular because it does what it does well. Some shady outfit offers the developer a pile of cash for the app (which is a lot to the dev but probably rather less than they'd gain by stealing a single euro from each one of the users), and then releases a version with some snazzy new feature (so people think that the new owners are good) and this is then followed by a poisoned update with malware included.

  6. Michael Hoffmann Silver badge
    Unhappy

    Lasse Collin suspended?

    Just digging through the various threads and post-mortems and found that Lasse Collin's own GH account got suspended!

    Is Github run by morons nowadays?!

    (still suspended as of writing this)

    1. ghp

      Re: Lasse Collin suspended?

      For some six years now: "Headquartered in California, it has been a subsidiary of Microsoft since 2018" (wikipedia).

      1. Michael Hoffmann Silver badge

        Re: Lasse Collin suspended?

        Right, how could I forget.

        One day, if I have the time, I will regale the ElReg community with the story of a failed migration of an entire major bank's IT SCM to a product called Github AE.

        Microsoft's attempt at hosting GHE in Azure. Which was so bad that Microsoft cancelled it - while we were half-way through a mass migration during holiday downtime. 6 months of work for our team and a ruined Xmas holiday season for the team in charge of the actual migration work. All for nought. Though at least the latter got a sweet chunk of OT compensation.

    2. Anonymous Coward
      Anonymous Coward

      Is Github run by morons nowadays?!

      Was there ever a time when it wasn't?

    3. StrangerHereMyself Silver badge

      Re: Lasse Collin suspended?

      Lasse sounds to me like a fictitious name also, but I may be mistaken.

      I believe Github is only being prudent while they check things out. I'm pretty sure his account will be restored once they wrap up their investigation.

      1. Michael Wojcik Silver badge

        Re: Lasse Collin suspended?

        Right. When the story initially broke, this was a sensible move; the account could have been compromised. It's not like they sent the Microsoft SWAT team to kick in his door or anything. And he can easily create a new account if he wants access to GH for some other reason (e.g. masochism).

  7. ldo Silver badge

    Would This Have Been Caught Sooner In Proprietary Software?

    Imagine the perp had got a job inside Microsoft or Apple or some such organization, and tried the same sort of thing. Would they have been caught sooner?

    1. ghp

      Re: Would This Have Been Caught Sooner In Proprietary Software?

      Why would they bother, when all they have to do is find the existing ones?

      1. Doctor Syntax Silver badge
        Joke

        Re: Would This Have Been Caught Sooner In Proprietary Software?

        Maybe it's already happening and that's how the existing ones came to be there.

        I hope the icon's relevant.

    2. Anonymous Coward
      Anonymous Coward

      Re: Would This Have Been Caught Sooner In Proprietary Software?

      It's difficult to tell, since the process would have been so entirely different and behind closed doors.

      First, getting the job and passing background checks. Then, keeping the job, which requires, well, actual work, for a prolonged period before you're put on something sensitive. Then of course there are your peers and your management, who are paid to spend their days looking at and producing code, which likely include yours too. And of course, from time to time, there are nice people from your own government who drop by for a chat with your management, with their own demands that can't be refused nor discussed publicly.

      1. Doctor Syntax Silver badge

        Re: Would This Have Been Caught Sooner In Proprietary Software?

        "which requires, well, actual work, for a prolonged period"

        No problem in this case. It was a well organised long term con.

        "Then of course there are your peers and your management, who are paid to spend their days looking at and producing code, which likely include yours too."

        Are you implying some sort of QA? That's what Microsoft's customers are for.

        1. doublelayer Silver badge

          Re: Would This Have Been Caught Sooner In Proprietary Software?

          "No problem in this case. It was a well organised long term con."

          Yes, sort of, but it was an organized one on a small tool like XZ. The attacker wasn't writing code full time to do that. They could spend a bit of time writing something useful on occasion to keep their name in everyone's head as someone who knows what they're doing while spending more time on other things. Working at a company takes more time and thus makes an attack more expensive. You also can't divide effort. Jia Tan could have been a bunch of people. One wrote some modifications, one just worked on the malware, one did the pressure campaign, and they just used the same set of GitHub accounts. You can't do that as an employee of a company because your accomplices don't have access to the internal code and giving it to them is a detectable crime which businesses already try to prevent. Not so expensive that you can't do it, but it reduces the number of attempts.

          "Are you implying some sort of QA? That's what Microsoft's customers are for."

          I don't think they were implying that. If you're writing code on a team with a lot of people, you have a lot of code reviews and a lot of changes. It makes it harder to slip something in than if you only have to slip it past one person. This is especially the case if you insert your backdoor and I, your colleague, have a feature change to the same area and end up breaking your backdoor while merging your feature with mine.

          The main reason why it's hard is that you don't get to choose your project as freely when you're working for a company. If you get a job at Apple, maybe you end up working on some part of Safari, the iMessage or FaceTime protocols, or some core OS component. You can probably put a backdoor in those. Maybe you end up working on the new headline feature they're going to announce next conference: yet another emoji thing that's not actual emoji, the sixth version now. Have fun doing anything malicious when you're writing code for a feature nobody ever uses. It's probably possible, but you don't get to pick a target and specifically add code to it, whereas targeting xz is as simple as finding where its source lives and sending a pull request.

      2. thames

        Re: Would This Have Been Caught Sooner In Proprietary Software?

        Microsoft, Google, Apple, etc. all make extensive use of outsourcing for their products. Those outsourcers in turn outsource to yet more subcontractors. All you have to do to get a backdoor into one of those is to either pay off the managers or just buy the subcontractor outright.

        The US has a history of backdooring proprietary encryption systems by simply paying off the right people or becoming investors via a shell company. It would be naive to presume that nobody else has ever thought of doing the same.

      3. druck Silver badge

        Re: Would This Have Been Caught Sooner In Proprietary Software?

        First, getting the job and passing background checks.

        You mean actually look at the name on the H-1B visa?

    3. Michael Wojcik Silver badge

      Re: Would This Have Been Caught Sooner In Proprietary Software?

      It might be fun to speculate, but it's impossible to answer with any accuracy. For one thing, there are, what, millions of organizations producing proprietary software? They aren't all identical.

      There have been cases where authorized developers introduced backdoors and other malware into proprietary software, so obviously it can happen. We don't know how many of those cases have ever been detected. Based on rates of vulnerability detection overall (see e.g. the RAND study on the subject) we can guess that there would be a wide distribution, and presumably deliberate introduction would skew toward longer time for detection, since someone's making a conscious effort to hide the vulnerability.

      On the other hand, some organizations have fairly strict regimes for things like code reviews, static analysis, and so on, which volunteer projects often lack.

    4. JoeCool Silver badge

      Not because of FOSS vs proprietary

      but because there would be people paid to do things like: review changes for security and functional issues; maintain the builds; nuke emotional blackmail posts from the discussion threads; perform independent QA.

  8. MacroRodent
    Boffin

    Complexity

    I skimmed Andres Freund's explanation of how the backdoor was surreptitiously added, and one thing that in my opinion greatly helped the bad guys is the complex build process, typical of autoconf-using builds, where it is easy to hide bad stuff among the ton of shell command snippets and obscure m4 macros.

    Defining how a piece of software is built could and should be much simpler, preferably with a purely declarative control file. Any yielding of control to arbitrary scripts is a risk.
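    One concrete check that follows from this, whatever the build system: treat release tarballs as untrusted and diff them against the tagged source in version control. The malicious xz build script shipped only in the distributed tarballs, not in the git tree, so a plain recursive diff would have flagged it. A minimal sketch in POSIX shell (the paths and tag in the usage example are illustrative):

    ```shell
    #!/bin/sh
    # Sketch: compare an unpacked release tarball against the corresponding git tag.
    # Any file that exists only in the tarball, or differs from the tag, is suspect.
    verify_release() {
        tarball_dir=$1   # directory the release tarball was unpacked into
        repo=$2          # path to a clone of the upstream repository
        tag=$3           # the release tag the tarball claims to correspond to

        ref=$(mktemp -d) || return 1
        # Export the tagged tree into a scratch directory...
        git -C "$repo" archive --format=tar "$tag" | tar -xf - -C "$ref" || return 1
        # ...and diff it against the tarball contents; non-zero exit means drift.
        diff -r "$tarball_dir" "$ref"
    }
    ```

    For example: verify_release xz-5.6.1 xz.git v5.6.1. Generated files such as configure will legitimately differ from the repository, but that is rather the point: every file that exists only in the tarball deserves a look.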

    1. Androgynous Cupboard Silver badge

      Re: Complexity

      Couldn't agree more, and I made exactly the same point on the thread on Friday. My three takeaways from this (and from Log4J, the last big supply chain attack) would be:

      1. There is virtue in simplicity. Expenditure on man hours to remove complexity from both code and process is never wasted.

      2. If you can do it yourself, do. Pulling in a large library when you only need one function means you're not only increasing your risk, but probably the library is sprawling and badly designed.

      3. Do one job, do it well. For library designers (like myself), don't be tempted to expand the scope. Keep it focused.

      Some corollaries are 4), the next time I see someone on stackoverflow say "just use library X" for a simple task I'm going to give them a sound telling off, and 5), the only projects I know of that use the M4 macro processor are autoconf and sendmail, and both should forever be associated with security failings. It's time for M4 to be retired in favour of something legible.

      1. Roland6 Silver badge

        Re: Complexity

        My initial impression is to apply Rust-style checking to the build process.

        Interestingly, my understanding is that systems written in Rust use the same build process, hence are also vulnerable to the same shenanigans.

      2. Displacement Activity

        Re: Complexity

        Also agreed, except that I have successfully used M4 standalone in the past. But this makes me think... I build everything from the ground up: C++, JavaScript, HTML, Bootstrap, jQuery if I have to, plain Makefiles, minimal dependencies. There's no way I would ever use a "framework", because there's no reason to. I'm constantly amazed that people are willing to buy into Node/Jakarta/CLI/etc ad infinitum simply because they're bright and shiny and solve some simple problem that has already been solved or could be solved again from first principles with a couple of days effort. 95% of what I see out there is just the old xkcd 15th solution.

        Or maybe I'm just getting old and stupid. Time for a coffee and snooze.

      3. Doctor Syntax Silver badge

        Re: Complexity

        "Do one job, do it well"

        And when you've done that, leave well alone - and applaud others for doing the same instead of denigrating the project for not having updates.

        1. Anonymous Coward
          Anonymous Coward

          Re: Complexity

          "Do one job, do it well"

          Exactly. +100

          How is it that these few and very wise words have ended up being (mostly) ignored by so many?

          If this continues like it is going, soon we will all be up to our nostrils in shit.


        2. heyrick Silver badge

          Re: Complexity

          "instead of denigrating the project for not having updates"

          This, so very much.

          There's a difference between a dead project, and one that is feature complete, tested, debugged, and functional. And you know what? Might even be able to look at the use history rather than the commit history to tell which is which, rather than this bullshit about "it's not been updated in X, therefore it has been abandoned".

          1. Michael Wojcik Silver badge

            Re: Complexity

            One of my work duties is notionally supporting a product which received its last feature update around the turn of the century. There were a couple of maintenance releases in the first few years after that. Since then I've occasionally answered customer questions, but that's all. The thing might not be pretty or fancy by today's standards, but for the handful of customers still running it, it simply Does The Job. I get a query about it once every few years.

            It's been running in production in this fashion for about twice as many years as it was under active development.

            That, to my mind, is good software.

          2. Anonymous Coward
            Anonymous Coward

            Re: Complexity

            Or a combination of the use history and reported-bugs history. If it's often used and has no major bugs reported, who cares if it's been updated recently? It's working properly as designed.

        3. Michael Wojcik Silver badge

          Re: Complexity

          "Do one job, do it well"

          Known in some circles as the YAGNI principle — You Ain't Gonna Need It [where the antecedent is whatever pie-in-the-sky feature someone has just dreamed up].

          It's always tempting to implement some clever idea, or add an optional behavior that someone might conceivably need. Usually better to leave it out until someone actually provides a justification, though.

  9. Missing Semicolon Silver badge
  10. Martin hepworth

    show how hard this is

    Given that this was two years in the planning and they got found in less than a month, I think this also highlights how hard this sort of thing actually is.

    It didn't help that the MITM portion consumed enough extra CPU to make it noticeable, even on modern hardware/VMs.

    1. Zolko Silver badge

      Re: show how hard this is

      THIS one was found, by sheer luck (suspicious sshd activity found during some unrelated debugging), but how many such vulnerability injections are still out there?

      1. Zolko Silver badge

        Re: show how hard this is

        What I mean is: "this was 2 years in the planning..." Do you think that people who plan such a vast operation only target ONE (1) package? 2 years in the planning and they would take chances with a single library? You can bet that there are dozens (if not hundreds) of packages variably affected.

        1. Doctor Syntax Silver badge

          Re: show how hard this is

          Imagine you were running an APT and you'd already got one exploit in place. Would you risk another? Yes, risk, because it doubles the chances of being noticed, and once that happens there'll be a search for other incidents using the same MO and you end up losing your existing one. It's not necessarily a good assumption that the same crew would have something else in place. OTOH the same APT might well be engaged on designing something that could be eased into your mobile, whatever make it might be, something like Pegasus.

  11. Tubz Silver badge
    Trollface

    I see Windows sniggering in the corner ... Linux almost owned, defended by Microsoft engineer PMSL !

    1. Doctor Syntax Silver badge

      It's a compression library. How do you know it's not used in Windows?

    2. Anonymous Coward
      Anonymous Coward

      Some flavors of Linux almost owned, versus how many confirmed-exploited Windows bugs?

      I first owned a Windows machine in 1998, and stopped having a Windows machine at home around 2010. I've used various flavors of Linux there ever since. Number of times a Windows box was compromised, despite having up-to-date antivirus: 2. Number of times a Linux box was compromised WITHOUT antivirus: 1, and it was due to a painfully obvious mistake on my part (1).

      (1) I set up an account with username as a common first name, password same as username (user to change upon first login, and they never logged in), and SSH port world-accessible. 2 years later, some script kiddie finally stumbled on it.
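      For what it's worth, that class of mistake is exactly what a few lines of sshd_config take off the table. These are standard OpenSSH server options; the account names are obviously placeholders, and this is a sketch, not a complete hardening guide:

      ```
      # /etc/ssh/sshd_config (excerpt)
      PasswordAuthentication no    # key-based auth only; a guessable password becomes useless
      PermitRootLogin no           # no direct root logins over the network
      AllowUsers alice bob         # explicit allow-list of login accounts
      MaxAuthTries 3               # cut down brute-force attempts per connection
      ```

      Run sshd -t after editing to syntax-check the file before restarting the daemon.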

      1. Roland6 Silver badge

        Given how many servers and appliances run Linux as opposed to Windows, this attack shows that Linux is sufficiently important now to warrant some serious attention.

        As Linux desktop installs increase, I anticipate we will see more Linux (desktop) exploits. So whilst users of Linux can rest relatively easy for now, I suggest you take this as a warning not to be complacent…

        1. Anonymous Coward
          Anonymous Coward

          The machine exploited above was a server, with public-facing SMTP, HTTP, HTTPS, and SSH ports. The number of exploit attempts was staggering (thousands to tens of thousands per day), despite being a server that almost nobody knew existed. Despite that, the one and only successful exploit was the one I explained above, which was clearly my fault instead of a Linux bug.

          Compared to at least one of the two Windows infections, caused by a dodgy ad on an otherwise legitimate website, which allowed the full system to be hosed, despite active and up-to-date antivirus. (I can't remember how the other one happened, not sure I ever knew the source, but it may have been from installing dodgy software.)

  12. naive

    ......reveals fragility of open source

    As if Cisco, Apple, MS, AWS, Google and other US-based vendors do not get regular emails from three-letter agencies with code they have to insert into their stuff.

    I wish people would stop with this brain-dead FUD about Open Source.

    We know 100% for sure the code of mainstream US based vendors is riddled with back doors, allowing authorities to protect the children.

    With Open Source we can check the code, and can verify which outgoing traffic it sends. Nobody bothers to check the gigabytes W10 or W11 sends to MS.

    Maybe this reveal is a counter-intelligence strike: the FSB (ex-KGB) found out the CIA had done this and published it to show the finger, or vice versa... who knows.

    1. Sandtitz Silver badge
      Stop

      Re: ......reveals fragility of open source

      "We know 100% for sure"

      No we don't.

      "we can check the code"

      We sure can, but how many are adept enough to understand the code? Some advanced maths in the latest ciphers, or what have you. Perhaps you can read code, but can you prove it does exactly what it says on the label and cannot even inadvertently be used in nefarious ways? How long would it take to fully decipher something like the xz code in this case? Is it ANSI C or one of the later versions of C? Or some other language like Fortran? With no comments. With thousands of LOC for obsolete platforms (OpenSSL). Who has the time for this ongoing job?

      How many libraries and other essential code packages are there when a typical Linux distro is compiled, and how many of them are eyeballed for every change?

      Which OSS code have you checked most recently?

      "FSB (KGB) found out CIA had done this and published it to show finger, or vice versa.. who knows"

      This is nonsensical. It was published by a Microsoft employee.

      1. Michael Wojcik Silver badge

        Re: ......reveals fragility of open source

        Once again, OP's handle checks out.

  13. StrangerHereMyself Silver badge

    Scary

    It looks like these campaigns are well thought out, long running and well funded. Most likely there were several different individuals involved, active in related projects, which could've given them access to more important security-critical pieces, like SSH.

    Maintainers shouldn't allow anyone to become a co-maintainer (with write access to the repository) without having met this individual in person and established their identity. For security-critical projects and their dependencies, maybe government intelligence agencies should be queried to establish the identity of someone requesting access. Yes, software is THAT critical these days.

    We caught this one in time and negated years of work of the adversary, but we may not be so lucky next time. Another thing that struck me is that critical software like SSH has become too bloated and has too many avenues for intrusion. I mean they were using some function callback which could replace the authentication code for SSH on the fly?! Software should be simple and stable. Adding tons of hardly ever used features increases the attack surface and makes it more prone to hijacking.

    This incident does illustrate the power of publicly viewable source code and many eyeballs making it difficult to insert backdoors into software.

    1. Graham Cobb

      Re: Scary

      "We caught this one in time and negated years of work of the adversary."

      Yeeessss... sort of...

      We appear to have negated years of work on one particular infection vector. Given that this was years of work, it is extremely unlikely it was a single person, and it is unlikely this was their only bet. Someone was paying their salary and possibly paying a whole team. The person (or the agency they work for) is unlikely to have made their bet just on one approach, which could have been noticed at any time over the last couple of years.

      Who is doing reviews of all the other projects which have had complex, obscure changes which look nothing to do with security but no one really quite understands? I mean XZ for goodness sake??? Who would ever have imagined that could cause every up-to-date Debian Testing system on the internet to be open for root logins for a while? How many more compromises are there out there? We have always assumed the US, Russia and the Chinese each have a horde of vulnerabilities which they can use (and then burn) in case of major war. Was this one of those? Or was it the Norks or Israel or the Iranians wanting their own?

      Who is checking all the obscure libraries used in kernels or security-critical processes by the proprietary vendors (Microsoft, Google, etc)?

    2. thames

      Re: Scary

      StrangerHereMyself said: "For security critical projects and its dependencies maybe government intelligence agencies should be queried to establish the identity of someone requesting access. "

      Er, some government intelligence agency is at the head of the list of suspects in this case. I'm not sure what asking them if one of their employees or contractors should be trusted is supposed to accomplish.

  14. vincent himpe

    Run linux they said...

    It's very secure they said...

    You can see all the source code so you can check for yourself they said...

    Unfortunately so can every miscreant on the planet. That bit they left out...

    Since the repositories are open it's relatively easy for anyone to modify it. That bit they completely overlooked.

    Opening the source is putting the cat next to the milk. Yes you can keep watching the cat...

    Allowing anyone to meddle with it, is spoon feeding the cat.

    I'm all for open source, live and learn, but lock down the modifications hard. In order to be allowed to make a change, require a traceable and verifiable ID, plus lots and lots of other barriers. Better check it's your own cat...

    1. whitepines
      Facepalm

      Re: Run linux they said...

      Oh please. Ever hear of Hex-Rays and similar tools? Any person or team with the level of skill needed to pull this off could just as easily change a small bit of assembler in some commercial binary and try to poison the well, in fact it would probably be easier since white hats in general aren't decompiling gigabytes of commercial binaries on a regular basis.

      The only thing that would stop this type of attack is the same thing that would stop the open source attack: namely checking what you are deploying (which is effectively what the Microsoft engineer did), signing what you choose to trust, and having the system check that signature. Open source *in general* has a lower time to detection and a similar overall attack surface for this sort of thing, but far more importantly it gives the end user much more control over other types of malware such as forced data slurpage or sudden removal of key features that are being relied on.

      Say it with me folks: Security by Obscurity is NOT security! Hiding source code and only shipping binary components is Security by Obscurity *by definition*!
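      Checking what you deploy doesn't need anything exotic; even plain checksum pinning with coreutils catches a swapped artifact. A minimal sketch (file names are illustrative, and in real use the manifest itself should be signed):

      ```shell
      #!/bin/sh
      # Pin the artifacts you reviewed, then refuse anything that drifts.

      record_hashes() {
          # Run once, at review time, over the artifacts you have decided to trust.
          sha256sum "$@" > trusted.sha256
      }

      check_hashes() {
          # Run at deploy time: exits non-zero if any pinned file changed or vanished.
          sha256sum -c --quiet trusted.sha256
      }
      ```

      The obvious gap is that an attacker who can replace the tarball may be able to replace trusted.sha256 too, which is why in practice the manifest gets a detached signature (gpg --detach-sign trusted.sha256) and the public key travels by another channel.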

      1. doublelayer Silver badge

        Re: Run linux they said...

        I think you're both wrong. They're wrong when they claim that easy modification makes open source worse. You're wrong when you say this:

        "Oh please. Ever hear of Hex-Rays and similar tools? Any person or team with the level of skill needed to pull this off could just as easily change a small bit of assembler in some commercial binary and try to poison the well, in fact it would probably be easier since white hats in general aren't decompiling gigabytes of commercial binaries on a regular basis."

        No, that wouldn't be enough. That gets your exploit in. It is not as easy as putting it in as source code, but you can definitely do it. Now you have a poisoned binary and you do what with it? Unless you somehow manage to replace the canonical one with yours, it's not getting installed everywhere. I can make a poisoned version of Windows, but if I can't put it on Microsoft's servers, it's not getting installed for the general public. This attack had the chance of working because and only because they got their backdoor into the canonical version of the xz source, the one that gets compiled and put into repositories. Putting it into a fork and then waiting for someone to install that fork would do very little. Doing the same to proprietary software isn't any more effective.

  15. jaypyahoo

    Open source is not the problem, but FOSS projects should follow a security-focused development model like OpenBSD's.

  16. hh121

    I don't think this was a money problem (not that I have anything against that being resolved). This looks more like a verified-user problem, followed by a who's-validating-their-output problem (or who's qualified to, at all), followed by a complex ecosystem of packages.

    Some rando on the interwebs can get into the chain and what's to stop them wreaking havoc? Damn right there's a chance there are other instances of this out there.

    The big corps like MS, Oracle, Cisco etc might not be perfect (or even close), but they'd be slightly more aware of who their employees are and who did what, although it probably wouldn't take much to compromise that avenue too. Maybe all they've got is better traceability...

    1. hh121

      I am curious about the down-votes...like how would paying devs for their contributions have helped in this scenario? And how do you know who you can trust as a contributor? Or who you can trust putting together yet another distro (which I've flagged before on other threads), let alone the chain of packages that may or may not get included, SystemD or otherwise? They can't even figure out who this contributor is (are) or where they are, let alone whether they can be trusted (not)...seems to be a pretty fundamental problem in the whole community approach to me. Considering the hoops I have to jump through to get a bank account or phone service, perhaps the bar to entry is (way) too low for something with this level of impact.

    2. collinsl Silver badge

      > Some rando on the interwebs can get into the chain

      They can fork the code and make their own version, sure, but they can't force people to use the new code in their own projects or replace the one the original maintainers are providing.

      In order to alter the existing code base they'd have to get the existing maintainers to accept their branch or merge request etc, which no sane maintainer would do unless they've reviewed the code & verified that they're happy with it.

      1. hh121

        I completely agree with your logic, but from my point of view the maintainers are randos in the interwebs as well, let alone the volume of submissions and dependencies they are presumably dealing with. Given the number of these things it seems like a bigger issue. https://en.m.wikipedia.org/wiki/List_of_Linux_distributions

        But even if you accept that the primary maintainers are the good guys, how many more weak points are there with a package maintained by one person who can be socially engineered off it? Or a package submitted by a baddie that was initially perfectly clean and good, but where they have a longer game in mind. It's still all a trust thing.

  17. Anonymous Coward
    Anonymous Coward

    It seems to me that one of the things that made xz a good candidate for this kind of attempt is the type of library it is (compress/decompress).

    Binary "test files" are somehow more reasonable to include than they would be in other libraries, and almost impossible to code review (although there might have been some other flags that could have been tripped, I don't know).

    Perhaps this could help in classifying certain kinds of open source packages as "high risk" - ones with binary files and/or complicated-looking build macros should definitely be gone through with a fine-toothed comb.
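    That triage can even be partly automated: git marks binary changes with "-" counts in its --numstat output, so a window of history can be scanned for commits that touch binary blobs. A rough sketch in shell (the commit range is whatever window you want to audit):

    ```shell
    #!/bin/sh
    # List files that changed as binary blobs in a given commit range.
    # git prints "-<TAB>-<TAB>path" in --numstat output for binary changes,
    # so we just filter for that pattern.
    binary_changes() {
        repo=$1    # path to the repository
        range=$2   # e.g. v5.4.0..v5.6.1 or HEAD~20..HEAD
        git -C "$repo" diff --numstat "$range" | awk '$1 == "-" && $2 == "-" { print $3 }'
    }
    ```

    Flagging a file proves nothing by itself; real test corpora are legitimately binary. It simply tells a reviewer exactly where line-by-line review stops working and extra scrutiny, such as regenerating the file from a documented recipe, has to start.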

  18. bobd64

    Not just open source

    This took a three-year-long social engineering attack. This was almost certainly a state actor.

    They will be making just as much effort to get their backdoors into commercial software. Just how well vetted are the employees and contractors working in commercial development? The attack will be different, but just as much effort will be applied.

    1. Anonymous Coward
      Anonymous Coward

      Re: Not just open source

      Yep, I said in another forum that the amount of co-ordination and planning to steadily and stealthily put together the pieces over such an extended period of time, plus run interference against detection, called for a significant amount of resources, time and money.

      1. Claptrap314 Silver badge

        Re: Not just open source

        And brainpower. While in theory this sort of thing has been well known at least since "Reflections on Trusting Trust" (as mentioned above), the work of actually identifying an exploitable weak link was itself something many (most?) would fail to do.

  19. midgepad

    Many eyes...

    And one found it, and mentioned it.

    Whether any found it and stayed quiet, banked it, we don't know. If so, they've been frustrated, also.

    There's a paradox about finding such a fault, attack, crime etc.

    Once found and announced, it has been found. It can't be found for the first time again, and yet we are told that its being found by only one person shows the system isn't working.

  20. MSArm

    Delicious irony

    That a Microsoft employee found this looking at open source code. That's one in the eye for all the Linux fanbois.

    +1 to Microsoft
