Open sourcerers say suspected xz-style attacks continue to target maintainers

Open source groups are warning the community about a wave of ongoing attacks targeting project maintainers similar to those that led to the recent attempted backdooring of a core Linux library. Higher-ups at the OpenJS Foundation and Open Source Security Foundation (OpenSSF) believe the attempt to plant a backdoor into Linux's …

  1. Brewster's Angle Grinder Silver badge

    "Interactions that create self-doubt, feelings of inadequacy, of not doing enough for the project, etc. might be part of a social engineering attack."

    Have you had any experience working on open source projects? Lots of people are professional. But there are far more requests meeting the above criteria than can be reasonably explained as supply chain attacks.

    1. Throatwarbler Mangrove Silver badge

      May it be time for, dare I say it, nerds to learn social skills?

      1. doublelayer Silver badge

        It's not even just the other developers. End users can be pretty demanding as well. It doesn't take that many of them for those doing the work to start getting annoyed at their users and withdrawing from user-focused channels. Nor is it at all unique to open source software, but there's a bit better of an argument for customers of a company to feel they deserve the attention of that company's staff. I don't think we'll ever find a solution to that, but I'm all ears if someone else has one.

      2. sabroni Silver badge

        re: May it be time for, dare I say it, nerds to learn social skills?

        Judging by the downvotes I think they need to work on their sense of humour first.

      3. MyffyW Silver badge

        Isn't a nerd with social skills actually a geek? [says the self-identifying girl-geek]

        1. devin3782

          Nah it's decided purely on whether or not they've played D&D

  2. StrangerHereMyself Silver badge


    We shouldn't depend on some security critical component made by a lonely maintainer somewhere. These components should all be transferred to larger open-source organizations like Apache, Debian, Canonical or the Linux Foundation.

    All open-source software is vulnerable since you're running code on someone's machine. Even a simple clock application could be backdoored to provide remote access. And this could be done covertly if the build process is altered. The source code would still look pristine.

    1. Gene Cash Silver badge

      Re: Dependency

      > Apache, Debian, Canonical or the Linux Foundation

      F*ck that. Those people suck. I might work with Apache, but the other 3 are dicks.

      1. Anonymous Coward

        Re: Dependency

        Log4j is an Apache project…

    2. R Soul Silver badge

      Re: Dependency

      "We shouldn't depend on some security critical component made by a lonely maintainer somewhere. These components should all be transferred to larger open-source organizations like Apache, Debian, Canonical or the Linux Foundation."

      [citation needed]

      Where's the proof any of the above could be better? Three of your choices are poisoning the world with the cancerous trainwreck and gaping security hole known as systemd. IMO, they're unfit to be trusted with anything.

      By all means take security critical software away from a lonely maintainer, assuming there is such software in that position. Though there needs to be great care taken over who the replacement is: resourcing, skills/experience, testing procedures, audits, transparency, stability, openness, etc.

      1. doublelayer Silver badge

        Re: Dependency

        Just checking, I see why you blame Debian for adopting systemd. I can even see why you include Canonical, even though their use of systemd is mostly based on Debian doing so. But which of the other two are you blaming for it and why?

        There's another question of why choosing to use systemd makes them ineligible to do anything else in your opinion, but I'm not sure it's a discussion for which we'll find a common starting point.

        1. boblongii

          Re: Dependency

          The issue here is that Debian did not create systemd but approved its use even though the fundamental security problems in the design - not just the implementation - were there for all to see. That brings into question the value of their judgement as to what constitutes a safe system in general.

        2. R Soul Silver badge

          Re: Dependency

          Since you seem blind to the appalling defects and fundamental flaws in the systemd POS, I agree we won't find a common starting point for a discussion.

          Your question is answered in boblongii's post above. It's so good it deserves repeating in full: The issue here is that Debian did not create systemd but approved its use even though the fundamental security problems in the design - not just the implementation - were there for all to see. That brings into question the value of their judgement as to what constitutes a safe system in general.

          Come to think of it, that posting should be included in every article about systemd and the rag-bag of satan's little helpers who enable that almighty POS.

          1. doublelayer Silver badge

            Re: Dependency

            I note that you didn't answer my first and larger question: which of Apache or the Linux Foundation are you also blaming and why?

            I also note that I said nothing about my opinions of systemd, but you have jumped to conclusions that I have no objections, then jumped to another conclusion that I "seem blind to appalling defects and fundamental flaws". I may oppose systemd's use as well, but by deciding based on no evidence, you confirm that we cannot actually discuss the related but not identical issue of whether choosing to use it should be a blight on an entire organization rather than, for example, the group that chose it alone (many Debian developers work on something unrelated). If you insist on seeing any questions as opposing points you didn't even make, a discussion seems precluded. You would still be capable of answering question number 1, though.

            1. R Soul Silver badge

              Re: Dependency

              "which of Apache or the Linux Foundation are you also blaming and why?"

              The Linux Foundation, obviously. They're systemd fanbois. Or at best they're not doing anything of significance to stop that POS from spreading. Apache isn't tainted by systemd AFAIK. [Well, not yet. I suppose it's only a question of time until everything in the open source world gets infected with the systemd cancer.] Mind you, Apache's record on security engineering leaves room for improvement: log4j, for instance.

              Your earlier posting seemed to defend Canonical and debian for their systemd love. That suggested to me you were a systemd fanboi. If you're not, I apologise for that false assumption. If you are, that apology is withdrawn. Either way, I see no need to continue this discussion.

      2. ldo

        Re: Three of your choices are poisoning the world

        This is why Open Source is all about choice.

    3. tinpinion

      Re: Dependency

      It would be trivial for larger open-source organizations to create stable downstream versions of these codebases. I'm pretty sure quite a few large open-source-using companies (Google, Facebook, Microsoft) maintain internal forks of open-source software that they rely on. Upstream can continue putting out new and exciting changes, and a stable version which only receives reviewed updates can be maintained by someone with a bit more heft.

      You could even package those stable versions together and make it easy for users to install them through some kind of software installing application. Since they're stable versions, you could even precompile the code and skip both downloading the source and compiling it! Like, hear me out here: you could run a command like... app-get xz-utils, and it would just install a reviewed copy of xz. You could manage your entire environment with a tool like that!

      In order for this all to work, however, the open-source organizations would need to actually review changes rather than simply precompiling and distributing them as they come in. Maybe we should grant the management of all open-source software to Microsoft instead. They're super-interested in open-source and even own GitHub!

    4. Rapier

      Re: Dependency

      Hell no. First off, you would be creating a system where open source work is stifled. Without the imprimatur of one of these groups, no one will even look at alternatives. This creates lock-in and single points of failure. Second, these groups lack the resources to adopt the vast array of necessary open source projects, meaning they'd still rely on the original maintainers and volunteers to do all the work.

      OSS devs need to adopt better supply chain security measures but they also need support from their users. By support I mean money. Corporations, businesses, and people that are dependent on OSS dev efforts should be willing to materially contribute to those efforts.

  3. Someone Else Silver badge

    From the article:

    Suspected attackers were trying to get themselves added as project maintainers to "address any critical vulnerabilities," but didn't provide details on what these vulnerabilities were, which already sounds fishy.

    Connor, I saw what you did there...

  4. Locomotion69

    Bottom line: in the end it is the human that cannot be trusted.

    1. Will Godfrey Silver badge

      True... but which human?

  5. Claptrap314 Silver badge

    The deeper issues

    1) A substantial part of the supply chain risk is the sheer volume of code that gets pulled in. When our (very) modest website pulls in 50K packages, who can possibly argue with a straight face that the code is secure?

    2) Another core issue is that very few programmers understand even the most basic matters of security.

    Combine these, and what do you get? "Hey, I heard good things about package X. We can deliver the code 50% faster if we use it." And nobody asks what is happening to the attack surface of the project when that package gets brought in.

    And the package maintainer of X is doing the exact same thing, because he also doesn't understand security.

    Security, like simplicity, is *hard*. And, not fun. And, few people praise you for doing that instead of adding a feature.

    At least with open source, there is a recovery path when things go horribly wrong.

    1. theOtherJT Silver badge

      Re: The deeper issues

      It's particularly hard to get this point across to developers who have never had to work in restricted environments.

      A lot of us commentards are probably old enough to remember how precious every Kb of memory was and that you absolutely had to get your code to fit in X bytes at run time and that it could only be Y bytes on disk because there just wasn't enough space to do anything else. If your program wouldn't run in 640k of ram, then it wouldn't run, period. If it wouldn't fit on an 800K floppy, then you couldn't ship it because splitting it over two disks would massively increase production cost.

      That's just not how a lot of current developers have ever had to work. They graduated from their software engineering degrees at a time when memory and disk space were already effectively infinite and now have a decade of career experience behind them that has only reinforced that fact - especially if they've been working "in the cloud" all that time, where you can just keep pulling more and more resource to make up for inefficient code.

      In that world no one thinks it's weird that a single application might have 50 dependencies, which each have their own dependencies, or that the driver that changes the colour of lights on your mouse is 150M for some reason.

      1. Anonymous Coward

        Re: The deeper issues

        The deeper issue might be resolved by a "shallow" requirement limiting the total depth of dependencies in a project. For example, three levels: project, middleware, primitive. This still allows for common libraries. If enough projects followed that criterion, open source libraries would adapt in order to survive. It would help if execs at corporations that consume open source were pushed into issuing top-down orders to follow such guidelines, e.g. via increased liability for security breaches if they don't - i.e. punished shareholder returns.
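        As a rough illustration of the idea (the package names and the dependency graph here are entirely hypothetical), a build step could compute the longest dependency chain and fail if it exceeds three levels:

```python
# Hypothetical "shallow dependency" check: given a map of
# package -> direct dependencies, fail the build if any chain
# is deeper than three levels (project -> middleware -> primitive).

def max_depth(graph, root):
    """Length of the longest dependency chain starting at root."""
    seen = set()

    def walk(node):
        if node in seen:  # ignore cycles
            return 0
        seen.add(node)
        depth = 1 + max((walk(d) for d in graph.get(node, [])), default=0)
        seen.discard(node)
        return depth

    return walk(root)

deps = {
    "myproject": ["webframework", "logger"],
    "webframework": ["http-parser"],
    "http-parser": [],
    "logger": [],
}

LIMIT = 3
chain = max_depth(deps, "myproject")
assert chain <= LIMIT, f"dependency chain too deep: {chain} > {LIMIT}"
print(chain)  # 3
```

        A check like this is only a sketch, of course - as the reply below notes, real dependency trees accrete layers over years, and a hard limit is easier to state than to live with.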

        1. Claptrap314 Silver badge

          Re: The deeper issues

          That's not a bad idea, in theory. In practice? "Every problem in computer science can be solved by adding a layer of abstraction--except for too many layers of abstraction".

          When I'm working as a SWE (and not as a SWE in Ops), I am constantly pulling out code into modules (as a Rubyist, that means "gems") to be used in other pieces of work. After a few years of that, your internal systems will end up with more than three layers. And every SWE supporting a library you are using is doing the same thing.

          I credit you for coming up with the idea. But in practice, such a rule really doesn't get there. Which is a horrible shame.

    2. BinkyTheMagicPaperclip Silver badge

      Re: The deeper issues

      It's true that security as an overall discipline is hard, but some things are really easy; the important issue is that businesses and developers don't like the answer.

      Pulling in 50K packages, any of which could have an exploit? Your entire design is broken. Want to fix it? You should be obtaining - with Actual Real Money, or developer code review of each new change - a set of libraries that are very likely to be secure. These are the only libraries that you use. That also means, as a logical consequence, you're probably not following the bleeding edge, as developer time to review everything is expensive and slow.

      You're *definitely* not automatically pulling the latest version of things from the public web for builds or systems, either[1]; it's all hosted locally. If the design 'needs' to automatically pull the latest version of a component to build then, again, it is fundamentally broken.

      So the question is how secure things need to be, and where money is involved the natural answer is 'not very'.

      I realise this becomes difficult or expensive, especially when open source is involved, but it is abundantly clear most people only care about speed of development and security is a vague afterthought.

      [1] Unless it's SaaS, which is certified and, again, you are paying for it to be maintained and secure
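      A minimal sketch of one piece of that discipline, assuming a simple requirements-style file format (the package names and the hash below are made up): refuse a build unless every dependency is pinned to an exact, hash-checked version, so nothing can silently pull "latest" from the public web.

```python
# Sketch: reject any dependency line that is not pinned to an exact
# version with a content hash. File format, package names, and the
# hash value are hypothetical.

import re

PINNED = re.compile(r"^[A-Za-z0-9._-]+==\S+\s+--hash=sha256:[0-9a-f]{64}$")

def unpinned_lines(text):
    """Return requirement lines that float or lack a hash."""
    bad = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if not PINNED.match(line):
            bad.append(line)
    return bad

requirements = f"""\
# good: exact version plus content hash
requests==2.31.0 --hash=sha256:{'deadbeef' * 8}
# bad: floating version, fetched fresh on every build
somepackage>=1.0
"""

print(unpinned_lines(requirements))  # ['somepackage>=1.0']
```

      None of this reviews the pinned code for you, obviously - it just stops the build from quietly tracking upstream while nobody is looking.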

  6. Anonymous Coward

    New maintainer criteria

    "A new maintainer's code being intentionally obfuscated or difficult to understand could be a giveaway that they're trying to hide something without being detected."

    I would like to think that, before a new maintainer is added, their previous submissions are reviewed for readability, style, etc., to make sure they know what they are doing before trusting them to accept others' code in future.

    1. doublelayer Silver badge

      Re: New maintainer criteria

      I think that meant to say that the maintainer, before being added, had submitted clean code and was therefore accepted. You still need to review the changes they make after being added as a maintainer to see if they switch. It's easy to write cleanly when you're not hiding something and are just getting yourself into position to submit something dangerous, and it's impossible to distinguish between someone doing that and someone just trying to submit good code until the moment they submit something bad.

  7. Anonymous Coward

    The xz issue was found by sheer luck, by one developer who decided to look into a performance issue. With this report saying the issue of bad actors weaseling into projects has been observed happening elsewhere, would it be safe to assume that other projects have already been compromised but nobody has noticed? Nation state bad actors don't need to find vulnerabilities if they are planting them.

    The "many eyes make secure code" theory only works if any eyes are actually looking at the code. I am guessing that most people using these projects don't look at them, since they assume others are looking at them to find problems. Since this is not something that thrills most people and doesn't pay the bills, in most cases it just isn't happening. Yet people assume it is. You are then down to the "one guy in Nebraska" maintainer to find problems in his spare time.

    It has been a while since I was a developer. While I would use code I found online, I would read what it does first and implement it in my own code. The thought of just pulling in random libraries along with all their dependencies fills me with horror.
