'Trust no one' is good enough for the X Files but not for software devs: How do you use third-party libs and stay secure, experts mull on stage

In a chilly conference room at San Francisco's Hyatt Regency on Monday, legal and digital security pros convened at USENIX's Enigma conference to hold forth on security, privacy, and related matters. Following a discussion panel on encrypted messaging, the talk turned to mitigating the risks that come with using third- …

  1. tony2heads

    Unless

    you write the OS, compilers and libraries you have to trust someone.

    It is easiest to check if the codebase is open to inspection.

    1. Charles 9

      Re: Unless

      Even then, what about the chips they run on?

      1. Claptrap314 Silver badge

        Re: Unless

        In the Western world, at least, the number of people involved in architecting, designing and validating microprocessors (at least, at AMD, IBM, and Intel) is simply too large for any deliberate back dooring of these chips to be hidden.

        If you think that the software security folks sound like extremists on this account, try talking to the hardware folks.

        The engineers are very, very serious about meeting the specs, including the published security models. (And the same goes for at least the bottom couple of layers of management.)

        I've seen a line stoppage caused by a power line coupling to an address line. The tolerances in the system are too tight. They have a hard enough time meeting that target without trying to slip in some idiotic back door system.

        1. stiine Silver badge
          Unhappy

          Re: Unless

          Please explain Intel's use of Minix, since they were one of your examples above.

          Also, how can we, the consumers, tell the difference between malicious code and simply bad code? Because to us, there's not really any difference.

          1. Claptrap314 Silver badge

            Re: Unless

            Minix? Is that one of those high-level things you call an operating system? We were discussing slipping surveillance capabilities into a CPU. I make no representations regarding software.

        2. Charles 9

          Re: Unless

          I would think that would make it easier, not harder, as a sneaky saboteur can just hide among the noise, concealing things by scattering them among more-useful features so the exploit only emerges as a gestalt. I mean, it's not like there's someone poring over every little connection, is there? Probably not, given the scram example you cited that slipped through. You say, "Many eyes...", I say, "Too many cooks..."

          1. Claptrap314 Silver badge

            Re: Unless

            Remember that analyst report a few months back about a motherboard manufacturer supposedly slipping in a chip the size of a grain of rice onto a motherboard that could snoop traffic & phone it home? EVERYONE who had worked at that level knew it was crocked.

            This fear is another iteration of that same problem. Internally, microprocessors are almost exactly like motherboards, except that the scales and relative sizes of the components are different. All of the problems regarding bus size, bandwidth, performance, and, yes, validation apply. You try to do something sneaky, and you are going to kill the performance of that part of the system. You are going to add load to the power and clock lines.

            Even if there is one "senior architect" for the project who theoretically could modify the spec appropriately, they do not work on their own. They have a team that they work with--and EVERYONE on that team is going to see the big, flashing, "SEEKRIT--DON'T LOOK" sign. The permanent damage to the company's reputation when word gets out is too much for even an SVP to ignore.

            Notice I mentioned the folks that design their own chips. An SoC vendor typically uses macros from another supplier. Those macros are black boxes.

  2. Doctor Syntax Silver badge

    "And as long as the license of a certain module allows for distribution, we will hold on to the contents so that even if they get deleted, they will still be available for you to build."

    But what if the author wishes to revoke a particular version because of a newly discovered vulnerability?

    1. Throatwarbler Mangrove Silver badge
      Holmes

      Deprecate the old version and release the new one with release notes indicating why you shouldn't use the old one?

      1. Anonymous Coward
        Anonymous Coward

        Which promptly get ignored and still leave you in potential legal trouble. What next?

        1. bombastic bob Silver badge
          Devil

          make sure the license *SPECIFICALLY* relieves the author from ANY liability

          "AS IS" etc.

          1. Anonymous Coward
            Anonymous Coward

            They sue anyway under "Fit for Purpose" laws which can override an "as is" declaration?

    2. Christian Berger

      There actually was a talk about this problem at the 36c3. The proposed solution was to mark your library, "Geek-Code"-style, to indicate whether you consider it fit for use in security-critical things.

      A typical example would be a crypto library someone started because they wanted to experiment with it. Of course one could use it for serious things, but since it wasn't meant for that, there could be serious issues. Nevertheless, releasing such code may be beneficial to some as a demonstration.

      1. Korev Silver badge

        Would it end up like the situation with medical software & lab instruments, which have "Not for diagnostic purpose" written all over them so the vendor can avoid the hassle of getting them "Validated"?

        1. Paul Cooper

          Or navigation software that displays a "Not for Navigation" notice!

      2. Claptrap314 Silver badge

        Don't even do that. You can publish it without licensing. And by that, I mean you explicitly state in the LICENSE file that personal approval is required to run the code.

        If you state that you are doing it because you don't trust the code, and you don't want it running without a proper security review, it will be clear what kind of person you are...

  3. Christian Berger

    It's actually not that hard

    Although you can never be 100% safe, you can always lower your risk by lowering your dependencies.

    For example, if you have a simple list, using XML or JSON adds complexity without providing value. If you use simple delimiter-separated files, you can often use standard library features to parse such a list.

    Beware of environments where adding a new dependency is simple. Adding a dependency is a potentially dangerous thing to do; think before you do it, and think before pulling in code that adds new dependencies of its own.
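
    To make that concrete, a minimal sketch in Python using only the standard library; the file name and the two-column layout are hypothetical:

    # Parse a simple tab-delimited list (name<TAB>count per line) with nothing
    # but the standard library. "inventory.tsv" is a made-up example file.
    rows = []
    with open("inventory.tsv", encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:
                continue  # skip blank lines
            name, count = line.split("\t")  # malformed lines raise ValueError
            rows.append((name, int(count)))
    print(rows)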

    1. Loyal Commenter Silver badge

      Re: It's actually not that hard

      If you use simple delimiter separated files you can often use standard library features to parse such a list.

      Are you suggesting using CSVs because they are "standard"?

      OK then, why don't you show me the standards document for how a "delimiter separated list" should be structured. How does this handle quoting? How does it handle escapes? Which "standard library features" are used to parse these files? Are they platform / implementation specific?

      XML, on the other hand, whilst being relatively heavyweight, has the concept of a schema, so you can define the structure of the data, and has escaping.

      JSON is lighter-weight, and is well standardised, including quoting and escaping. It's also relatively human-readable (arguably more so than a simple delimited file).

      Each has its own advantages, and drawbacks, and each is the right tool for certain jobs. To someone who only owns a hammer, everything looks like a nail...

      1. Christian Berger

        Re: It's actually not that hard

        "Are you suggesting using CSVs because they are "standard"?"

        No, of course not. I'm suggesting that because in 99% of cases it can be done in a very simple way. Often you don't need the ability to have the delimiter character in your data fields; you can simply replace it with another character or reject that input as invalid.

        For example, if you just have numerical values, scanf can easily read them for you. With slightly more effort it can also read space-delimited columns of strings.

        Even if you need arbitrary data, there are far simpler ways than the "Windows CSV". Just use no quoting and add an escape character. That way your parser only needs to read the input character by character and only has two modes: the first is the normal mode, the second is the "after escape character" mode (a sketch follows at the end of this comment).

        One of the worst examples of how you can mess up a simple format is probably the "Windows CSV", which adds things like quoting, which makes parsing very hard.

        XML and JSON may have their advantages for complex and dynamic data structures. However, one rarely needs that. Relying on standards is not always a good idea, particularly when using a separate library takes more code than an implementation of your own parser would.

        A good summary of the state of the art is here:

        http://www.catb.org/~esr/writings/taoup/html/ch05s02.html
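
        A rough sketch of that two-mode parser in Python; the choice of ';' as separator and '\' as escape character is arbitrary here:

        # Two modes only: "normal" and "after escape character".
        def parse_record(line, sep=";", esc="\\"):
            fields, current, escaped = [], [], False
            for ch in line:
                if escaped:              # mode 2: take this character literally
                    current.append(ch)
                    escaped = False
                elif ch == esc:          # switch to mode 2 for the next character
                    escaped = True
                elif ch == sep:          # end of field
                    fields.append("".join(current))
                    current = []
                else:                    # mode 1: ordinary character
                    current.append(ch)
            fields.append("".join(current))
            return fields

        print(parse_record(r"hello;wor\;ld;42"))  # ['hello', 'wor;ld', '42']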

        1. Loyal Commenter Silver badge

          Re: It's actually not that hard

          I'm not arguing that delimited files aren't the right solution for certain jobs. The point I was trying to make is that simple solutions to quick-and-dirty problems aren't always the most robust. The reason things like XML and JSON are more "wordy" than a delimited file (or even a fixed width SDF) is that they often have considerations built in to deal with the real-life problems that come from users being able to mash their keyboards and come up with all sorts of weird and wonderful input.

          If you're talking about files that come from one automated process and get pushed into another, then firstly, are you making the right choice by putting the data into a "file" in the first place? If you're talking about some sort of IPC, or data structures used internally to a process, why are you even thinking about what the data would look like in a file? Do you seriously need to write it to a file system (and deal with all the exceptions that can raise)?

          Context is the key here; if you aren't writing stuff into a file at all, and are, for instance, pushing it to an API somewhere, I'm going to assume you are using modern methodologies to do so, and that API is RESTful. Why the hell would you not want to use JSON for that? There might be some niche considerations where you want to minimise the payload for performance optimisation, in which case, is a delimited format appropriate either? And just because you don't think you're going to need to escape things doesn't mean that the requirements won't change in future to mean that you will.

          In my professional experience, there are very few situations where something needs to be output to a CSV (not least because some idiot is going to go and open them with Excel and then save them again all reformatted). If it needs to be machine-readable, use something machine-readable, even if it's a chunk of binary in a proprietary format. If it needs to be human and machine readable, use JSON. If it needs to follow a complex schema, maybe use XML.

          Of course, it all depends on your use case, but use the right tool for the job. There are situations where you need to handle data that is supplied in a certain format, that is out of your control. I'm not sure I've come across any problem in the last 15 years or so where the solution was to output something to a CSV and then read it back in again.

      2. bombastic bob Silver badge
        Unhappy

        Re: It's actually not that hard

        I wouldn't call JSON "lighter weight" ...

        how about just using a format that YOU like? I like tab-delimited files for simple columnar data. Arbitrary whitespace-delimited data works for utilities like 'awk', and so a tab-delimited file can often be passed through awk. It also works when pasting into a spreadsheet, and (mostly) deals with text data that has embedded spaces in it.

        Or if you don't like that, use something else. The only real point here is "a file format" that is standard enough for data interchange, so think about how it's gonna be used, and pick a format.
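
        For what it's worth, tab-delimited doesn't even need a hand-rolled reader; here's a sketch using Python's standard csv module and its built-in excel-tab dialect (the file name is hypothetical):

        # Read tab-delimited columnar data with the standard library.
        # The 'excel-tab' dialect matches what spreadsheets produce on copy/paste.
        import csv

        with open("servers.tsv", newline="", encoding="utf-8") as f:
            for row in csv.reader(f, dialect="excel-tab"):
                print(row)  # each row is a list of column strings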

        1. Loyal Commenter Silver badge

          Re: It's actually not that hard

          I wouldn't call JSON "lighter weight" ...

          Maybe I was being a little ambiguous; I meant lighter-weight compared to XML. I don't think anyone could argue the case that

          "name":"value"
          is wordier than
          <name>value</name>
          and that's without getting involved with schemas and DTDs.

      3. Nafesy
        Thumb Down

        Re: It's actually not that hard

        Um, that's awful advice! (and yes, it's actually very hard!)

        How likely is it that Joe Bloggs will successfully mitigate formatting/injection/overflow attacks by rolling their own parser? (The answer is ZERO!) And yes, CSV is vulnerable to macro injection attacks (a partial mitigation is sketched at the end of this comment).

        I would never attempt this - and I've been coding professionally for twenty years, and over 30 years in total.

        Just don't!!
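
        On the macro-injection point above: escaping alone doesn't help once a spreadsheet decides a field is a formula. A minimal, partial mitigation sketched in Python (not a complete defence, and purely illustrative):

        # One common CSV-injection mitigation: prefix any field a spreadsheet would
        # treat as a formula (=, +, -, @) with a single quote before writing it out.
        import csv, io

        FORMULA_PREFIXES = ("=", "+", "-", "@")

        def neutralise(field):
            return "'" + field if field.startswith(FORMULA_PREFIXES) else field

        buf = io.StringIO()
        csv.writer(buf).writerow([neutralise(f) for f in ["alice", '=HYPERLINK("http://evil")']])
        print(buf.getvalue())  # alice,"'=HYPERLINK(""http://evil"")"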

    2. Claptrap314 Silver badge

      Re: It's actually not that hard

      I think your instincts are good, but perhaps your experience is narrow.

      Yes, there has been a colorful history of complex formats leading to security issues. However, at least one defense is to use a well-vetted library to do the serializing & deserializing. In that regard, JSON is the standard choice precisely because such libraries are available, well, everywhere.

      Parsing fails inevitably occur around corner cases. These cases typically occur either where the original design failed to account for something, or where the original design was found to be lacking, and the modification to accommodate the new requirement violated an assumption in the original design that was either forgotten or never communicated.

      Homespun solutions, including scanf, start out looking great until it becomes necessary to accommodate some new class of data.

      After the Rails YAML fiasco (https://www.theregister.co.uk/2013/01/10/ruby_on_rails_security_vuln/), I've put a LOT of thought into a general, efficient, language-independent, safe object serializer. I can assure you, it is hard.
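
      To illustrate what a well-vetted library buys you here, a minimal Python sketch: the standard json module's loads only ever produces plain data, unlike the eval-style deserialization that made the Rails YAML hole so nasty.

      # Round-trip through the standard library's json module. Deserialization
      # yields only dicts, lists, strings, numbers, booleans and None -- it cannot
      # instantiate arbitrary objects or execute code.
      import json

      record = {"user": "alice", "roles": ["admin"], "active": True}
      wire = json.dumps(record)    # serialize for transport or storage
      restored = json.loads(wire)  # parse back into plain data structures
      assert restored == record
      print(wire)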

      1. bombastic bob Silver badge
        Facepalm

        Re: It's actually not that hard

        I really don't like JSON for a number of reasons. Mostly it's just CUMBERSOME. What people have done with it in Python (as one example) and Java Script is (in many cases) facepalm-worthy. XML isn't much better, and can be JUST as hideous. When I ask what I'd RATHER WRITE A PARSER for [_NOT_ using someone's canned inefficient license-encumbered crap-library I could re-write myself in a few hours] I prefer XML, but only just barely.

        And if MY code relies on something that can be updated by someone OUTSIDE OF MY CONTROL, then it NEEDS A RE-WRITE to AVOID that. _I_ do not want the "midnight panic phone call" from a customer that was caused by SOMEONE ELSE "being an idiot" [or installing some game or crappy utility that overwrote a shared lib, etc.].

        But this whole thing about shared libs and BLINDLY RELYING ON OTHERS (particularly on teh intarwebs) has gotten a bit out of control...

        Back in the 90's, Microsoft *ENCOURAGED* us all to use shared libs for C runtime and MFC and whatnot. These libs were even included with the OS. I fell for it. THEN, an application I was working on for a customer would not run in Win '95 OSR2. Turns out *THEY* *POOCH* *SCREWED* the MFC libs' ODBC implementation by CHANGING THE ABI [and didn't bother to tell anyone]. So Win '95 MFC libs and Win32s MFC libs worked fine. Win '95 OSR2, followed by Win 98 [and others] did _NOT_. [and 'just upgrading the libs' with 'newer versions' wasn't an option - remember DLL HELL ??? Win32s was involved, here. You couldn't "just do that", end-users were running Win 3.11 or '95 and the application needed to support that]

        My solution: re-compile MFC from source as "differently named" shared libs and dynamically link to those instead [since the solution I was going with *REQUIRED* the shared MFC]. For every (Windows) project SINCE then, I statically link *EVERYTHING*. *PERIOD*. (For open source, I just ship source and use the OS default when it's built)

        YMMV of course, if you have cloud-based code and just want to use someone's "simple thing" implementation, but you should at LEAST consider HOSTING IT YOURSELF to avoid "THIRD PARTY UPDATE HELL". Or do you *LIKE* getting "the midnight panic call" from customers?

        1. Nafesy
          Holmes

          Re: It's actually not that hard

          "And if MY code relies on something that can be updated by someone OUTSIDE OF MY CONTROL, then it NEEDS A RE-WRITE to AVOID that."

          Wow, you maintain your own OS?

          Seriously though, the only calls that you'll be experiencing are from new devs tearing their hair out and wondering why they're having to deal with your home brew nonsense when it could have been done better, more reliably and in less time with a well researched library choice.

        2. Loyal Commenter Silver badge

          Re: It's actually not that hard

          I really don't like JSON for a number of reasons. Mostly it's just CUMBERSOME. What people have done with it in Python (as one example) and Java Script is (in many cases) facepalm-worthy. XML isn't much better, and can be JUST as hideous. When I ask what I'd RATHER WRITE A PARSER for [_NOT_ using someone's canned inefficient license-encumbered crap-library I could re-write myself in a few hours] I prefer XML, but only just barely.

          It sounds to me that your problem isn't JSON, but the proliferation of third-party libraries built to handle it. There are plenty of programming languages with either built-in support for JSON, or with "official" libraries, or reference implementations of serialisers / deserialisers. JSON is, at the end of the day, just a way of formatting data so that it can be easily passed around as a string in a platform-independent way.

          As for blaming JSON for the state JavaScript is in: I don't think the problem there is JSON, it's the origins and history of JavaScript itself that is the problem. Which is, for example, why TypeScript exists. I'm not going to argue for a moment that relying on a bunch of third-party JS libraries to do your work for you is ever going to be a good idea. Sadly, it happens that a lot of programmers think they can get away with it, which is why half of the internet is held together with sticky tape and string.

          1. Charles 9

            Re: It's actually not that hard

            "It sounds to me that your problem isn't JSON, but the proliferation of third-party libraries built to handle it. There are plenty of programming languages with either built-in support for JSON, or with "official" libraries, or reference implementations of serialisers / deserialisers. JSON is, at the end of the day, just a way of formatting data so that it can be easily passed around as a string in a platform-independent way."

            It's that last part that gets you. No one can ever agree on most things: not even what counts as a "reference" implementation.

            Frankly, the only way to be sure your code works consistently is to not rely on outside control. Or rather, if you must, lock it in place such as with an internal copy or static linking (thus the need for flatpacks and other implementations of multiple conflicting versions of the same library). IOW, move away from global dependencies and keep them local in scope. Close to the vest is the safest way to avoid a switcheroo.

        3. RyokuMas
          FAIL

          Re: It's actually not that hard

          So let me get this straight...

          You would rather write your own code for a purpose than use something that already exists.

          You reckon your code is bulletproof and that unit tests aren't necessary.

          You blindly hate change (insert usual slew of references to ".Not", "2D interfaces" etc. here).

          You think how people feel is irrelevant, despite repeatedly expressing strong feelings yourself.

          You believe that anyone who disagrees with you is a "howler monkey", and present your opinions using a range of emphases that attempt to ram them down the throats of others (again, despite stating your own dislike for those that do exactly this) instead of engaging in calm, rational and meaningful debate.

          I sincerely hope for your own sake that this is all an "internet persona". If not, I would strongly suggest that you read this, and give it some serious consideration before you piss off someone who can sack you.

          1. Charles 9

            Re: It's actually not that hard

            "You believe that anyone who disagrees with you is a "howler monkey", and present your opinions using a range of emphases that attempt to ram them down the throats of others (again, despite stating your own dislike for those that do exactly this) instead of engaging in calm, rational and meaningful debate."

            Or perhaps one's speaking from the trenches and from firsthand experience. If someone's personally been to No Man's Land, they're going to have a different perspective, especially if one's been personally betrayed in the past. It's all a matter of trust. The article is saying we have to have SOME level of trust to get things done, but he disagrees on the exact level. Me? I'm reminded of the phrase, "If you want something done right..."

  4. karlkarl Silver badge

    "We all use other people's code"

    ... yeah but some developers use more than others XD

    Get rid of them and the digital world will instantly be safer harhar

  5. DWRandolph

    A good fire starter, loosely related to the main topic:

    CI/CD = Perpetual Beta

    Not all, or even most, CD'ish is bad. But enough has been broken by someone else updating in the chain to make me rather wary!

    1. Claptrap314 Silver badge

      Perpetual Beta is the reality of the vast majority of code. If CI/CD makes that more obvious to you, then good.

      PB is why maintainability is THE most important measure of code quality, by the way.

  6. TheSkunkyMonk

    I'm still upset they reinstated the guy's code on NPM without his permission. So what if it broke a lot of things; he made a choice and didn't want to use their platform any more, for good reasons as well. Guess that's the joys of licensing.

  7. Version 1.0 Silver badge
    Happy

    You buy the source code.

    A company I worked for in the early 80's got a nice order from the US Government, they ordered the system we were selling and required (and paid for) the source code for the operating system, our system, and a FORTRAN compiler too - the compiler source code alone added about $60k to the order.

    1. Loyal Commenter Silver badge

      Re: You buy the source code.

      I hope they trusted the compiler they compiled the FORTRAN compiler with, and it wasn't provided by Ken Thompson...

      1. Anonymous Coward
        Anonymous Coward

        Re: You buy the source code.

        Given this was the early 1980's, I think they would've had a better chance of having things checked down to machine code, at least.
