Linus Torvalds pines for header file fix but releases Linux 5.8 anyway

Linus Torvalds has released a new version of the Linux kernel. The meta-maintainer last week pondered an eighth release candidate for Linux 5.8, but on Sunday decided “it's not just worth waiting another week when there aren't any big looming worries around.” Torvalds did encounter “annoying noise with header file …

  1. Lee D Silver badge

    I love programming in C (99) but I have to say that one of the things I'd throw out is everything that leads to header file dependency problems.

    If I #include a file... I've included it. If it was already included, then I don't need to include it again. But, no, you have to put magic #ifdef's through 50 files to stop that happening.

    If I #include a file, and it has a structure or other definition in it... then that structure or definition is defined. I shouldn't have to play games so that it is prototyped in files to let them know it's defined (even though they think it isn't yet) so that when it is actually defined, they know where it is (and god-forbid you define it twice).

    Then the real magic comes when you want to include structures inside structures, when you want nothing more than a pointer to a structure inside a structure that you haven't defined yet (you don't need to know what type it is yet... it's just a pointer!), so you end up playing games with void pointers that then get cast to the proper type.
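
    The sort of game that ends up being played looks roughly like this (a sketch only, all names made up):

    /* world.h (illustrative): doesn't want to drag player.h in, so... */
    struct World {
        void *owner;   /* "really" a struct Player *, cast back at every use site */
    };

    /* elsewhere, each use has to cast it back:
           struct Player *p = (struct Player *)w->owner;            */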

    Then you have things like missing off the include in one file - while processing 50 files simultaneously. There's no "I think you meant this, and I can't see any conflicting definitions" logic. Then you have "Do I define this structure before I include that other file, because it needs it, or do I do it after? Do I mess up my nice list of top-of-file includes just to resolve a header dependency issue?"

    I actually kept a screenshot I got while coding a game with a copy of GCC - the whole code compiled except for ONE error. It said precisely this:

    expected 'struct Player *' but argument is of type 'struct Player *'

    Now, it doesn't matter how many times you read that, it's nonsense. It wasn't a typo, or similarly named but different structures, it wasn't a formatting error - the "struct Player *" in that error line is 100%, absolutely, totally identical (I know, I compared them in a text editor!) and there's only a single definition of that structure in all the code. It came about from some weird header dependency where it thought it had a perfectly defined struct Player, but had somehow managed to include the file twice, without it realising that it was the same file/structure inside. So instead of warning about duplicate definitions, the original "player.c/h" files believed they had the definition, but the function they were used in somehow included them via a roundabout route that thought that *its* inclusion was the correct definition. And the compiler had obviously given the same structure, only ever defined in only one file, two different "internal" names that it thought were different.

    It's 2020. I understand the *legacy* of things like C header file inclusion, the preprocessor, etc. but I do still think it's about time that we got a compiler switch which basically means "just sort it out". If you don't have a definition, hold onto that thought and come back to it later in the compile. If you have double definitions but they're from the same file or the structure is identical, maybe warn but carry on regardless. If you need the size of a structure pointer - it's a pointer. Put it in as a pointer. Worry about the structure inside it later. If I #include but the file is in a slightly different sub folder (e.g. #include "SDL/SDL.h"), work it out... warn me or let me choose as necessary. "Did you mean SDL2/SDL.h? Would you like me to modify the include line to reflect this in the future?".

    There's no reason that a C compiler, even in "C99 mode", can't take the whole compile as a process, and colourise parts as it goes, leaving unknowns in a grey area until it can determine what was actually meant. Then filling in the gaps later piecemeal (not on a file-by-file basis as that would just leave you in the same position because it can't resolve the entire file, so nothing else is "ready" yet either). Then warning/asking the user exactly what they want to do.

    Especially with things like preprocessor file paths (is it SDL.h or SDL/SDL.h?), it could just search, and tell you, because those things might well change from every different computer that code is compiled on. Work it out, present options, because those options are only ever going to affect me anyway, if I'm the one with the weird-arse include layout.

    The problem with C is that it's a fabulous language to program in, but far too much time is spent faffing about getting compilers, linkers, Makefiles (YURK! Why are we still using that junk?), etc. to work. I use it on several platforms, with several different architecture targets on each, with all kinds of libraries, and in several ways (command line, Makefile, CMake, Eclipse IDE, etc.) and it's the damn setup that takes most of the time to make sure that the libraries are linked when you use their headers, that they link in the right order to fulfill the stupid compiler's demands, that header inclusion takes place in the right way, etc.

    It should just be... compile main.c - it includes a bunch of files, we know they are there in the same folder, pick them up and use them if we've obviously intended to, inform the user of what you did, link it with the libraries that included those headers (it's really not hard to have a #pragma or similar for this... if you include SDL.h, link against libSDL, if you include SDL_Mixer.h, link against libSDLmixer... it would take one line in the header!).
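
    Something close to that one-liner does already exist in one toolchain, for what it's worth: MSVC will act on a linker pragma placed in a header, while other compilers typically just ignore it (usually with a warning). A rough sketch, library name purely illustrative:

    /* sdl_autolink.h - illustrative sketch, not a real SDL header */
    #if defined(_MSC_VER)
    #pragma comment(lib, "SDL2.lib")   /* MSVC hands this to the linker automatically */
    #endif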

    I get the reasoning, I know there's the chance of including the wrong version header, library or whatever, but that's not unresolvable (especially if the compiler/linker tell you what they did!), and it's very easily avoided.

    But recursive-header-includes are a pain in the butt. In some projects where I know the code isn't for public consumption, I just have a master header file and include it on every C file, and that master file includes everything else "in the right order". Why not? Literally a hundred files with just "include "main.h" " and then let the compiler sort it out. But that's not what the header files are designed to do... I should only be including what I need, I know. And the compiler shouldn't be as dumb as a bag of rocks when I miss something. But if it's going to be, then I'm going to make IT do all the hard, unnecessary work, because I'll be damned if the tool I'm using to make it easy to write tools I'm going to use is just going to spew errors at me and expect me to do the manual legwork. It's a machine... it should be doing that for me.

    1. Dan 55 Silver badge

      Your Makefile could (should?) set up INCLUDE_PATH to sort that out. Whether or not you want to include all subdirectories under /usr/include is your choice. But at least you've got it.

      If you don't like Makefiles, what about CMakefiles?

    2. Anonymous Coward
      Anonymous Coward

      There's a reason most of us wrap header files in multiple inclusion protection #ifdef/#define wrappers. Current compilers are also smart enough to automatically avoid multiple inclusion given the right flag.
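
      For anyone following along at home, the wrapper in question is just this (guard name illustrative):

      /* widget.h */
      #ifndef WIDGET_H
      #define WIDGET_H

      struct widget {
          int id;
      };

      #endif /* WIDGET_H - the second and later inclusions skip the body entirely */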

      Doesn't fix circular dependency loops though.

      You've not experienced the true horror of dependency hell till you've used C++ templates ;)

      1. Anonymous Coward
        Anonymous Coward

        C++ Templates.

        What do you mean? C++ templates are delicious friendly things. I thoroughly enjoy working with them, they're great!

        <anon so that the men in white coats can't find me and take me off to a care home for deranged programmers>

      2. guyr

        #ifdef/#define wrappers

        Came to say exactly this. The problem of including a header more than once was solved at least 30 years ago, when I was doing C/C++ development consistently, using this #ifdef/#define pattern. I can't imagine any code passing a code review today without it.

    3. FIA Silver badge

      I always find it better to think of C as one giant file, as that's essentially what it is after preprocessing; it can sometimes help you visualise how it all goes together.

      Also, IIRC in C you can declare an empty struct, use it as a pointer member, then flesh it out later, something like:

      struct foo;

      struct bar { struct foo *fooptr; };

      struct foo { int someval; };

      However, when I just tried this clang was actually clever enough not to choke on the forward ref, so it may just be that being 'clever' and YMMV. (Can't be arsed to fish out a GCC to investigate further).

      FWIW, while C can be funny like this (and to be fair it is getting on a bit as a language; I mean an E type is lovely, but you have to accept the wipers don't work well, and you'll be cold in winter), it's not just a C problem, dependency resolution is a right pain. (increasingly so in today's 'Don't write it, assemble it.' programming culture).

      I was a Java dev for a few years, it has some lovely build tools, but I've still had the inevitable A includes B, C includes a different version of B and I really need A and C, but the Bs don't like each other.

      Also, the 'Why can't the compiler just do what I intend' wish is something we've all had, but as you point out, it's non-deterministic, and that's something you really don't want, as it makes things a right pain to debug. (Trust me, been bitten by 'clever' tooling more than once over my career).

      You're spot on with Makefiles too, they're just obtuse... however, they are just one of the tools available. It'd be quite odd, but you probably could use something more modern if you wanted. (Personally I've just tried CMake for my latest, that's quite cool). Also used Qt's qmake in the past but I think that's possibly a little crufty for non Qt or bigger projects. There's also things like Ninja. (Which can be targeted by CMake).

      ... In some projects where I know the code isn't for public consumption, I just have a master header file and include it on every C file, and that master file includes everything else "in the right order". Why not?

      Why not indeed. Sounds a bit like 'Windows.h' to me. Perfectly valid technique.

      1. Tom 7

        Gradle always amuses me. Change 5 lines of Java code, download massive amounts of Gradle updates to build the 2Mb app!

        I have wondered over the years whether the build systems people seem to come up with these days should really be several times larger than the OS they run on. They also seem too separate from the compilers, and seem to try and reinvent things where perhaps asking the compiler what it thinks would be a better move.

        As for headers being re-entrant, it's normally a sign you need to re-engineer some earlier stuff that you didn't realise would be a problem later, and all it takes is an army of Tardises to sort out.

    4. Anonymous Coward
      Anonymous Coward

      Surely all solvable by the compiler being smart enough to do these things for you.

      1. Yet Another Anonymous coward Silver badge

        IBM did, back in the 90s. The code was all in a database accessible through their Rational Rose tool. Then users realized that all their source was locked up inside an IBM proprietary system and they would have to pay $$$ for ever

        1. Someone Else Silver badge

          IBM also did that back in the '60s. It was called PL/I (or PL/1), and whenever you screwed something up, it would oh-so-helpfully "fix" it for you, generally by making things worse than the original error, and often-as-not hiding the "fix" it made.

          Notice that no one is programming in PL/1 anymore?

          1. Brian Scott

            I quite enjoyed programming in PL/I.

            Can't seem to find a decent compiler anywhere for it though these days.

            1. Someone Else Silver badge

              There's a reason for that, you know...

      2. Lee D Silver badge

        I buy a computer to do the hard, repetitive, boring work for me.

        If it can't build a tree of the files, their dependencies, duplicates, and then work out which bits it needs from where - even via brute-force, then what's the point of the compiler/linker, really? And most of these problems are not circular dependencies - even when they are, they are solved by taking files partially rather than in a chunk. If B has a dependency on something that is in file A, which includes B at some point - that's easily resolved. You have everything you need.

        A proper circular dependency is literally un-compilable by anyone, ever. These file-level dependencies aren't - proven by the fact that you can eliminate them by excluding one of the files which isn't strictly necessary, or juggling the include orders. Parse file B. That needs parts out of file A. Parse file A. When you get to the bit that needs a definition from file B - oh look... it's just below the bit we're working on... if we hang on a moment, or look ahead, we'll get the answer. Or even just mark that as "grey", carry on parsing the rest of the file, then parse file A in its completeness and when it comes back... oh, look, we only needed this part out of B anyway and everything else is "not grey" now. Now resolve.

        Some compilers can do it. Some systems have had this for decades. Most software still gets written in C. GCC and associated tools are the most-popular/complete C compiler / linker / etc. Why is this not a solved problem?

        Is it really that difficult? Scan file A top to bottom, pull out all the named symbols and dependencies they have. Do the same for B, C, D, E. Now that you just have names and dependencies, resolve all the ones you can, whittle out the rest like you would a logic puzzle in a magazine. 1st stage you'll eliminate from consideration 99% of the entire codebase. Then it's just a handful of dependencies, and they'll mostly work themselves out via iteration. Anything left - ASK THE USER. It'll be a duplicate definition (i.e. same thing specified in two different files), a true circular dependency (struct A includes struct B which includes struct A, which can never, ever compile anyway), or a false circular dependency (struct B includes a pointer to struct A, which contains a struct B, but it's just a pointer so who cares?).

        Literally, you could do this on perfectly compliant C code just by building a list of all the includes in all files, ignoring where they were *included from*, finding them yourself and parsing them just like files A, B and C. And then if you still spot an undefined symbol... have a quick shufty through a centralised database of all the include files in the standard include folders and see if you can make a suggestion. The database for a whole system with SDKs etc. and thousands of include files would be, what, a couple of meg or so (just the file name and the symbols it has defined in it).
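
        A toy sketch of that "logic puzzle" pass, for the sake of argument - a table of symbols plus the names they depend on, swept repeatedly until nothing more resolves, with anything left over flagged as missing or genuinely circular (all names made up):

        #include <stdio.h>
        #include <string.h>

        struct sym {
            const char *name;
            const char *deps[4];   /* NULL-terminated list of dependency names */
            int resolved;
        };

        static struct sym table[] = {
            { "struct Player", { "struct World", NULL }, 0 },
            { "struct World",  { NULL }, 0 },
            { "render()",      { "struct Player", "struct World", NULL }, 0 },
            { "draw_hud()",    { "struct HUD", NULL }, 0 },   /* HUD is never defined */
        };
        static const size_t N = sizeof table / sizeof table[0];

        static int is_resolved(const char *name)
        {
            for (size_t i = 0; i < N; i++)
                if (strcmp(table[i].name, name) == 0)
                    return table[i].resolved;
            return 0;   /* unknown name: treat as unresolved */
        }

        int main(void)
        {
            for (int progress = 1; progress; ) {   /* keep sweeping until nothing changes */
                progress = 0;
                for (size_t i = 0; i < N; i++) {
                    if (table[i].resolved)
                        continue;
                    int ready = 1;
                    for (int d = 0; table[i].deps[d]; d++)
                        if (!is_resolved(table[i].deps[d]))
                            ready = 0;
                    if (ready) {
                        table[i].resolved = 1;
                        progress = 1;
                        printf("resolved: %s\n", table[i].name);
                    }
                }
            }
            for (size_t i = 0; i < N; i++)   /* anything left over: ask the user */
                if (!table[i].resolved)
                    printf("unresolved (missing or circular): %s\n", table[i].name);
            return 0;
        }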

        1. Richard 12 Silver badge
          Unhappy

          Macros are the problem

          Well, abuse of macros.

          There is a "fake header-only" programming style where you have to include the header (at least) twice, one of which has to be with a special macro defined, and the rest must not. Popularised by the STB libraries I think.

          It's now mostly used by people who think they're very smart but are actually very foolish, as it's basically impossible for IDEs to highlight and the worst possible case for the toolchain.

          A compiler that tried to work out this ridiculous mess would never be able to compile it at all.

    5. Gerhard Mack

      What is wrong with Makefiles?

      I don't understand what is so wrong with Makefiles. They are great when you want to build on a machine that isn't yours. For example, I code on my PC, but test on a server (or other people compile on-site). Makefiles are great for dependency checking.

      For example, my Makefiles let the user know each step of the process with short descriptive messages, and combine the source from the src folder and the includes from that folder (and sub-folders); they also recompile only the needed source files if I make a header change.

      Admittedly there is some dark magic in the regex to make some of that happen but once written, it's done. I generally just reuse the Makefile with minor modifications on new projects.

    6. Anonymous Coward
      Anonymous Coward

      NeXT's implementation of Objective-C had an #import that did away with the need for preprocessor guards when using #include. Stallman didn't like it for no sane reason I can find, so it got deprecated when GCC reimplemented its Obj-C support.

      1. Dan 55 Silver badge

        The reason is this.

        1. Anonymous Coward
          Anonymous Coward

          That's just a poorly written system header file.

          1. Dan 55 Silver badge

            Not really. There may be many different files in a project that may need just one part of a single .h file, and it doesn't always have to be the same part, but if the compiler says "I'm not going to look at that again, I've already done it" then something is missed.
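
            The classic case of a header that's meant to be read more than once is the X-macro idiom - the same list file gets included with a different macro in force each time (all names illustrative):

            /* colours.def - deliberately has no include guard */
            COLOUR(Red)
            COLOUR(Green)
            COLOUR(Blue)

            /* one includer turns the list into an enum... */
            #define COLOUR(name) COLOUR_##name,
            enum colour {
            #include "colours.def"
                COLOUR_COUNT
            };
            #undef COLOUR

            /* ...another turns the same list into a matching name table */
            #define COLOUR(name) #name,
            static const char *colour_names[] = {
            #include "colours.def"
            };
            #undef COLOUR

            An #import-style "only ever once" rule would silently skip the second expansion.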

    7. Anonymous Coward
      Anonymous Coward

      Learn Python.

  2. Falmor

    Google Sharing

    Wouldn't a more pertinent question about Google sharing be "Does the Google code share any of the user's activity with Google or any affiliates?"

    1. FIA Silver badge

      Re: Google Sharing

      ...surely it's "So... you've been playing with our toys for years, why are you only sharing yours now?" ;)

    2. Paul Shirley
      Joke

      Re: Google Sharing

      I'd be impressed if Google found a way for the source to share your stuff with them... Showing epic metaprogramming skills

      1. Dan 55 Silver badge
        Black Helicopters

        Re: Google Sharing

        Have they committed anything to gcc or llvm recently?

        Reflections on Trusting Trust

  3. Anonymous Coward
    Anonymous Coward

    One of the last releases?

    With Windows increasing direct support for Containers and Google abandoning it for Fuchsia, won't Linux become an irrelevance?

    1. Anonymous Coward
      Anonymous Coward

      Re: One of the last releases?

      Sure... sure...

      I understand you're trolling, but I see it as if Windows is becoming a graphical "Desktop" toolkit. KDE, Gnome... Windows?

    2. bazza Silver badge

      Re: One of the last releases?

      If Google do drop Linux for Fuchsia, well that'd be billions of mobile devices moving away from using Linux.

      I can't see Linux falling out of favour in server applications though. Whatever issues one might have with Linux's license and / or design and how it pans out on mobile devices, those issues don't impact on servers one little bit.

      And with Linux seemingly being open to using Rust - presumably possibly encompassing a wholesale move over to it, eventually - there's a chance that the Linux world will start to incorporate Rust's benefits in their security strategy. It remains to be seen whether or not that happens of course. But it seems that multiple OS teams are beginning to think about Rust at about the same time, Linux included, so I think there's a good chance that Linux won't get left behind as a pure-C dinosaur if Rust proves to be the way to go. So it will probably remain relevant for a good while yet.

    3. Anonymous Coward
      Anonymous Coward

      Re: One of the last releases?

      "won't Linux become an irrelevance?"

      Just because Windows first adds a Linux subsystem and then becomes Linux does not mean that Linux ceases to exist.

      But it's nice to see the forward thinking marketers are preparing for the inevitable Microsoft Linux desktop revolution.

  4. Binraider Silver badge

    There's a reason even the Linux dictator-for-life has been mulling over Rust. David Braben's criticism of systems-level programming languages is a fascinating read. C++ versus C is a never-ending argument; though I would tend to suggest that the 'unpredictability' of C++ when mis-used is an issue for OS programming in said language. C clearly has no shortage of bad code submissions too; and the advantage of having the dictator-for-life in control of what does or does not make the kernel is perhaps symptomatic of the language and compiler's ability to deal with crud.

    One does somewhat feel that despite Linux' established base, there is probably space for a research / hobbyist OS to be built from the ground up using modern paradigms. That is, after all, precisely how Linus got started with Linux; out of a desire to use the x86 chipset in full and expand a bit beyond the capabilities of other educational OSes of the time (Minix being the prime example). At the time, C happened to be the best compromise available between programmability and performance.

    1. bazza Silver badge

      REDOX OS

      You might like to take a look at Redox OS - written in Rust.

      What's impressive about that project is how little time there was between inception and having something semi-decent running; really, not long at all.

      Then there's the option of re-writing Linux. Or Windows. With C OSes I think you could have a rolling programme of conversion of the code base from C to Rust - AFAIK the two languages integrate just fine - and eventually end up with a predominantly Rust code base implementing essentially the same OS.

      None of that means that the Userland also has to be translated - sensible though that might be here and there - but with a Rust kernel one could be confident that the kernel's memory correctness was more assured.

      1. Inkey
        Pint

        Re: REDOX OS

        Hah bazza beat me to it

      2. dajames

        Re: REDOX OS

        You might like to take a look at Redox OS - written in Rust.

        What's impressive about that project is how little time there was between inception and having something semi-decent running; really, not long at all.

        Not really ... writing an OS isn't that hard in any language, and Rust is easily complete enough to crank out a basic OS. It's persuading people to write (or port) applications to run on a new OS that's really hard.

        Nobody wants to use an OS that has no applications, and nobody will write applications for an OS with no users (unless, maybe, it's the only OS that runs on some new hardware that is really shiny).

        Redox OS is an interesting system, but its claim to have a "full set of applications" really means no more than it has a basic set of OS tools -- we're a LONG way from seeing it compete with Windows, say. It's good to see that it's been done, and a good demonstration of the capability of Rust, but really not that remarkable.

    2. Inkey
      Coffee/keyboard

      Redox OS

      "that there is probably space for a research / hobbyist OS to be built from ground up using modern paradigms"

      I see there's a microkernel called Redox OS written in Rust... it's early days... there's no USB drivers yet and from what limited experience I have I could only emulate it...

      I also read that Linus Torvalds said microkernels were a bit pointless... and that he could see kernel drivers being written in Rust

  5. Anonymous Coward
    Anonymous Coward

    "An energy driver for recent AMD CPUs"

    ... doesn't half seem to work. I've been running the rc's on my opensuse Ryzen5 laptop, and it's transformed the power consumption. The occasional mp4 transcode seems to have found more horsepower too.

  6. PeterO
    Facepalm

    Let the compiler help

    https://en.wikipedia.org/wiki/Pragma_once solves it for me :-)
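
    For comparison with the #ifndef wrapper, the whole guard collapses to one line (header name illustrative), though it's a de facto convention rather than part of the C standard:

    /* gadget.h */
    #pragma once

    struct gadget {
        int id;
    };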

    1. dajames

      Re: Let the compiler help

      https://en.wikipedia.org/wiki/Pragma_once solves it for me :-)

      Yes, but only on compilers that support it.

      Modern compilers that do support it will generally handle conventional include guards just as efficiently, and on old compilers that don't support it -- there still are a few -- you can't use it anyway.

      As that Wikipedia link points out, there are some caveats concerning #pragma once .

  7. Anonymous Coward
    Anonymous Coward

    "Clever compilers"

    I can't think of anything more guaranteed to fill me with dread.

    Imagine all the little "extras" a clever compiler would put in for you?

    Much better to write decent code to start with.
