Compromise reached as Linux kernel community protests about treating compiler warnings as errors

Pushback from the Linux kernel community over defaulting to -Werror (make all warnings into errors) for compiling has resulted in a compromise where this default only applies to test builds. Linux creator and maintainer Linus Torvalds amended the Makefile used to compile the kernel so that -Werror was the default, saying: "We …

  1. heyrick Silver badge

    Seems like a good idea

    Given that my compiler warns me when I screw up certain things like "use of = in condition context" and guff about assigning between non-matching pointer types ... things that technically result in a valid executable, but not one that does what it is supposed to.

    Warnings are there for a reason - to warn you that it thinks something is wrong. If you're getting spurious unnecessary warnings (function blah not defined in header (because you forgot static)), there's usually some sort of option or pragma to turn that off. Or, maybe, fix the problem?

    It freaks me out that some people seem to think that thousands of warnings is not a big deal.
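For anyone who hasn't been bitten: a minimal sketch of the "= in condition context" trap mentioned above (function names are made up). gcc -Wall flags the first version with -Wparentheses; -Werror turns that warning into a failed build:

```c
#include <assert.h>

/* Intended to test for root; '=' assigns instead of comparing, so
 * the condition is always false and uid is clobbered. gcc -Wall
 * warns ("suggest parentheses around assignment used as truth
 * value"); with -Werror the build stops here. */
static int is_root_buggy(int uid)
{
    if (uid = 0)        /* bug: should be uid == 0 */
        return 1;
    return 0;
}

static int is_root_fixed(int uid)
{
    if (uid == 0)
        return 1;
    return 0;
}
```

Both versions compile to a valid executable, which is exactly the point: only the warning tells you the first one is wrong.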

    1. Anonymous Coward
      Anonymous Coward

      Re: Seems like a good idea

      You should try being part of the support team for a product whose administrators believe that until a server crashes there is not a problem. All those messages in the log saying that things are not well can be ignored, as clearly it is still working... until it isn't.

      Pro-active admin/support seems to be an antiquated philosophy, treated the same way as bean-counters deciding redundancy is too expensive if you already have resilience. If you cannot prove that the time you spend fixing the things causing warning messages will save money, then clearly you have too much time on your hands and maybe they can reduce the team size.

      Bitter? Moi? Heaven forfend

      1. Adelio

        Re: Seems like a good idea

        On a slightly different point: just go into the Windows event log a few days after installing it and see how many warnings and errors there are...

        1. Alan Brown Silver badge

          Re: Seems like a good idea

          Yep

          Leading to far too many instances of 'it works, until it doesn't, and then it really doesn't'.

          Let's not forget the same mentality saw shuttles continuing to launch with leaky O-rings and tiles being whacked by pieces of tank insulation.

    2. Mishak Silver badge

      Re: Seems like a good idea

      It is generally a good idea to run like that (I do), but there are some compilers out there that emit false-positives. Not easy to manage if there is no way to suppress the warning and there is no problem to fix.

      1. teknopaul

        Re: Seems like a good idea

        When your project is 5 million loc having useful warnings hidden amongst many that are to be ignored is a problem in its own right.

    3. nevets23

      Re: Seems like a good idea

      But sometimes the compiler gives you stupid warnings that should simply be ignored.

      https://bit.ly/3l81Ez8

      1. Gene Cash Silver badge

        Re: Seems like a good idea

        bit.ly? Seriously?

      2. heyrick Silver badge
        Happy

        Re: Seems like a good idea

        Not sure that linked thread means what you think it means.

        Linus isn't seeing the weird bogus warnings and it looks like some sort of mischief with the other person's compiler setup.

        And Linus, like a boss: It's a classic case of "Doctor, doctor, it hurts when I hit myself in the head with an ice pick".

      3. Anonymous Coward
        Anonymous Coward

        Re: Seems like a good idea

        osnoise isn’t a compiler though.

  2. Tromos

    Warnings are usually there for a reason.

    Something like an unused variable warning should not be trivialized. I have come across a case where a variable was declared for a specific purpose but the code accidentally used a similarly named variable that was being used for something else. Turns out that two different values don't fit into one storage location.

    The only reason this is such a pain now is that -Werror wasn't the default from the start and things have accumulated over the years. I hope that a slow cleanup can take place with a view to eventually allowing -Werror to be the default for ALL builds. If it turns out that there are some truly trivial warnings, this is a good opportunity to either remove them or downgrade them to observations.
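A minimal sketch of the kind of bug an unused-variable warning surfaces (all names here are hypothetical): the "unused" report on the variable you meant to use is the clue that a typo'd twin is doing its job instead.

```c
/* gcc -Wunused-variable flags 'total_count' as dead. That warning
 * is the only visible symptom: the code compiles and runs, it just
 * accumulates into the wrong, similarly named variable. */
static int count_positive(const int *items, int n)
{
    int total_count = 0;   /* intended accumulator -> "unused" warning */
    int total_cont  = 0;   /* typo'd twin the code actually uses */

    for (int i = 0; i < n; i++)
        if (items[i] > 0)
            total_cont++;

    return total_cont;     /* was meant to be total_count */
}
```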

    1. AndrueC Silver badge
      Thumb Up

      Re: Warnings are usually there for a reason.

      Yup, been there very recently. I was reworking some code and after initial testing was surprised to see R# underlining a field and offering to remove it. I must have looked at the declaration for ten minutes before spotting a typo. It turned out I'd declared two fields with subtly different names instead of just one.

      In this case I caught it during initial development but it proves your point.

      I've also had cases where R# tells me a condition is always false or true and upon investigation those have turned out to be bugs that needed fixing.

    2. Alan Brown Silver badge

      Re: Warnings are usually there for a reason.

      Yup

      I've run into far too many cases of memory leaks and corner case crashes which wouldn't have happened if errors and lint outputs had attention paid to them in the first place

    3. Lunatic Moonshiner

      Re: Warnings are usually there for a reason.

      Time To Switch To Python. ;-)

  3. A.P. Veening Silver badge

    I'd say there should be a way to determine at what level a warning should change into an error. An unused variable from a copy book is a lot less serious than opening a file for output/update without any writing or deleting in that file.

    1. Mishak Silver badge

      Libraries

      You also need some way to manage the use of libraries that are provided as source code; it is possible that a particular application will only use part of the API, rendering large numbers of objects and functions "unused".

      Not a problem if you only consider "use" within a single translation unit, but more complex if you have system wide checks in place (e.g., when working to something like MISRA).

    2. Pascal Monett Silver badge

      "An unused variable"

      Should not exist.

      If you write your code properly, you know what variables you use and why. This is not a crapshoot, developers do not generally declare variables without a reason.

      If you have an unused variable, you need to check why it is unused because there is a chance that your code might be using some other variable instead, and that's when mayhem happens.

      If your variable is truly unused for good reason, then remove the declaration and recompile.

      Good code is clean code.

      Back in the day when I was learning how to program with IBM's BASICA compiler, I learned that there is not a single compiler message that does not warrant attention. If you have more than one definition of a variable in the Common area, you're in trouble.

      These days, I code business applications. An error is when an indispensable resource is not available. A warning is when a document is missing a given parameter. If the code encounters an error, it logs the problem and bails out. If it encounters a warning, it logs the problem and soldiers on to the next item.

      But I code for high-level applications, i.e. not kernel-level code. I cannot imagine a kernel module that has a "warning" like "HDD not available" that should not be looked into.

      1. Bill Gray

        Re: "An unused variable"

        An example: I maintain PDCursesMod, based on a library whose specification goes back to the late 1980s. In a perfect world, if I had a function such as

        int foo( int variable);

        which no longer makes use of variable and doesn't return a value, I'd change that to

        void foo( void);

        There is, however, a specification of what parameters are passed to foo and what it returns, and 40+ years of code written to that spec. I can't just toss my toys out of my pram and say "not gonna do that anymore". (If 'foo' has an unused local variable, then you're right; that should be fixed. In fact, an unused local variable usually means I did something wrong; unused variables passed to a function are usually -- though not always -- less worrisome.)

        I only get warnings when a variable is unused, not that the return value is unused. (I could imagine a savvy compiler figuring out the latter, but haven't seen it done.) To suppress such warnings, I define this handy macro:

        #define INTENTIONALLY_UNUSED_PARAMETER( param) (void)(param)

        and then, within foo(),

        INTENTIONALLY_UNUSED_PARAMETER( variable);

        which both says to the compiler "don't bug me about this" and to the human reader "yeah, I know this isn't being used; I planned it that way". (I've become a big fan of -Werror.)

        1. sqlrob

          Re: "An unused variable"

          You can get unused return warnings by adding an annotation to the function.

      2. A.P. Veening Silver badge

        Re: "An unused variable"

        If you write your code properly, you know what variables you use and why. This is not a crapshoot, developers do not generally declare variables without a reason.

        Depends, on the AS/400 the compilers just pull in all fields from the used files, so an AS/400 program without unused variables is pretty rare.

      3. captain veg Silver badge

        Re: "An unused variable"

        > Should not exist.

        Damn right.

        However, in my experience it usually means just that someone copy/pasted a bunch of code that included a variable declaration not used by the rest of the bunch. Which is annoying, but mostly harmless.

        Me, having started with edlin, I'd just remove the bluddy clipboard altogether.

        -A.

    3. m4r35n357 Bronze badge

      (void)variable;

      Just do it! Although I think I read that a future GCC will not warn about that . . .

      1. Anonymous Coward
        Anonymous Coward

        I have an Unused macro that I use for this, just to be clear about what I'm doing.

        On code that will be shared across multiple platforms and/or applications (read: client/server) where sometimes a variable is used and sometimes it isn't, I might declare additional macros:

        Server( variable )

        Xbox( variable )

        Linux( variable )

        etc.

    4. DS999 Silver badge

      An unused variable

      Shouldn't be ignored. It indicates that at minimum there were code changes made that were incomplete. If they/you forgot to remove now superfluous variables during that re-factor, maybe something else was forgotten as well. Worth double checking by looking at past commits before simply removing the variable declaration to eliminate the warning/error.

      If a single-letter "junk" variable is used for loop counts or indexes, like i, j, x, or y, then you might want to make sure that, if the same variable was used for successive loops, there isn't some sort of dependency and that the unused variable wasn't intended for the second loop.

      I've run into bugs related to re-use of a loop index variable in code I've written so now if I'm going to use a loop variable outside of a loop I either don't use a single letter name (which to me indicates a junk/throwaway value) and give it a more meaningful name specific to its use, or I'm careful to copy its value to a proper place immediately following the loop and add a comment as to why.
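A minimal sketch of the loop-index reuse trap described above (names are illustrative): the second loop was meant to start from zero but silently continues from where the first loop left off.

```c
/* Buggy: 'i' carries over from the first loop, so the second loop
 * skips b[0..na-1] entirely (or runs zero times if na >= nb). */
static int sum_two_buggy(const int *a, int na, const int *b, int nb)
{
    int i, total = 0;
    for (i = 0; i < na; i++)
        total += a[i];
    for (; i < nb; i++)      /* bug: 'i' never reset to 0 */
        total += b[i];
    return total;
}

/* Fixed: each loop gets its own index, scoped to the loop. */
static int sum_two_fixed(const int *a, int na, const int *b, int nb)
{
    int total = 0;
    for (int i = 0; i < na; i++)
        total += a[i];
    for (int i = 0; i < nb; i++)
        total += b[i];
    return total;
}
```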

      1. martinusher Silver badge

        Re: An unused variable

        >If they/you forgot to remove now superfluous variables during that re-factor...

        You can get a cascade of issues if the function that used to use a parameter, but now doesn't, is called by a bunch of modules. In an ideal world you'd go through the entire code base refactoring everything to reflect the new definition, but in practice you just can't do that. Disturbing stable and documented code is not a good idea because it cascades work -- remember, any change to a component, even if it's just recompilation with a later version of the compiler, requires full testing of that component.

  4. Electronics'R'Us
    Windows

    A billion years ago...

    In internet time, we used lint (which could be somewhat frustrating, as there are some warnings it just doesn't shut up about). Apparently the GCC folks decided that a separate program was Wrong [tm] and that warnings would be built into the compiler, but they are not turned on by default.

    As frustrating as lint could be, it was an excellent tool to trap silly errors (and some not so silly).

    Applying -Wall to development builds makes sense; if a particular warning is not an issue then there are ways to work around it (suppressing a specific instance of a warning is quite simple). Using -Werror makes us consider the warning(s) more seriously.

    Source that may later be compiled by a newer compiler, with new and interesting warnings of no interest at the current time, should not default to -Werror once it has passed the 'no warnings' test at development time; after all, you want that build to succeed.

    Now I see the devs point of view here. To quote Henry Spencer, "De-linting a program which has never been linted before is often a cleaning of the stables such as thou wouldst not wish on thy worst enemies"

    Substitute clear all warnings for de-linting in the above and you have what is now being done.

    So -Werror is probably the right thing to do to prove valid construction (but not necessarily what the code will actually do...).

    1. DS999 Silver badge

      Re: A billion years ago...

      When I considered code "finished" I'd always make sure to include -Wall and force it to compile cleanly as the final step. That might mean a few useless casts or added parenthesis in a few places, but that would force me to be more careful when later making additions/changes.

      It doesn't work to do that when you first start a new program because you have too many warnings for unused variables you've declared but haven't yet written code to use, empty functions or functions that don't return a value, etc.

      1. itzman

        Re: A billion years ago...

        Yup. Sure, it's a faff when you get a null pointer returned from a function, apply a test to it, and the compiler says 'but it's not an integer'. BUT if you explicitly cast it to one, the next bloke along, compiling onto a 16-bit integer target with a 32-bit address bus, at least knows what you were TRYING to do...

    2. Anonymous Coward
      Anonymous Coward

      Re: A billion years ago...

      Anyone not using static code analysis today might as well just disable all warnings as far as I'm concerned.

      There are plenty of excellent, free tools available for this. Use as many as you can get your hands on. They all find more potential problems than the compiler, and they're all subtly different in how they work and what they find.

      Ditto for building with multiple compilers - even if you don't need to. No compiler is perfect, and the more chances you give for your code to break before it's shipped the better.

    3. Anonymous Coward
      Windows

      Re: A billion years ago...Or more

      In the days of big iron and COBOL, this wouldn't be a discussion. Any compiler warnings were corrected before moving the code to testing. Any log warnings were researched and corrected before moving the code into production.

      As the industry has agilely slid down the waterfall, I applaud Linus for reminding people what should be.

      No warnings does not equal healthy code, but it does equal healthier code; warnings do equal unhealthy code (whether your code or someone else's is responsible).

  5. oiseau
    Thumb Up

    No warnings = healthy code

    ... any maintainer who has code that causes warnings should expect that they will have to fix those warnings…

    Once again, albeit after years of pain, Torvalds tells it as it is.

    In my opinion as an end user, a very necessary statement.

    We have already read about Torvalds complaining about incorrectly commented code and I see this as a follow up to that.

    But as I have mentioned before, it is not only incorrectly commented code: how much unfixed code has been piling up for years, i.e. won't-fix code, because it produced a harmless warning, did not affect enough people, or because the kernel version was soon to be EOL'd?

    Of course, some may get snipped out by the next kernel version/s but most will probably not.

    And that's just dirt/grime being piled up and festering in some dark corner, much of it originated in the type of code that the strict implementation of -Werror will eventually eliminate.

    In time, kernel code without warnings will be a matter of course, i.e. clean, healthy code with no useless crud left behind.

    Today, these warnings (the pain LT refers to) form part of that dirt/grime, which may eventually rear its ugly head and end up breaking something further down the line, probably taking a lot of time and manpower to fix.

    So whatever the cost of implementing -Werror, it is better to bear it as early as possible.

    I see it as an unavoidable, essential part of healthy professional coding.

    ---

    The only way forward is lean and clean code.

    ---

    It is a concept that would seem to have been lost, and Linus Torvalds is only reminding us all that he has not lost sight of it.

    Kudos to him.

    O.

    1. Pascal Monett Silver badge

      Re: professional coding

      Indeed.

      A true professional programmer is not just a guy who knows how to code, it is a person who knows what to do in a given environment with respect to data security and operational procedures.

      And now GDPR.

      One day, professional programmers will be held to the same standards as engineers.

      And that will be a Good Thing (TM).

    2. Mike 125

      Re: No warnings = healthy code

      No warnings == healthy code

      at least in my language.

      1. Paul Crawford Silver badge

        Re: No warnings = healthy code

        Sorry you are wrong as (warnings == unhealthy code) is a lot truer than (no warnings == healthy code)

        1. captain veg Silver badge

          Re: No warnings = healthy code

          This is why I like JavaScript. No compilation means no errors, and no warnings. Healthy code!

          This is a joke, by the way.

          -A.

  6. thames

    I have a free software C project that supports three different compilers on four different chip architectures. I took the time to get rid of all warnings as I didn't want someone else to have to dig into whether a "known" warning is a new problem.

    The big issue comes in when you have to use compiler extensions, especially to access things like SIMD features. Since they are extensions, there is no common standard. Things like LLVM/Clang simply don't have the same extent of support that GCC has. MS VC does things differently. 64-bit ARM will do some things differently from 32-bit ARM and may need extra variables for CPU-specific features in order to do the same thing. x86 is such a dog's breakfast of optional features on different chip models that it's pretty much hopeless to try to cover them all.

    The end result is that you need lots of #ifdef and macro statements to sort out the differences between compilers and chip architectures. You don't find out about these things until you actually test them however. And you can't test them if you don't have the hardware and compiler tool chain.

    So when anybody can rock up with a different chip, or worse, a different compiler, it's pretty much hopeless to expect anything other than fairly vanilla C code to pass through without some warnings. And if you get third parties who have these oddballs sending you patches full of #ifdefs that you can't test, that's not really an answer either.

    I think a more realistic answer would be to set one compiler and a core set of chip architectures which must pass without warnings and leave dealing with the rest optional, depending on how realistic that is.

  7. Norman Nescio Silver badge

    Lack of warnings...

    ...means you can now concentrate on finding the really subtle problems.

    The fact that a compiler is 'happy' with your code by no means implies it is healthy code.

    Many, many years ago, I was working on some programs that did quantum mechanical calculations to simulate molecules. I was debugging a set of programs as a result of trying to run them on a different architecture of CPU to the original. One of the subroutines did multiplication of two matrices together.

    Now, matrix multiplication is not generally commutative. The ordering of the arguments mattered. So if you have matrix 'A', and multiply by matrix 'B', you will, under certain circumstances get a different answer if you multiply matrix 'B' by matrix 'A'. In other words AxB does not always equal BxA.

    So, if you have a subroutine that accepts two matrix arguments, multiplies them and gives you the result, it is rather important that you specify them consistently in the correct order.

    Of course, in the thousands of calls to this particular subroutine from all over the program, the order had been mixed up in one or two, resulting in code that compiled fine, but gave entirely borked results.

    Lack of compiler errors means the code is free to have deeper problems that ruin your day.

    NN

    1. m4r35n357 Bronze badge

      Re: Lack of warnings...

      Nobody is expecting a compiler to pick up problem domain failures like that. But you can spend more time understanding the problem domain when you are not constantly having to decide which warnings you think matter.

  8. Anonymous Coward
    Anonymous Coward

    The way I like to look at this is:

    A warning is an error that doesn't prevent the compiler from outputting something.

    How valid that something is depends on the warning, but at the end of the day any code that produces warnings says to me "this code was written by someone who doesn't really care".

  9. Gerhard Mack

    I don't blame him

    Years ago I made this exact commit to one of my projects, along with a set of annotations for all of the functions that used format strings. The other programmer threw a hissy fit and turned it all off again, complaining that he didn't have time for all of that. A couple of weeks later we tried it on some nice new 64-bit servers, and his code couldn't last 30 seconds without a crash.

    Years later, he quit/got fired (depending on which side you ask) and I got stuck maintaining his code. I ignored the large bug list while enabling a set of warnings and fixing them, in a loop, for a few weeks. When the code was retested, 95% of the bugs were gone. His logic was fine, and he could have saved himself years of effort had he fixed the warnings in the first place.

    1. PC Paul

      Re: I don't blame him

      I once had a year long contract supporting some similar old code. It had lots of weird "can't find them" issues.

      In between what they asked for (document and tidy up all the workarounds) I ran static analysis and -Werror and fixed those too.

      After the year was up there was no document to hand in for the workarounds as there were none left. Nor were there all the weird issues.

  10. John70

    The warnings are there for a reason. Fix and clean up your code.

  11. James 47

    Didn't some guy build the linux kernel with a C++ compiler and it uncovered a host of similar warnings?

    1. Primus Secundus Tertius

      Link level checks

      @James47

      Stroustrup, the creator of C++, wrote about this. His C++ compiler caused checks to be made that the parameter list for a library function in the main header file matched the list in the header file used for compiling the library. So the link stage of compilation was checked.

      Lots of errors found in existing C programs recompiled with C++.

      1. DS999 Silver badge

        Re: Link level checks

        I'm sure a lot of them were stuff like how an internet address in C is of type "in_addr_t" which is a typedef of an unsigned int. So you might have a header that defines a parameter as "in_addr_t" in one place but as "unsigned int" in another. That doesn't even get into the conversion hassle that's needed these days since almost everything is little endian, but in the original C world almost everything was big endian so a lot of software skipped endian conversion entirely and everything worked!

        Header files also were sloppy about exchanging between pointer types. Technically you shouldn't do that except via passing through 'void *' but even that may cause problems, because some platforms may impose alignment restrictions on pointers to certain types making it impossible to e.g. convert a char * to a double *. In fact there are no guarantees that a function pointer is the same length as a data pointer, or that pointers to strings aren't implemented as a union of a pointer and a size (though AFAIK actual systems doing this became obsolete long ago)

        C allows more than what most modern systems implement, or at least it did in the past. Maybe C11 (or whatever is current) has finally done away with that sort of thing, but I doubt it. Getting rid of such unnecessary flexibility may make sense, but there will always be some who advocate for that to remain so as not to potentially hinder future hardware implementations.

  12. Ace2 Silver badge

    To look at it another way

    I’ve run into some truly stupid gcc warnings where it’s unhappy that you’re copying the first X bytes of one string into another string. “Warning! Loss of information!” No sh*t Sherlock, I only wanted the first part.

    There was NO WAY to hush it up. I had to switch to memcpy().

    Requiring -Werror on kernel builds would go a long way towards making sure that compiler crap like that never sees the light of day.

    1. Gerhard Mack

      Re: To look at it another way

      I would be interested to see the code you used for your copy, because usually that warning indicates you are casting something to an incompatible data type and GCC thinks information is being lost in the conversion. Nothing to do with how much of the string you are copying. You only think the warning is stupid because you misunderstand the nature of the warning.

      1. Ace2 Silver badge

        Re: To look at it another way

        Gcc 8.3.1 and 9.2.1:

        ‘__builtin_strncpy’ output may be truncated copying 31 bytes from a string of length 63 [-Werror=stringop-truncation]

        Please enlighten me.

        1. Ace2 Silver badge

          Re: To look at it another way

          Downvoting you for insulting me and then not backing it up.

        2. Gerhard Mack

          Re: To look at it another way

          See now you've changed your error message. The previous message you added was almost exactly the one gcc gives when you cast between incompatible types.

          At any rate, with that new message, I do hope you were manually adding the null at the end of the string after you copied it.

          1. Ace2 Silver badge

            Re: To look at it another way

            Adding null at the end made no difference. Zeroing the whole destination string first made no difference. Copying 1 fewer byte made no difference. That feature / warning is just broken in those versions of gcc.

            My original point - which stands, thank you, despite your condescension - is that if the gcc people could always try to build a clean -Wall -Werror kernel before releasing a new version, they would be able to find problems like this.

            1. Gerhard Mack

              Re: To look at it another way

              GCC won't know that. Copying a partial string is a common way people mess up and end up with a string minus the end-of-string NUL byte, with string-parsing functions then processing unintended parts of memory. That is why the warning exists.

  13. Chris Gray 1

    Not so simple

    Some points and examples from an active compiler writer...

    Take a look at the system include files on Linux. Huge amounts of conditional compilation, special-use macros, etc. Even finding where a given struct is defined can be quite hard, and some of them will have multiple definitions, depending on CPU, SysV versus ATT, Linux versus BSD versus HPUX versus AIX, etc. etc. I bow to the folks who maintain that stuff - it's not something most programmers are capable of. Doing it all without inducing warnings in any circumstance has to be exceedingly difficult.

    Distinguishing between development build and production build makes sense to me. Most of my source files are not dependent on stdio, so I don't include stdio.h or stdlib.h - just my own headers, and maybe string.h, typically. But, when chasing some kinds of problems, I need to have debug output, and that is simplest using printf. So, I can get reams of warnings from the compiler about those uses. I'm used to them. When I've found the problem and am checking in sources, all of those printfs will have been deleted.

    I've put some warnings into my compiler that many don't have, e.g. dealing with uses of constant expressions having no effect. But, I want to be consistent in my usage. So, for some data sizes and the higher warning levels, I get warnings. So, I've added the ability to temporarily lower the warning level around a bit of code, then restore it after that code. Comments mandatory there, of course. I imagine similar things can happen with other compilers and languages.

    So, I don't have -Werror, but I do have -Wall. And note that, annoyingly, -Wall with gcc does not enable *all* warnings. I haven't looked at them for a while, but mine are currently:

    -Wall -Wpointer-arith -Wstrict-prototypes -Winline \

    -Wundef -fno-exceptions -MMD -funsigned-char -fno-strict-aliasing \

    -Wstrict-aliasing -fshort-enums -Wno-char-subscripts

    1. DS999 Silver badge
      Pint

      Re: Not so simple

      Even finding where a given struct is defined

      I have LONG wished for a simple way to do this. The compiler is able to do it, I ought to be able to have a simple utility I can call that uses the compiler's C preprocessor to answer queries like this. I suggest the command name be called 'wtf'. For "Where Type Found", of course :) So I could type in "wtf weird_union_of_structs_type cfile.c" and it will show me something like:

      struct weird_type declared in firstheader.h line 11

      struct other_type declared in secondheader.h line 79

      union weird_type and other_type declared as weird_union_of_structs in thirdheader.h line 243

      typedef weird_union_of_structs to weird_union_of_structs_type in yetanotherheader.h line 88

      header yetanotherheader.h included by sys/types.h line 761

      header sys/types.h included by cfile.c line 3

      That would reduce a lot of repetitive grepping and pulling of hair when trying to trace types in Linux. Since you're a compiler writer, you can either implement this yourself or suggest it as a feature enhancement for someone else on your team to implement. Pretty please? Beer icon as your team will deserve a bunch of beers if you make this happen!

      1. Andy Landy

        Re: Not so simple

        ctags is your friend

      2. -v(o.o)v-

        Re: Not so simple

        Doxygen does this, probably others too, but it's probably not exactly what you describe.

    2. Bill Gray

      Re: Not so simple

      "...And note that, annoyingly, -Wall with gcc does not enable *all* warnings."

      True. I've been using -Wall -Wextra -pedantic -Werror. I'll have to give your suggestions a try; I'm not sure whether they're covered by the 'extra' or 'pedantic' categories.

      I _usually_ don't find it too difficult to figure out why a warning is generated and to then take appropriate action to fix it. About 99% of such warnings are harmless nits. But the remaining 1% can really ruin your day.

      At one point, I wished the compiler would let me write

      #pragma suppress_warnings

      (a line of code that generates a warning I don't see how to suppress)

      #pragma unsuppress_warnings

      I think I would now tell my younger self : work on it; you'll figure out how to get rid of that warning.

      1. DS999 Silver badge

        Re: Not so simple

        I think I would now tell my younger self : work on it; you'll figure out how to get rid of that warning.

        I recall a few cases in the past where I had a warning that ended up being caused by inconsistencies in the system include files. Not much you can do about that!

    3. Gerhard Mack

      Re: Not so simple

      "So, I don't have -Werror, but I do have -Wall. And note that, annoyingly, -Wall with gcc does not enable *all* warnings. I haven't looked at them for a while, but mine are currently:"

      I think the idea is that changes to GCC's warning set shouldn't break existing builds. I have a script that gets the GCC version, enables all warnings for that version, and passes the result back to make.

      I really wish GCC would have a -Warnings=GCC_version or something so I can just tell it to enable everything.

  14. Zzznorch

    I Agree With Linus

    Almost 15 years ago I inherited a huge C project that generated about 17,000 warnings. The original team of developers never bothered to investigate them and was happy if the build completed without "errors". I was always getting tasked with finding out why a customer had hit some weird bug that was usually a showstopper. Eventually I ran a good analyzer on the code and found that a lot of those warnings should have been errors. Perhaps 30% of them were pointing out buffer overruns, invalid use of "=" instead of "==", and potential stack issues. Took me ages to clean those up. The reward was a customer commenting, several weeks after an upgrade, on how stable the code suddenly was.

    I wholeheartedly agree that warnings should be errors. Code should build clean. Especially C.

    1. JBowler

      Re: I Agree With Linus

      -Wall -Wextra -Werror

      Then I -Wno- the ones that are stupid.

      The setting, however, is compiler specific. Each new GCC version immediately requires a whole load of -Wno-'s, because the GCC folks figure that with a new compiler release they can try slipping back in all the stupid warnings they had to remove last time.

      So the setup has to be compiler specific; clang should not be a problem because there should be a klanger, Great Uncle Bugarea perhaps, who selects the compiler errors; -Wno-errors-before-soup.

      Believe me -Werror is minor compared to the shite I had to put up with on the last OSS product I actively contributed to.

  15. YetAnotherJoeBlow

    I always compile with -pedantic-errors before a commit.

    I was taught to alleviate all warnings.

  16. Cliffwilliams44 Silver badge

    But with Windows, so many of those warnings will just continue and nothing will ever break. There is so much useless noise in the Windows event logs that you could waste your whole life trying to track down the cause, only to have Microsoft tell you "this can be safely ignored!"

  17. adam 40 Silver badge
    Flame

    if (foo = bar)

    I'm man enough to know what I mean, to assign bar to foo and test the resulting lvalue.

    All this namby pamby woke C programming gets on my tits.

    K&R edition 1 all the way!

    1. PC Paul

      Re: if (foo = bar)

      Oh? Because I once wrote that same line when I meant to write ==, but a keystroke got debounced. A lot of these warnings are there to force you not to make the same easy mistakes over and over again. That's why they are warnings, not errors.

      But when it comes to the production release of high-value code like the Linux kernel, it seems reasonable to make you split it into two lines and let the compiler do the optimisations.

      Unless there was an implied /s missing, as well.
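For completeness: both fixes mentioned in this thread are idiomatic, and gcc/clang's -Wparentheses stays quiet if you double the parentheses to signal the assignment is intentional. A sketch, with an invented next_token() stream standing in for real input:

```c
#include <assert.h>

/* Toy token stream for illustration only: yields 3, 2, 1, then 0. */
static int next_token(void)
{
    static int n = 3;
    return n > 0 ? n-- : 0;
}

static int count_tokens(void)
{
    int tok, count = 0;
    /* Doubled parentheses: "yes, I really mean =".  The other common
       fix is tok = next_token(); then test tok on its own line. */
    while ((tok = next_token()))
        count++;
    return count;
}
```

With a single pair of parentheses, -Wall flags the condition as a possible mistyped ==; the extra pair documents the intent to both the compiler and the next reader.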
