C: Everyone's favourite programming language isn't a programming language

Aria Beingessner, a member of the teams that implemented both Rust and Swift, has an interesting take on some of those (and other) languages' problems – that C isn't a programming language anymore. Beingessner should know. They've previously worked on and written about both Rust and Swift. There are many problems with the C …

  1. cdegroot

    Nothing new...

    Even worse is the realization that hardware has bent to the whims of almighty C.

    (don't get me wrong - I love C, it was my first "proper" programming language, learnt it when it was indeed still a) simple to parse, b) a high level assembly language, but I think it's time to exit stage left and leave some space for the newcomers)

    1. Tom 7

      Re: Nothing new...

      the newcomers already have all the space they need. C is not stopping them - it's just that their language isn't really much better, as it doesn't really shrink the massive problem space, which is the hard part of learning to code! The problems Rust and Swift 'solve' are all solvable in C and C++ build environments and software methods.

      1. mfalcon

        Re: Nothing new, kinda pathetic really

        C survives and thrives because it solves many problems well or well enough. Developers of new languages crying about C and its problems doesn't win converts. It's kinda pathetic. If these new wonder languages are so good, why do they need to define themselves by trying to prove how much better than C they are?

        There is a lot to be said for solutions that work. In my work as a sysadmin I write lots of Bash scripts. Is Bash a great language? No. Is it the right tool for a lot of the work I'm doing? Yes, absolutely. The nice thing about shell script is that if I run into something too hard to do, I can always write a bit of C and call it. Not necessarily pretty, but it gets the job done. Language purity is for academic papers, not doing work in the real world. The real world is messy.

        1. Vometia has insomnia. Again. Silver badge

          Re: Nothing new, kinda pathetic really

          Thanks. You've just explained quite nicely why I still use Bash after so many years, in spite of my litany of complaints about it. It does the job, and as much as alternatives might look somehow tempting, they don't do the job as effectively (or at all); and/or they're just too much effort and cba. I think maybe I'm becoming a bit unreasonable with age if I'm scolding it for sometimes taking as much as ½ a second to complete a procedure that involves checking through thousands of filenames with arbitrarily complex string wrangling (a current example that had me fretting about "but why is it taking so long?!"), especially as I could rewrite it in C, as you mention. I mean, it isn't that long since I was using the college Vax, which could take up to 5 minutes to log you in at stressful times.

          1. Paper

            Re: Nothing new, kinda pathetic really

            And well, a lot of what we do as IT folk is copying and pasting existing samples to save time and effort. Wanna parse a text file in bash and/or C? There's a litany of examples out there that you can copy and it just works. Wanna do the same in Java? I'm sure it's doable, but now I gotta think…

            1. Vometia has insomnia. Again. Silver badge

              Re: Nothing new, kinda pathetic really

              I find I've rewritten the same bits from scratch much too often. I've attempted to create libraries of my own stuff but they usually become the victim of ambition; but I wonder if my guilty secret is maybe I just enjoy it, maybe it's just my equivalent of relaxing with a cryptic crossword (probably a bad example as I can't do cryptic crosswords!) though I often find I can improve upon some of my older ideas with a fresh approach.

              1. adam 40 Silver badge
                Coat

                A Grotesque Simulacra of C

                I will be getting that printed on a T-Shirt

                It will frame my grotesque simulacra quite nicely.

                <I'll get my T-Shirt>

                1. Ian Johnston Silver badge

                  Re: A Grotesque Simulacra of C

                  You might like to correct the shockingly poor grammar of the original quote. "Simulacra" is the plural: it's "a simulacrum".

                  1. stiine Silver badge
                    Devil

                    Re: A Grotesque Simulacra of C

                    Ah, yes, but C is subtly plural...therefore...

                  2. Norman Nescio Silver badge

                    Re: A Grotesque Simulacra of C

                    You might like to correct the shockingly poor grammar of the original quote. "Simulacra" is the plural: it's "a simulacrum".

                    That boat has irrevocably sailed, the horse has bolted, and the fat lady has sung. Try buying a panino, which is not, as someone once earnestly told me, a very small piano. I have seen signs advertising the sale of paninis, and people using criteria as a singular. Data is treated as singular. The typical speaker of English doesn't care about rules for pluralising foreign words in their own language, and simply slaps an English standard plural 's' on the end of a word to pluralise the concept: hence agendas, forums, memorandums, referendums, polyhedrons, dogmas, stigmas, cherubs, hinterlands, oblasts, ninjas, saunas, igloos, and, to end with another Italian word, pizzas.

                    If you can get the regular plurals in their own language of the above right without looking them up, I'll be impressed, so getting sniffy over simulacra isn't really worth it.

                    There are examples of words in common use that are plural forms in the original language but used as singular in English - opera and candelabra - so people talk of operas and candelabras, while opera is the plural of opus and candelabra the plural of candelabrum.

                    I used to get exercised over this sort of thing, but realised that making people learn all sorts of irregular plurals was just silly.

                    1. vcragain

                      Re: A Grotesque Simulacra of C

                      The point of spoken language is simply to communicate, and the English language changes constantly over time - we can barely recognize the way they spoke 500 years ago, and it is evolving as we speak (:>) - so being a snob about 'grammar' is really just silly. I find I hate a lot of common talk these days, but since I'm now 82, it's not my world going forward - I barely understand my grandchildren's chatter these days! BTW what the heck is 'woke'? Still love to think about my early coding days, starting with Assembler, then Cobol, Java, C - lots of fun, and now long since retired into obscurity. I would happily go back to those long nights solving elusive problems & ignoring dinner, however! Have fun & appreciate while you can, everybody - this life is very short!

              2. phuzz Silver badge

                Re: Nothing new, kinda pathetic really

                Rather than write libraries, I just have a directory of all my various scripts, and usually the filenames are descriptive enough that I can remember which one contains the bit of code I'm about to re-use. It seems to work

          2. Doctor Syntax Silver badge

            Re: Nothing new, kinda pathetic really

            "it isn't that long since I was using the college Vax"

            It is, but only when you stop to think about it. Welcome to the club.

            1. Warm Braw

              Re: Nothing new, kinda pathetic really

              I try not to think about it....

              In this context, worth noting that there was something called the VAX Procedure Calling Standard that made it (relatively) easier to call one programming language from another, but there are some fundamental differences that can't be wished away.

              C was never even a particularly good language for writing operating systems - Linux depended for years on specific quirks of gcc, which isn't ideal: you'd like to get the behaviour you want by design rather than by accident.

              But when you have to get down to details like addressability, word length, byte ordering and specific machine instructions high-level languages can only get you so far.

              1. Liam Proven (Written by Reg staff) Silver badge

                Re: Nothing new, kinda pathetic really

                > there was something called the VAX Procedure Calling Standard that made it (relatively) easier

                > to call one programming language from another

                Which is the core point of Aria B's argument and the subject of the article.

                That *nix has no such native procedure-calling language or protocol or anything. All it has is calling C functions using C symbols.

                Which means that a team of dozens of programmers working for years building a type-safe language, a program in which then calls a different type-safe language, can fall over because the communications between them need to be specified in their mutual FFIs, which must be specified in C structures and C functions, and thus flaws in the 1970s design of C can and do affect 21st century programs which contain no C code at all.
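
                To make that concrete, here's a quick sketch (all names invented, not from any real project) of what such a boundary ends up looking like. Whatever sits on either side, the shared contract has to be written down as C declarations:

                /* hypothetical_ffi.h - a made-up FFI boundary. Whatever
                 * languages sit on either side, the contract must be
                 * expressed as C types and C function signatures. */
                #include <stdint.h>

                typedef struct {
                    uint64_t id;
                    int32_t  status;
                } job_result;   /* the layout rules here are C's, nobody else's */

                /* Both the (say) Rust caller and the Swift callee must agree
                 * on this signature, including C's platform-dependent intmax_t. */
                job_result run_job(const char *name, intmax_t budget);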

                1. stiine Silver badge

                  Re: Nothing new, kinda pathetic really

                  You're telling me that Rust and Swift can't interface to a 50 year old API? That says more about these languages than anything else.

                  1. The Velveteen Hangnail

                    Re: Nothing new, kinda pathetic really

                    > You're telling me that Rust and Swift can't interface to a 50 year old API? That says more about these languages than anything else.

                    Not at all. I won't speak for Swift, since I haven't used it, but Rust for certain is fully capable of bidirectional interaction with C via FFI. That's why it's slowly getting incorporated into a lot of projects that were primarily C. The most notable example is the Linux kernel.

                  2. gnasher729 Silver badge

                    Re: Nothing new, kinda pathetic really

                    They can’t interface _nicely_. You need the ability to define structs that are laid out exactly the same as in C. You need to handle lifetimes exactly the same as in C. And so on.
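
                    To illustrate the struct point with a toy example: any language binding to this through an FFI has to reproduce C's exact sizes, alignment and padding, which you can inspect like so:

                    #include <stdio.h>
                    #include <stddef.h>

                    /* An illustrative struct: a foreign-language binding must
                     * reproduce exactly this layout, padding included. */
                    struct sample {
                        char   tag;    /* 1 byte, then (typically) 7 bytes of padding */
                        double value;  /* usually 8-byte aligned */
                        int    count;
                    };

                    int main(void) {
                        printf("size  = %zu\n", sizeof(struct sample));
                        printf("tag   = %zu\n", offsetof(struct sample, tag));
                        printf("value = %zu\n", offsetof(struct sample, value));
                        printf("count = %zu\n", offsetof(struct sample, count));
                        return 0;
                    }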

                2. Tim Parker

                  Re: Nothing new, kinda pathetic really

                  "Which means that a team of dozens programmers working for years building a type-safe language, a program in which then calls a different type-safe language, can fall over because the communications between them need to be specified in their mutual FFIs, which must be specified in C structures and C functions"

                  Except that that's utterly false. The languages talk to the underlying system primarily in C-based interfaces at the lower levels in those systems that require it, but the interface between each language is up to them - as long as they agree on a protocol they can do whatever they like within the confines of the hardware and ISA. Part of the point of the article, as far as I could see, was that this is indeed the case, but because of the ubiquity of an established and widely used interface, alternatives are tough to gain traction with, despite existing flaws.

                  ", and thus flaws in the 1970s design of C can and do affect 21st century programs which contain no C code at all."

                  What elements of the C system interface are a crippling impediment to high-level language type safety at the user level? What role does the base C system interface play in the interaction between the source language and, say, the AST of a language compiler?

                  1. Man inna barrel

                    Re: Nothing new, kinda pathetic really

                    The problem with a language like Rust talking via its C FFI is that it erases some of the advantages of Rust, such as memory safety and a rich type system. You have a safe core of a library implemented in Rust, with an unsafe interface in terms of C.

                    I take the point of the article that C is possibly not the best way of doing this kind of portable interface job, but it is what we have got. It is a de facto standard. Anything new would run into the severe problem that until most people use your new interface standard, it is useless.

                    I note in passing that ASCII contains a considerable amount of cruft related to antique methods of data communication, but if you want portable text, it is very likely best to use ASCII. Does it really matter that some of the code points are wasted on control codes that nobody uses any more?
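
                    (For the curious, a trivial C illustration of just how much of the 7-bit table is teletype-era cruft:)

                    #include <stdio.h>
                    #include <ctype.h>

                    int main(void) {
                        /* Count the ASCII code points reserved for control
                         * characters, most of them relics of old data comms. */
                        int n = 0;
                        for (int c = 0; c < 128; c++)
                            if (iscntrl(c))
                                n++;
                        printf("%d of 128 code points are control codes\n", n);
                        return 0;
                    }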

                3. Dizzy Dwarf

                  Re: Nothing new, kinda pathetic really

                  This is the original: <https://gankra.github.io/blah/c-isnt-a-language/> ?

                  It's like the guy doesn't know C at all.

                  1. anonymous boring coward Silver badge

                    Re: Nothing new, kinda pathetic really

                    "It's like the guy doesn't know C at all."

                    More like he doesn't know anything about anything at all.

                    Example:

                    "What Does Talking To C Involve?

                    Ok so apparently basically every language has to learn to talk C. A language that is definitely very well-defined and not a mass hallucination."

                    Huh? Reading anything more on that page is a waste of time.

                    1. jotheberlock

                      Re: Nothing new, kinda pathetic really

                      Any language on Linux (or macOS) has to learn to talk C, to talk to libc, unless it wants to do raw syscalls, which is unusual on Linux and flat-out unsupported on macOS. That's how Unix works.
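
                      A sketch of the two routes on Linux (the raw route leans on Linux's stable syscall numbers; macOS makes no such guarantee, so there libc is the only sanctioned door):

                      #define _GNU_SOURCE   /* for syscall() on glibc */
                      #include <unistd.h>
                      #include <sys/syscall.h>
                      #include <string.h>

                      int main(void) {
                          const char *a = "via libc\n";
                          const char *b = "via raw syscall\n";

                          /* The usual route: the C library wrapper. */
                          write(STDOUT_FILENO, a, strlen(a));

                          /* The unusual route: bypass libc entirely (Linux only). */
                          syscall(SYS_write, STDOUT_FILENO, b, strlen(b));
                          return 0;
                      }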

                4. JoeCool Bronze badge

                  Re: The subject of the article ?

                  There are 15 paragraphs in this story. FFI is discussed in paragraphs 9, 10 and 11.

                  It doesn't read as focussed on FFI.

                  But, on the issue of languages interfacing to the OS, isn't this problem already solved for other languages:

                  (1) by the GNU compilers, and

                  (2) the Java JNI?

                  Or is there more to it?

                5. jotheberlock

                  Re: Nothing new, kinda pathetic really

                  It does, though, if you're on any form of Unix. Calling conventions are defined by what was originally the Itanium ABI. Windows has its own thing, but either way there are two standard calling conventions for x86-64, Windows and Unix (including macOS, which does the same thing as Linux). That's as much as, well, any platform has, VAX included as far as I'm aware.

                6. martinusher Silver badge

                  Re: Nothing new, kinda pathetic really

                  >which must be specified in C structures and C functions, and thus flaws in the 1970s design of C can and do affect 21st century programs which contain no C code at all.

                  You can devise other models for parameter linkage, but then you've got to design the hardware to implement them on. This is where you discover that implementing an instruction set more complex than a RISC's will involve the use of microcode**. Which will have to be programmed by someone using some kind of language which -- surprise! -- is quite likely to look like 'C'. Computing is all about layers and where you choose the boundaries between, and so the interfaces between, layers. Ultimately you're going to have to work with basic logic functions -- gates and latches -- and the history of computing has been all about taming the huge quantity needed to build a decent computing system.

                  **Random logic isn't going to help you, it's the spaghetti code of hardware. Unpredictable, unpleasant and unworkable. Design languages like VHDL aren't going to help either.

              2. stiine Silver badge

                Re: Nothing new, kinda pathetic really

                "...Linux depended for years on specific quirks of gcc, which isn't ideal: you'd like to get the behaviour you want by design rather than by accident."

                I think you'll find, in most cases, that the behavior was defined by the limitations and/or vagaries of the hardware itself.

          3. Tim99 Silver badge

            Re: Nothing new, kinda pathetic really

            Thank you for the VAX anecdote. We had an 11/750 for general computing and admin which could get a *bit slow*. Although it seemed fast compared to the DEC PDP-8/11 stuff that I used before then.

            One day the main IT bod came over to ask (aggressively/pointedly) why I had just written a case for a DG Nova 4 for data acquisition. He insisted on looking at a raw output oscillograph scan which would be converted into about 500 7-digit numbers (raw centroided peaks). He said that it was easy to do on the 11/750, so I didn't need the Nova 4. He asked what turnaround would be acceptable? It was real-time. How many sets of data? One every 2 seconds for an hour, run 5-6 times a day. I got the Nova 4, but he did argue that a 25 MB 19" rack-mounted Winchester disk was more than we needed, so he sourced 5 MB Phoenix drives - I could fill them in 3 days. 2 months later I got the Winchester drive, which also became too small, but I could dump data off onto tape...

        2. Snake Silver badge
          IT Angle

          Re: Annnnd...you completely missed the point of the article

          "C survives and thrives because it solves many problems well or well enough. "

          Does it? Does it, really? Exactly how many CVEs do we need to show that C doesn't really solve "all problems well or well enough", because its inherent obscurity in handling some constructs (as the article mentioned) leaves holes that the developer is [constantly] responsible for solving.

          Every single time.

          C programmers are so entrenched in their belief of superiority that they dismiss the inherent problems in the design and (current) implementation of the language on modern computers, computers that are logarithmically more complex than the computers the language was originally designed to run on. System complexity that C was never designed to handle intrinsically, leaving the modern developer to constantly dot the i's and cross the t's in [trying to] assure that the code handles faults and attacks properly.

          Yet, instead of leveraging the power of these complex computers and letting the language constructs do this work for them, C programmers doggedly hang on to their beliefs of superiority.

          THAT'S the point of the article.

          1. bombastic bob Silver badge
            Linux

            Re: Annnnd...you completely missed the point of the article

            C programmers are so entrenched in their belief of superiority that they dismiss the inherent problems in the design and (current) implementation of the language on modern computers, computers that are logarithmically more complex than the computers the language was originally designed to run on

            Computers made of 'straw'? OK that was a different kind of 'logic'. [Bad PUNishment. yeah]

            I noticed you also used the word 'modern'. I am reminded of how THAT has been misused to describe "fad of the week", usually another FLATSO user interface or some "new, shiny" something that's trying to become relevant.

            Seriously I have not seen any CPU or system designs, with the exception of quantum computing, where a programming lingo like C would NOT be useful. (for quantum computing I'm still waiting for a programming model that actually works)

            (If it can run Linux, it can be programmed in C)

            1. Anonymous Coward
              Anonymous Coward

              Re: Annnnd...you completely missed the point of the article

              Parallelism, hardware threading, systolic arrays, NUMA, immutability of records, transactional memory, mailboxes, caching, segmented memory architectures...

              No representations of these things are first-class citizens in C, but many modern processors feature this sort of hardware and have to go to great lengths to abstract it so that it still sort of resembles a PDP-11.

              1. Paul Hovnanian Silver badge

                Re: Annnnd...you completely missed the point of the article

                "Parallelism, hardware threading, systolic arrays, NUMA, immutibility of records, transactional memory, mailboxes, caching, segmented memory architectures..."

                All supported by libraries which can easily be linked into a C executable. Nobody does much more than Hello World with the basic K&R defined C.

                How will all of the investment made in creating these libs carry over to Swift/Rust? Even with shims or wrappers, the underlying libs were still written in C. With all of the vulnerabilities that the new languages seek to eradicate.

                1. The Velveteen Hangnail

                  Re: Annnnd...you completely missed the point of the article

                  > How will all of the investment made in creating these libs carry over to Swift/Rust? Even with shims or wrappers, the underlying libs were still written in C. With all of the vulnerabilities that the new languages seek to eradicate.

                  Almost any C library can be pulled into Rust unless it's doing something particularly weird, so that's generally not an issue.

                  WRT the potential vulnerabilities of imported C libraries, yes that is also true. What's your point? That we just give up and not bother? That's absurd to the point of nonsensical.

                  The obvious approach is to steadily migrate to languages like Rust, where safe programming principles are baked into the language, rather than the haphazard mess that C is. And it's already paying dividends, e.g.: https://www.theregister.com/2022/12/02/android_google_rust/

              2. Anonymous Coward
                Anonymous Coward

                Re: Annnnd...you completely missed the point of the article

                Please name these "modern processors". And for bonus points mention the ones that have a significant market share.

                1. In total, your posts have been upvoted 1337 times

                  Re: Annnnd...you completely missed the point of the article

                  Oh I can think of a few... Anything x86 since about the Pentium III, ARMs from about the Cortex-A15 upwards, any Nvidia or AMD GPU architecture from at least the past decade, IBM POWER5 onwards... Basically anything recent not strictly classed as embedded.

              3. bombastic bob Silver badge
                Devil

                Re: Annnnd...you completely missed the point of the article

                Parallelism, hardware threading, systolic arrays, NUMA, immutability of records, transactional memory, mailboxes, caching, segmented memory architectures...

                Ah, I haven't written that kind of stuff in quite a while, though my past experience in the 90's comes in handy working with microcontrollers.

                I wrote a nice sort demo tool (first with MFC, then later adapted to wxWidgets, but for all practical purposes C code with some C++ thingies in there for the GUI) that has a multi-threaded quick sort that uses a bit of what you mentioned. Multi-threaded quick sort, DFT, and even a 'value of Pi' program - all somewhat trivial multi-threaded algorithms. Not hard: 'CreateThread' for Win32 or 'pthread_create' in POSIX and you can do it, too. And do not forget background IO processes, semaphores, queues, and all of the other stuff that goes with it.

                All in 'C'. Not a problem.

                (I even wrote a cooperative threading library for 16-bit Windows, to help solve performance issues, and it worked VERY well - I solved the segmented memory issue by use of USE32 code, but of course by then 8086/88 and 80286 processors were no longer in use)
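
                For anyone who hasn't tried it, the POSIX route really is that small. A minimal sketch (the worker is invented for illustration; a real sort would hand each thread a slice of the array):

                #include <pthread.h>
                #include <stdio.h>

                /* Each worker just announces itself. */
                static void *worker(void *arg) {
                    printf("thread %d running\n", *(int *)arg);
                    return NULL;
                }

                int main(void) {
                    pthread_t tid[4];
                    int ids[4];
                    for (int i = 0; i < 4; i++) {
                        ids[i] = i;
                        pthread_create(&tid[i], NULL, worker, &ids[i]);
                    }
                    for (int i = 0; i < 4; i++)
                        pthread_join(tid[i], NULL);  /* wait for all workers */
                    return 0;
                }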

              4. Justthefacts Silver badge

                Re: Annnnd...you completely missed the point of the article

                For most of these, the answer is just libraries. And bingo, you have yourself a “framework” in the modern lingo (or board-support library, as I’d call it). You don’t need “first-class citizens”.

                Note how one area on your list (cache-awareness, NUMA, memory segmentation) is both the most important and unaddressed by *any* language. That’s because “thought leaders” can’t do it without having genuine deep understanding, rather than just pushing the trend-du-jour of hierarchies of abstraction and memory/type safety.

          2. Roland6 Silver badge

            Re: Annnnd...you completely missed the point of the article

            > C programmers doggedly hang on to their beliefs of superiority.

            I don't, and I helped write one of the first C development environments for the PC back in the 1980s.

            From the article it would seem Classic K&R White Book C has suffered from poor standards development over the decades; the issue of intmax_t speaks volumes about the poor quality of input into the C99 specification, rather than about whether or not 'C' is a programming language.

            There is a lesson here for the supporters of other languages such as Rust and Swift: your language might be good and well specified today, but poorly thought-out enhancements to the language specification and standard over the decades can lead to massive maintenance problems.

            The 'superiority' of 'C' always was both a marketing claim to sell it against FORTRAN, COBOL etc. and something "hobby" programmers used to claim that by knowing 'C' they were somehow more professional than those that only knew VB and/or Pascal.

            Also, just like Windows and Linux: if you build a new platform, most people these days will tend to build for Windows and/or Linux rather than develop their own new OS; likewise, having built your platform, it makes sense to include tools that many are familiar with - hence the 'C' compiler. Note all this is determined by marketing, not C developers.

            As the article makes clear, some languages better support certain types of IT problems than others, and so the best advice is to use the right tools. That's why in the 80's, in addition to using 'C', I also learnt Occam and Ada. Hence my advice today: whilst you might prefer to write in, say, Rust, you should also learn Swift and other languages, including COBOL and FORTRAN, so that you are able to intelligently select tools...

            1. Tim99 Silver badge
              Coat

              Re: Annnnd...you completely missed the point of the article

              It seems like only a short while ago that academia was teaching Pascal and frowning on goto. I wonder how that turned out?

              1. Mike Pellatt

                Re: Annnnd...you completely missed the point of the article

                It turned out with Olivetti writing one of their OSes in the early 80s in Pascal, because some NCGs believed what they'd been taught.

                Needless to say, it didn't turn out well. Fortunately the OS from the previous generation of kit worked on it.

              2. Roland6 Silver badge

                Re: Annnnd...you completely missed the point of the article

                Goto was considered harmful well before the rise of Pascal - a language primarily for teaching computing fundamentals that avoided the mess of Basic and the complexity of Algol. In my second year the coursework for one module was to write a Pascal compiler in Algol-68.

                Whilst Pascal has largely been superseded, Goto is still considered harmful.

              3. ssokolow

                Re: Annnnd...you completely missed the point of the article

                I don't know about other academics who may or may not have missed the point, but Dijkstra's famous "considered harmful" is about unstructured GOTO... which C forbids. That is, allowing you to GOTO the middle of some arbitrary function, bypassing its beginning.

                Structured GOTO, like in C, enforces the principle that all functions have a single entry point and that was the main issue.

                Aside from raw assembly language, I can't think of anything seeing use in new developments in the last few decades which supports unstructured GOTO.

                1. bombastic bob Silver badge
                  Devil

                  Re: Annnnd...you completely missed the point of the article

                  A very good distinction. This is correct: you cannot jump into the middle of a 'scope' - it will throw an error if you try to do or access anything that is "out of scope". It's also why I tend to avoid using goto except for error handling.
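
                  The error-handling use is the classic cleanup pattern. A quick sketch (file and buffer names invented):

                  #include <stdio.h>
                  #include <stdlib.h>

                  int process(const char *path) {
                      int rc = -1;                /* assume failure */
                      char *buf = NULL;
                      FILE *f = fopen(path, "rb");
                      if (!f)
                          goto out;
                      buf = malloc(4096);
                      if (!buf)
                          goto close_file;
                      if (fread(buf, 1, 4096, f) == 0)
                          goto free_buf;
                      rc = 0;                     /* success */
                  free_buf:
                      free(buf);
                  close_file:
                      fclose(f);
                  out:
                      return rc;
                  }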

                  1. captain veg Silver badge

                    Re: Annnnd...you completely missed the point of the article

                    > you cannot jump into the middle of a 'scope'

                    You most certainly can. It's not called "goto" but "setjmp" and "longjmp".

                    It might not be a terribly good idea most of the time, but anything representing itself as portable assembler sometimes needs such shenanigans.

                    This is the point that the article misses. In C you can do pretty much anything that is possible in assembler. Though you might need an _asm section.
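
                    A minimal sketch of said shenanigans:

                    #include <setjmp.h>
                    #include <stdio.h>

                    static jmp_buf escape;

                    static void deep_inside(void) {
                        /* Jump straight back to the setjmp() site,
                         * abandoning every stack frame in between. */
                        longjmp(escape, 42);
                    }

                    int main(void) {
                        int code = setjmp(escape);
                        if (code == 0) {
                            deep_inside();
                            puts("never reached");
                        } else {
                            printf("landed back with code %d\n", code);
                        }
                        return 0;
                    }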

                    -A.

              4. captain veg Silver badge

                Re: Annnnd...you completely missed the point of the article

                > I wonder how that turned out?

                Quite a lot of Pascal was subsumed into (classic) Visual Basic, which formed the basis for .NET.

                -A.

          3. eldakka

            Re: Annnnd...you completely missed the point of the article

            Nice strawman, responding to something the OP you are replying to didn't say. You even quoted what they said, then responded to something they did not say. That is, your response was to your own statement: (emphasis mine)

            do we need to show that C doesn't really solve "all problems well or well enough"

            Which isn't what the OP said - or even implied - as you very well know, since you did initially quote what they actually did say, which was: (again emphasis mine)

            "C survives and thrives because it solves many problems well or well enough. "

            Many (as the OP used) != all (as you used in your response), and that substitution is what your entire argument is based around.

            No one, apart from you, is claiming that anyone has said C can solve all problems, or that C doesn't have any problems.

            Nothing is perfect. But because something isn't perfect doesn't mean it's useless, that it doesn't have a place.

          4. Liam Proven (Written by Reg staff) Silver badge

            Re: Annnnd...you completely missed the point of the article

            > THAT'S the point of the article.

            Correct. Or it's one of them, anyway.

            Whether this was Aria B's point is not for me to say.

          5. werdsmith Silver badge

            Re: Annnnd...you completely missed the point of the article

            "C survives and thrives because it solves many problems well or well enough. "

            Does it? Does it, really? Exactly how many CVEs do we need to show that C doesn't really solve "all problems well or well enough",

            Nice try. Did you really think readers wouldn't notice?

            Edit: I replied from up the thread before I saw the other response from Eldakka

          6. JoeCool Bronze badge

            Re: Annnnd...you completely missed the point of the article

            "C programmers are so entrenched in their belief of superiority"

            There is absolutely nothing in the article that raises a human/cultural issue of C blinding its users to good design.

            The closest it comes to that sentiment is paragraph 14 of 15 ( so, almost an aside, to me ) :

            "We can't beat Beingessner's description, though: "My problem is that C was elevated to a role of prestige and power, its reign so absolute and eternal that it has completely distorted the way we speak to each other.""

            And that statement is a joke; C was not bestowed a monarchy; it took dominance by obliterating every other competing alternative, by being a better designed, more effective language and tool for the tasks that mattered.

            You are ladling on your own divergent prejudices, and not even explaining them very well. It's as if you see your assertions as accepted truth, not needing elucidation. They are not, and they most certainly do.

            1. captain veg Silver badge

              Re: Annnnd...you completely missed the point of the article

              > [C] took dominance by obliterating every other competing alternative

              In the space I worked in, application languages either didn't directly expose the O/S API (e.g., from my experience, Pick or OS/400) or you had to load parameters into specific registers and execute a CALL or software interrupt. C-based APIs at least presented a procedural interface.

              -A.

        3. bombastic bob Silver badge
          Thumb Up

          Re: Nothing new, kinda pathetic really

          Language purity is for academic papers, not doing work in the real world

          see icon

          1. LybsterRoy Silver badge

            Re: Nothing new, kinda pathetic really

            I upvoted the OP, but this needs upvoting anywhere it's seen.

        4. Anonymous Coward
          Anonymous Coward

          Re: Nothing new, kinda pathetic really

          This. C has been around for 50 years, with 50 years of libraries to go with it.

          It always baffles me how these new languages always seem to push for some sort of coup d'etat where overnight the newcomer penetrates everything from embedded processors to desktop applications. Might be part of the instant-everything syndrome. If Rust/Language X is so much better, it will prevail. Be it for kernels, web assembly or both.

          I could even mention how useful it is to learn a language like C that forces you to learn to think more like a computer, rather than a compiler that goes out of its way to pretend you can just write plain English instead of code, but I am not sure whether that'd be more of an emotional rather than a rational argument.

          1. Roland6 Silver badge

            Re: Nothing new, kinda pathetic really

            >I could even mention how useful it is to learn a language like C that forces you to learn to think more like a computer, rather than a compiler that goes out of its way to pretend you can just write plain English instead of code

            Depends on what it is you are wanting to achieve. Having learnt 'C' or other similar procedural languages, non-procedural languages like Prolog can be a bit of a challenge.

            Personally, I enjoyed the purity of Algol-68 as it allowed you to forget about the machine and focus on algorithms and data structures. Prolog and LISP were likewise interesting, if only to challenge the brain. Now, writing programs for highly parallel or distributed environments, that was a challenge...

            1. Electronics'R'Us
              Holmes

              Parallel C

              Many years ago (the turn of the current century) I was writing diagnostics for a massively parallel video on demand system.

              Massively parallel then is not what we think of today, but being capable of over 2000 simultaneous streams of MPEG2 was quite an achievement.

              The system itself had up to 320 parallel processors arranged as 5 cores in each processor (4 + parity) and 5 disk drives for each. (The system could rebuild a swapped drive on the fly).

              Programming this beast was interesting as we had a bespoke compiler. The system itself was SIMD.

              The compiler introduced the conditional keyword 'where': if it evaluated true, the core in question would execute the instruction stream - otherwise it would execute NOPs.

              An interesting design decision for the compiler was that the default storage class for variables was static, which caused no end of hilarity.

              One of my colleagues was trying to debug a particular function which had the statement:

              int x = 0;

              He was trying to understand the very weird results and suddenly the light hit me - the variable needed to be initialised at each function call, so I made the trivial edit:

              int x;

              x = 0;

              All was well after that.
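
              (The same trap exists in standard C if you write static explicitly: a static initialiser runs once, before main(), not on every call. A toy illustration:)

              #include <stdio.h>

              static int counter(void) {
                  static int x = 0;  /* initialised once, not per call */
                  return ++x;        /* state persists across calls */
              }

              int main(void) {
                  for (int i = 0; i < 3; i++)
                      printf("%d\n", counter());  /* prints 1, 2, 3 */
                  return 0;
              }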

              A very interesting piece of kit to work with.

          2. Anonymous Coward
            Anonymous Coward

            Not like most of C replacements are new

            There really aren't many reasons not to at least use C++, and half of the people talking about C really mean C++ (probably a more recent version of it, too). C is 15 or so years older, but at 50+ who cares? One is a relic of the 1970s and the other of the 80s. You can practically see the coke jitters in the APIs.

            C# 20 years young, what a baby. And most of the syntax and skills of Rust are built from the same base of ideas, it just has newer and stronger tools to help people avoid shooting themselves in the foot.

            Yeah, you CAN build something using OG 50-year-old tools. But setting aside garage craft projects, who in their right mind is going to write a paycheck to someone who is using a hand file and an abacus instead of power tools and a calculator?

            The technology and the tools moved on for a bunch of good reasons. Long before Rust came along, I saw the cost to a company of a toxic co-dependency with a legacy ANSI C code base. They never wanted to update the code and refactor everything, so they dug a deeper and deeper hole of technical debt. Parts of the product screamed performance-wise, but the code was brittle and temperamental at the best of times. It was also buggy, insecure, hard to test and debug, and changing anything was quite challenging. We had to implement our own versions of things that were off-the-shelf parts of the newer C++ standard libraries. It was a tragically stupid project management decision that "saved" money on a 3-month horizon and wasted about 35% of the annual budget on band-aids.

            Then when the programmers started to quit, people had to try to decipher code that might have been less intelligible than old cuneiform tablets.

            When I work someplace, I appreciate landing somewhere that my first task isn't cleaning up someone else's mess, and I try to leave under similar terms. Part of the newer versions of the C family are tools to standardize how we do things, which if you stick to them makes working in a team easier. Every few years I suck it up and skill up, so that I can adapt to and adopt those new tools.

            It's interesting to me to see how those that are the most vocal for C at this point also tend to be the most anti-social. I disagree that it is solely the purview of dinosaurs and amateurs at this point, but at the same time, I wonder why many are willing to fight so much to hold onto it.

            1. Yes Me Silver badge

              Re: Not like most of C replacements are new

              "One is a relic of the 1970s and the other the 80s."

              But then, object-oriented programming is a relic of the 1960s (SIMULA-67). There's nothing new here, and while I quite like Rust, it could have been designed in the 1970s, a vintage decade for inventing systems programming languages.

              1. Anonymous Coward
                Anonymous Coward

                1970s

                Wirth, Bauer, Hoare and so on had quite a few solid ideas which were somehow buried under the Unix/C wave of the American Telephone. Just don't think American Telephone is always right.

                If you want to see another way how America is wrong, see this:

                https://www.youtube.com/watch?v=x3YueCf1JeI

                as opposed to

                https://www.youtube.com/watch?v=erWPD71BL1A

                The V22 killed 40 men during development, the Dornier 31 killed none. Since the early 1970s, the British-German Do31 has better performance figures than anybody else in the class of heavy lift VTOL a/c.

                Currently the Do31 hibernates in Munich until we will resurrect it one day.

              2. david 136

                Re: Not like most of C replacements are new

                Probably too big for the 70s on most machines. It's on the scale of PL/I, and that did not have traction outside the mainframe world.

                80s I'll give you.

            2. Baximelter

              Re: Not like most of C replacements are new

              And it's a good thing you are anonymous. Otherwise we old-timers might come for you with pitchforks and torches!

            3. jake Silver badge

              Re: Not like most of C replacements are new

              "You can practically see the coke jitters in the APIs."

              That wasn't coke, son. That there was coffee or (later) Jolt.

              Only the disco-dorks used coke ... ever see a geek/nerd at a disco? QED

          3. stiine Silver badge

            Re: Nothing new, kinda pathetic really

            The flip side of that coin is systemd where, even though it was incomplete, its releases continue to absorb more and more systems, while still remaining incomplete.

          4. JohnTill123

            Re: Nothing new, kinda pathetic really

            As per your last point, I am quite certain it is a rational argument. I used to do a lot of science-related number crunching, and understanding what the computer was actually doing is critical to getting a program to be efficient. Just knowing how an array unfolded in memory and programming accordingly could speed up a program manyfold.
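
            For example (a toy sketch): C stores 2-D arrays row-major, so walking the rightmost index in the inner loop touches memory sequentially, which the cache rewards; swapping the loops does the same arithmetic far more slowly on large arrays:

            #include <stddef.h>

            #define N 1024
            static double a[N][N];

            /* Cache-friendly: the inner loop walks consecutive addresses. */
            double sum_fast(void) {
                double s = 0.0;
                for (size_t i = 0; i < N; i++)
                    for (size_t j = 0; j < N; j++)
                        s += a[i][j];
                return s;
            }

            /* Same arithmetic, but strides N doubles per step: much slower. */
            double sum_slow(void) {
                double s = 0.0;
                for (size_t j = 0; j < N; j++)
                    for (size_t i = 0; i < N; i++)
                        s += a[i][j];
                return s;
            }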

        5. ITMA Silver badge

          Re: Nothing new, kinda pathetic really

          Who remembers a bit of software called The Last One?

          LOL

          1. Steve Graham

            Re: Nothing new, kinda pathetic really

            I've got a plastic folder right here that has a sticker I picked up at a computer show. It says "I've seen THE LAST ONE".

          2. Ian Johnston Silver badge

            Re: Nothing new, kinda pathetic really

            I remember it! Ran on 1980s micros (or maybe just one) and was supposed to replace all other programs.

            1. ITMA Silver badge
              Devil

              Re: Nothing new, kinda pathetic really

              "The Last One - The last piece of software you'll ever need!"

              Yeah.... Right....

              I remember Microsoft saying something similar about Windows 10...

        6. bazza Silver badge

          Re: Nothing new, kinda pathetic really

          @mfalcon,

          "If these new wonder languages are so good why do they need to define themselves by trying to prove how much better than C they are?

          I think that, in the case of Rust, they have actually got a point. It's a systems language, like C, but its compiler eliminates whole classes of bug. That is an advantage.

          I'm a decades-long C programmer, but even I'm thinking that Rust is the way to go. Even one of the biggest C projects out there - the Linux kernel - is moving to accommodate Rust...

          1. Yet Another Anonymous coward Silver badge

            Re: Nothing new, kinda pathetic really

            You also know that C is going to be around forever.

            Instead of: FAANG company announces amazing new language - drops it 3 years later, or FAANG company hires creators of sexy new language - assigns them to internal advertising project.

            1. bazza Silver badge

              Re: Nothing new, kinda pathetic really

              Well, they said that about Algol 60...

              Languages can die very quickly. C++ took a big knock when universities started teaching Java. Java took a big knock when they realised it was even easier to teach Javascript (with the advantage of more easily getting pretty pictures for one's labours). Java is also being heavily pummeled by C# on Linux (yes, I've seen Linux Java die-hards convert to C# on Linux with relief and enthusiasm).

              The only reason why C hasn't died is because there's not been an alternative (there simply aren't that many systems languages around). Rust is pretty much the first viable candidate that's rocked up in, well, decades.

              Good C skills are already a rare resource, and if those university courses that presently teach it start switching to Rust (because it's easier to teach well), then the supply of C programming resources dries up.

              1. Eclectic Man Silver badge

                Re: Nothing new, kinda pathetic really

                bazza: "Good C skills are already a rare resource,"

                Well, my 'C' skills are egregious, so don't offer me a job, but if that is the case there will be a premium on people who can understand, maintain and write C programs in a few years' time, just like the Y2K issues with Cobol.

                1. bazza Silver badge

                  Re: Nothing new, kinda pathetic really

                  Yep, that is what will eventually happen. Probably worth brushing up the old C...

                  1. ITMA Silver badge
                    FAIL

                    Re: Nothing new, kinda pathetic really

                    Bring back B(cpl)....

                    And who wrote this rubbish?

                    "C (1972) was the very first high-level language."

                    from:

                    https://www.learnacademy.org/blog/first-programming-language-use-microsoft-apple/

                    F-

                    1. swm

                      Re: Nothing new, kinda pathetic really

                      BCPL was an interesting language. Everything was a 16-bit word and context determined its "type". I remember writing on the XEROX ALTO something like [161223, 000777](1,2) which would treat the array as a function and call it with the arguments 1 and 2. The first number was a jump to microcode and the second number was a return statement.

                      Almost all code on the ALTO was written in BCPL.

              2. david 136

                Re: Nothing new, kinda pathetic really

                I liked D, but it lacked funding.

                Zig looks interesting.

                The problem is in getting enough support in infancy to become a viable adolescent.

          2. jake Silver badge

            Re: Nothing new, kinda pathetic really

            "Even one of the biggest C projects out there - the Linux kernel - is moving to accomodate Rust..."

            Is it? Are you sure of that? When talking about Rust, Linus uses words like "we might" and "maybe we will" and "perhaps" and "eventually" and "drivers, probably", etc. Nowhere does he say "Let's do it" or "We are going to" or "It will be soon".

            He is also on record as saying "I don't think Rust will take over the core kernel, but doing individual drivers (and maybe whole driver subsystems) in it doesn't sound entirely unlikely." ... but again, he's not entirely enthusiastic. He has also said "It might not be Rust", which to me is a death knell.

            I've been reading the LKML for as long as it's been around, and from my perspective it looks like Linus isn't really interested in any language that isn't C for kernel use ... not C++, just good old C ... and seeing as Rust is a replacement for C++, not C ... well, do the math.

            I think he's throwing the yowling, baying fanbois a bone just to shut them up. We might get a few drivers & the like written in Rust over the next few years, but the vast majority of the kernel will still be in C long after the next language du jour takes the place of Rust in the fanbois' fancy.

            1. bazza Silver badge

              Re: Nothing new, kinda pathetic really

              If you think Linus is the kind of guy who'd toss a bone to shut people up, I don't think you've been paying enough attention to things beyond the LKML...

              Also, if you think Rust is a replacement for C++ and not C, then you're not really keeping up with Rust either. Rust and C mix nicely. Rust and C++ - less so, you have to give your C++ methods a C calling interface...

              Rust in the Linux kernel proper may or may not happen under Linus's stewardship, but in a sense that's irrelevant. Amongst OS developers there is growing enthusiasm for Rust - Google are using it in parts of Fuchsia, MS are looking at adopting it in the NT kernel, and there are whole OSes implemented in Rust. There's a good chance that, at some point, OSes will start stratifying into the old ones written in C and the newer / updated ones that start using Rust.

              If that does indeed start happening, then ones that stay stuck with C start having a strategic problem. Linus is only too painfully aware of the lack of resources working on the Linux kernel, and I think that his (by Linus standards) tremendous enthusiasm for something as radical as Rust is a tacit admission that, at some point, Linux might have to move with the times. The work being done now is strategically sensible.

              1. stiine Silver badge

                Re: Nothing new, kinda pathetic really

                And Google will continue that for 3 years and then cancel the project?

                1. ssokolow

                  Re: Nothing new, kinda pathetic really

                  They're also using it in Android, which they're less likely to abandon. For example, last I checked, they're developing a replacement Bluetooth stack in Rust.

                  (Which makes sense. Android roughly follows that "~70% of CVEs are memory-safety oopses" statistic, but the bluetooth and media components are up at 90%.)

                  ...and good on them. I like Rust but I'd be just as happy if they used something else that achieves the same ends. For example, they cooked up Wuffs, a more domain-specific compile-to-C programming language for writing file format parsers which, by virtue of being domain-specific, manages to achieve even better safety than Rust.

                  1. stiine Silver badge

                    Re: Nothing new, kinda pathetic really

                    Hmmm. I thought Fuchsia was going to replace android from the inside.

            2. Binraider Silver badge

              Re: Nothing new, kinda pathetic really

              A ground-up system founded on Rust would have the potential to evolve. Adapting and expecting the old ecosystem to move won’t do away with the original design limitations created by C and the associated hardware abstraction. Just like your 30-year-old production database moves through multiple DB hosts but never gets away from its design deficiencies. Eventually, addressing those deficiencies becomes worth it - no matter how massive an undertaking.

              Linux was C because reasons, and so the future system has potential to be something else.

          3. eldakka

            Re: Nothing new, kinda pathetic really

            How many kernel developers are proficient in C?

            How many kernel developers are proficient in Rust?

            What's the existing library ecosystem in C like?

            What's the existing library ecosystem in Rust like?

            I'll bet you the answers to those questions will have C orders of magnitude better off than Rust.

            Whatever the relative technical merits of either language, or any language: for any language to displace C in existing projects of the scale of the Linux kernel, it'll have to be approaching C in those aspects before it can be a useful augmentation, let alone a viable replacement.

            That may very well happen, but not for a decade or more.

        7. tarvos
          Boffin

          Re: Nothing new, kinda pathetic really

          The C-sick remain oblivious to the fact that C is directly responsible for Spectre and Meltdown, as noted by David Chisnall: "The features that led to these vulnerabilities, along with several others, were added to let C programmers continue to believe they were programming in a low-level language, when this hasn't been the case for decades."

          https://queue.acm.org/detail.cfm?id=3212479

          1. bazza Silver badge

            Re: Nothing new, kinda pathetic really

            Interesting article, thanks for the link.

            The problem is that, fundamentally, not all programs are efficiently reducible to a large amount of parallelism; there will always have to be a compromise in hardware between being good at executing sequential code and executing parallel code. And while that compromise has to be made, there are always going to have to be nasty tricks to make sequential code run fast. So yes, making C run well has led to architectures that have Spectre and Meltdown faults, but I'm not convinced that there was ever much choice, even in an ideal world where we were all hell-bent on maximum possible parallelism.

            For example, if we went to the absolute extreme and split every if() statement into three parallel threads (one executing the condition evaluation, the two others executing either half of the branch, joining them all up at the end), I'm guessing that we'd run into problems implementing that in silicon at sufficient scale for it to be fast and efficient, even if following an Actor / CSP model such as Erlang's. Interestingly, today's sequential instruction pipelines are in effect trying to do this exact thing, but had to take shortcuts to make it fast, leading to Meltdown / Spectre. That ought to be some sort of warning.

            That article by David Chisnall is pretty good; he ends up describing a machine that, "would likely support large numbers of threads, have wide vector units, and have a much simpler memory model". He's basically just described the IBM Cell Processor. No cache, on-SPE-core static RAM, SPEs that were all vector unit, very fast IPC (not that code running on an SPE was called a "process"), the lot. And for those who really knew how to exploit it, tremendous performance could be extracted. It took Intel many, many years before their Xeons could match it for ultimate math performance.

            He's right in that some parallel models are super-easy to use.

            One that isn't is shared memory / resource locking with semaphores. That's not an easy thing to code properly, or implement in hardware.

            Actor model isn't a bad choice, but it has its own complexities. Sure, it's easy to get 200 threads all up and running and talking to each other, and have it work. What's very, very difficult is proving that you've not built in the potential for livelock or deadlock. With an Actor model system, it's perfectly possible to have an interconnection of threads where everything works until that one single day when something takes just a tiny bit longer, and the whole thing locks up. There is no amount of testing or analysis you can do with Actor model systems that will prove them to be free of such bugs, except for trivial architectures. The moment you have a loop in your architecture drawing...

            A better formulation of the same basic idea is Communicating Sequential Processes. It's basically the same idea as the Actor model, but it differs in one crucial regard: sending / receiving a message is an execution rendezvous. A thread won't return from a send until the receiving thread has finished receiving. This makes a world of difference.

            Firstly, if you have accidentally architected the thread interconnectedness so that it can livelock, deadlock, whatever, it will happen all the time, every time, in exactly the same way. This means you can spot it happening easily!

            The other difference is that the architecture is now algebraically analysable; indeed, there is a process calculus behind CSP, put together by Tony Hoare in the 1970s, for this very purpose. This makes it possible to mathematically prove that a system won't run into livelock or deadlock problems.

            Personally speaking, I'm a massive fan of CSP and I've used it on some very large parallel systems over the past decades. It's never let me down, and team members have enjoyed doing it. Interestingly, Erlang is effectively a CSP system. Rust and Go also support CSP.
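
            To make the rendezvous concrete, here's a minimal sketch in Rust - assuming std::sync::mpsc::sync_channel, where a bound of zero gives exactly the CSP-style hand-off described above:

            use std::sync::mpsc::sync_channel;
            use std::thread;

            fn main() {
                // A bound of 0 makes this a rendezvous channel: send() does not
                // return until the receiving thread has accepted the message.
                let (tx, rx) = sync_channel::<u32>(0);
                let worker = thread::spawn(move || {
                    for v in 0..3 {
                        tx.send(v).unwrap(); // blocks on the rendezvous
                    }
                });
                for _ in 0..3 {
                    println!("got {}", rx.recv().unwrap());
                }
                worker.join().unwrap();
            }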

            The biggest problem in making any real difference is, as usual, the legacy installed base. For a machine as described by Chisnall, it's starting again from scratch with the entire world's hardware, software and firmware ecosystem.

            We very nearly pulled this off in the late 1980s, early 1990s, with the Inmos Transputer. This was an all-out CSP CPU, with hardware threads, not unlike the machine Chisnall describes (minus the vector units). For a while it was looking like the only way to get faster computers was architectures like the Transputer and large collections of them. There was considerable girding of loins, learned articles in Byte magazine, a lot of facing up to the prospect that yes, we were all going to have to adopt CSP / parallel programming to make further progress.

            Two things went wrong. One was, though the CSP model is itself very sweet to use, the development tooling for Transputers was biblically bad, even by the standards of the day. It was catastrophically difficult to debug code in networks of Transputers. This wasn't even CSP's fault, simply that Inmos "forgot" about the need to debug, and didn't put in any sensible means of doing so in the hardware design.

            The other problem was that, whilst Inmos (a typical, underfunded British tech company that tried to do it all itself in house) was messing around trying to get us all to invest in networks of Transputers running at 30MHz (max), Intel then cracked the CPU clock rate problem and put out first 33MHz 486s, then 50MHz, 66MHz, 100MHz, and so on. Suddenly to make performance gains, one could just buy a new PC et voila! The need for parallelism disappeared more or less overnight, and the whole parallel programming movement folded up its tent and went back to sequential programming.

            We've been there more or less ever since; only recently has CSP started to re-emerge in Rust, Go, Erlang. What's crazy about these (and this is really Chisnall's point) is that on today's hardware we might have CSP-architected code running on top of an OS that is written fundamentally around sequential / SMP models of hardware, which is in turn running on hardware that synthesizes an SMP environment using cache coherency and inter-core links that (devoid of the SMPness) would actually be NUMA architectures very much like Transputers, or not so very different to IBM's Cell processor. There are now so many layers of abstraction between good coding models and actual execution hardware that it's probably impossible to strip them all out.

            Another problem with Chisnall's proposed machine is that it only really makes ultimate sense if the number of software threads in an entire system is less than the number of hardware threads (though to be fair I don't think Chisnall is suggesting that). That way each software thread can be allocated to a hardware thread and left to run there for as long as it persists, and the hardware thread can be a very simple implementation. This is basically what an SPE in a Cell processor is.

            The problem with there being more software threads than hardware threads is that, then, there has to be context switching, inter-process memory protection, etc and suddenly the hardware thread is now a much more complicated thing, if the context switch is going to be fast. On the Cell processor, you basically had to do the context switch yourself, in software, loading up a different bit of code into an SPE's RAM, then loading up the data for it to execute against, then unloading the results and code for the next piece. You could do some very neat things (e.g. move code instead of moving data), but it was all you, you, you in the source code. This made it tough, but oh so good.

            What Is The Best Thing To Do?

            That's a real poser.

            Those who adhere to the sunk cost fallacy (which is most people) would say "legacy base is too big, got to stick with what we've already spent a lot of time and effort on". And, given that switching to a better architecture fundamentally means breaking SMP (and all current software), there would have to be some almighty costly carrot / stick to move enough people's opinions. I don't know what that cost is.

            Basically, humanity is not well disposed to the idea of adopting short term pain for long term gain. Just look at the fuss over climate change... So for Chisnall's ideas to take off (which I would like to see), there'd have to be some really bad crunch point not unlike the one we were facing in the late 1980s, early 1990s, where progress had really stopped and there was nowhere else to go.

            1. eldakka

              Re: Nothing new, kinda pathetic really

              Very interesting post, thanks.

              My only criticism is that I don't agree with this statement:

              Those who adhere to the sunk cost fallacy (which is most people) would say "legacy base is too big, got to stick with what we've already spent a lot of time and effort on".

              Mainly because in this case this isn't a 'sunk cost fallacy' situation. Sunk cost fallacy refers to pushing ahead with what you have because you already spent so much on it, even though throwing it all away and starting again would be cheaper.

              This is not the case. It would not be cheaper to throw it all away and start again.

              Let's take Windows for an example. It'd take a decade and cost billions - tens of billions - to write a new, ground-up version for hardware with a completely new architectural paradigm that had the same functionality as Windows does today. It'd have to be completely re-written from scratch, all in one hit, since the entire paradigm is changing: there could be no, zero, code/library re-use. Not to mention, that assumes you already have in place a skilled workforce who understand this new paradigm and are proficient in it.

              And while you work on this new system, what happens with existing Windows on x86/ARM (current architectural paradigms)? Do you let it stagnate? Just drop it entirely? Or, since it's an active revenue stream as opposed to the new work that won't be a revenue stream for another decade, do you keep working on it, spending the same amount of money on it that you would have anyway if this new paradigm wasn't being worked on? So now you are spending the billions you would have anyway on maintaining/updating/improving Windows, while spending the billions (or tens of billions) on the new paradigm version. And what happens when the new paradigm version comes out? It works perfectly to the functional specification of what is by then a 10-year-old version of Windows, because while the project spent a decade making Windows for this new paradigm, the existing parallel 'legacy' Windows moved on during those 10 years.

              This will apply to any existing, substantial, software system. DBs (Oracle, DB2, Postgres, etc.), CRMs, SAP and friends, Linux (kernel), Office-type apps, Photoshop and friends, and so on. While it may not matter particularly for relatively small software packages (sed, grep, notepad), rewriting from scratch software that has had decades of development - of functional enhancements - will cost a mint.

              So it's not a 'sunk cost fallacy' problem, it's a "how much will it cost to replace these systems (including parallel development while the new systems are developed) versus just keeping upgrading/enhancing/modifying what we've already got?", and is it worth it?

              Hey, maybe it is worth it. But it's not a case of "we've already spent X", it's "we have system A, which costs us X/year to keep supporting/improving, to replace it will cost (X + Y) * years", where Y is likely to be some multiple of X. While the resulting product may be better, cheaper, more secure, what's its payback time?

              1. Eclectic Man Silver badge

                Re: Nothing new, kinda pathetic really

                We already know the answer - that is why there are so many 'legacy systems' in organisations that no-one dares to switch off, and which cannot be maintained, but which continue to run on Windows 3.11 boxes (sorry boxen). See for example the recent article on UK government being chastised for not having a plan to upgrade all of these legacy systems.

                https://www.theregister.com/2022/01/21/dwp_1bn_pension_shortfall/

                https://www.theregister.com/2022/01/21/excel_monitor_central_uk_govt/

        8. ssokolow

          Re: Nothing new, kinda pathetic really

          Funny thing about that.

          I use Python as my "shell scripting" language because Bash doesn't do the job.

          Portable? No. There's no consistent place to shebang for bash outside the Linux world and the POSIX /bin/sh subset is anemic. (There was a time when I was trying to support Linux, the BSDs, and MacOS X for a shell script. I have experience there.)

          By comparison, if Python is present (which it is by default on most Linux distros), then you can trust that you're getting CPython and just write for the oldest RHEL or Debian oldstable you've chosen to support... and Python packaging solutions have the option to rewrite the shebang for the current machine on install if necessary.

          Footguns? You bet. It's much easier to trust that a script will do what you want when arrays weren't a later addition, shlex.split is an explicit thing, and subprocess execution only does string parsing if you explicitly request it be run through a shell.

          Good for quick hacks? Not for more than a dozen lines or so or if you need to account for possible failure cases. `trap` is a major hassle compared to try/finally and, even now that bash has copied zsh's **, it's still not as nice as Python's os.walk.

          ...not to mention how much easier it is to work with various things like regular expressions when you can do things like receive and manipulate your capture groups as structured data without having to remember *another* DSL like awk.

          etc. etc. etc.

          ...and Python can dlopen libraries directly, call D-Bus APIs that have field types not supported by dbus-send or qdbus, and spin up a temporary event loop to block on a D-Bus event if something returns the response to a call asynchronously.

          Bash is looking pretty long in the tooth to my eyes.

        9. OldTimer360

          Re: Nothing new, kinda pathetic really

          I would agree with the nothing new sentiment; is it pathetic, well… For many years I wrote a lot of assembler code (for big iron) before I learnt C, after which it altered my approach completely. Instead of all the messing with everything in low level code I could do most of the work in C and drop into assembler for the real bare metal stuff. This made solutions easier to maintain/improve and even understand, whatever next!

      2. The Velveteen Hangnail

        Re: Nothing new...

        > The problems Rust and Swift 'solve' are all solvable in C and C++ build environments and software methods.

        I disagree. Not because you're wrong per se, but because you're ignoring certain realities of the situation. Yes, C and C++ have structures that alleviate a lot of common issues. The problem is that they are bolted on. You need to know that they are available, how to use them, and then actually use them. And in today's world where more and more people are "self-taught", there is a MASSIVE Dunning-Kruger problem.

        This is why I'm a very strong proponent of Rust as a full replacement and successor of C. As a language, it's an absolute PITA to learn specifically because it doesn't allow you to just do whatever the hell you want. It bakes proper coding practices directly into the language. You have to go out of your way to write crappy code. Does it stop all developer mistakes? Of course not. That's impossible. But it does do a VERY good job at stopping people from making the most common mistakes that we see today.

        This is the opposite of almost every other language out there where you need to go out of your way to write safe code. And heaven forbid you're using a language that is shockingly insecure by design, like Javascript or PHP. Yes, there have been improvements, but it's not possible to fix the fundamental design failures of these languages. The best you can do is wrap enough security layers around them that you are relatively protected.

      3. FIA Silver badge

        Re: Nothing new...

        The problems Rust and Swift 'solve' are all solvable in C and C++ build environments and software methods.

        Isn't that trying to fix the programmer though, which means you have to fix every programmer who ever touches your codebase, or ensure every environment is set up correctly and uses the same tooling.

        The point of Rust (as an example) is that certain classes of problems don't arise, so you don't need to solve them.

        Buffer overruns, and free after use aren't new issues, they were issues when I started programming in C 20+ years ago, but they haven't been solved, they still happen. strcpy is still a function.

        Just because you can fix these things in C, doesn't make Rust or Swift bad, they're all just tools after all. Why wouldn't you try and improve on them? A decent impact driver doesn't make a spanner useless.

        1. jake Silver badge

          Re: Nothing new...

          "certain classes of problems don't arise"

          One man's problems are another man's clever kernel hacks.

          Not all clever kernel hacks lead to show-stoppers ... but Rust removes some of that capability from the hands of experienced coders.

          Nobody with a brain ever said production kernel coding is a neophyte sport. Nor should it be.

    2. 45RPM Silver badge

      Re: Nothing new...

      I think I disagree with you, but from a religious rather than rational perspective.

      Religiously, I love C, I don’t like programming in other languages, but only because I’m comfortable with it, and I’m closer to the grave than the cradle - so I really don’t want to learn something new.

      Rationally, I know I need to learn something new (so I do) or get out of the game.

      So you’re right - though I wish you weren’t - so have a thumbs up.

      1. ebyrob

        Re: Nothing new...

        I don't know. How many Fortran programmers love Fortran? How many COBAL programmers love COBAL? The fact C is so very pervasive and yet its programmers still love it is very telling.

        I've tried to read some of how Rust works. However certain things never make sense to me in many of these new languages.

        io::stdin()
            .read_line(&mut guess)
            .expect("Failed to read line");

        Why would you chain together operations for input? Doesn't it make more sense to split complex problems into small independent pieces? If anything this looks a lot like an overloaded function with optional parameters in C++.

        So, OK. These new languages are there and OK and all that, but I don't think any of them are "better" than C. (Even C++ really isn't better.) In some ways they have advantages that C can / should never have. Garbage collection, memory safety, true exceptions. But none of that would ever have been possible in the 1970s, and such features have significant overhead.

        C really is a great language. There are some others too. Swift and Rust are probably still too new to really tell how good they are.

        As to OS calling conventions: if there's a better way to call functions across module and even program boundaries than the C API, we should probably try to use it. I'm not sure how "Result" is going to cross language and environment boundaries though (indeed, even exceptions don't do that so well).

        1. heyrick Silver badge

          Re: Nothing new...

          C's power is not that it is more capable, it's that it is fairly easily understood and that it is ubiquitous.

          Granted, there are a few things that might trip you up if you're doing a lot of cross compiling - we live in an age of 8 bit and 16 bit (simple microcontrollers), 32 bit (older kit), and 64 bit (newer kit) and when 128 bit rolls around, will we have long long longs?

          However more than anything else, the power of C can be demonstrated by Linux. Literally, if the processor is capable (and documented), then Linux exists for it.

          If Linux was written in some trendy new language, would it exist damn near everywhere? And would the API be as restrictive as the C based one, only "good for that language if not everything else"?

          The reason that you're stuck with a C interface is because that's what the OS is written in. Before that, the OS API was often something that worked best with assembler...

          1. Roland6 Silver badge

            Re: Nothing new...

            You raise an interesting point.

            >The reason that you're stuck with a C interface is because that's what the OS is written in.

            Whilst that is true, it is also not the whole truth.

            Prior to Unix, other OSes did provide compilers, APIs and libraries for other high-level languages such as COBOL and FORTRAN, i.e. the OS authors didn't expect application programmers to use the same language as them. This is probably in part due to where Unix came from, compared to where OS/390, VMS etc. came from.

            The current state of Linux and Unix owes much to people sticking with Unix as originally defined and not upgrading it into something that supported the usage of application programming languages different to the systems programming language it was written in. So in some respects it isn't 'C' that has aged or been out-grown, but the worldview behind Linux/Unix as they currently stand.

            >If Linux was written in some trendy new language, would it exist damn near everywhere?

            Well, remember the beauty of Unix was that you only needed to write a small amount of assembler to be able to port the C compiler onto your platform and then compile Unix for your platform. So I suggest if your trendy language and OS is as portable as C/Unix then the potential is there.

            1. jake Silver badge

              Re: Nothing new...

              "So in some respects it isn't 'C' that has aged or been out-grown, but the worldview behind Linux/Unix as they currently stand."

              It's not the language or the OS that is the problem, rather it is the expectations of the humans using the tools. Kids today expect, nay DEMAND that the computer does all the thinking for them. Any language that requires you to pay attention to what you are doing is "hard", and hard is bad because computers are supposed to make life easy.

              Sadly, easy is as easy does ... programming a kernel or a compiler isn't supposed to be hard, it actually IS hard! No amount of whining is going to ever change that.

              I place the blame squarely on Apple, with its "ease of use" myth.

          2. jake Silver badge

            Re: Nothing new...

            "we live in an age of 8 bit and 16 bit (simple microcontrollers), 32 bit (older kit), and 64 bit (newer kit)"

            Each of which (at least on my personal systems) have their own, individually kitted out environments geared to making my life easier.

            "and when 128 bit rolls around"

            It's here, and has been ... The IBM System/360 Model 85 could handle 128-bit floating point arithmetic back in 1968.

            "will we have long long longs?"

            At DEC, the VAX line called 'em octawords and HFLOATs, using four consecutive registers ("four longwords").

            From little acorns ...

        2. doublelayer Silver badge

          Re: Nothing new...

          "I don't know. How many Fortran programmers love Fortran? How many COBAL programmers love COBAL?"

          A lot of them. Whenever Cobol gets brought up here, many people remark on how much they liked it, how easy it was to read, how much they wish people still used it today. I don't know because I'm too young to have written it. People seem to like the things they've used enough because they already know how to use it. I'm guessing that, if people didn't like it, they did everything they could to switch to an alternative. This leaves us with many choices that some group of people really like, but not necessarily choices that have been designed well. There's a reason that Cobol isn't much used except in old systems today, and it's not because nobody liked it.

          1. yoganmahew

            Re: Nothing new...

            Every language pales next to 360 Assembler. Lick the darkside, you'll never go back to someone else's light.

            1. Danny 2

              Re: Nothing new...

              @yoganmahew

              It's funny because it's true! Every electronics engineer could code in assembly, and so C was the only place to go when hardware was outsourced to Asia.

              In the late 90s my twentyish peers were talking in hushed tones about an older deceased engineer who could read hex code. When I was their age everyone could, I still could, but I thought better of boasting about that as I was trying hard to fit in. "Yeah, how about that JavaScript and BritPop?"

              I've disliked young people since I was young.

              1. yoganmahew

                Re: Nothing new...

                @Danny 2

                LOL. Yeah, I always hung around with the older crowd and learned to read hex code (still do). I regret it a little now, they're all dying off...!

        3. jake Silver badge

          Re: Nothing new...

          I don't actually love any particular programming language ... but I do like COBOL and Fortran. And I generally reach for C (and sometimes a little assembler), because I know I can get the job done in it. Likewise BASH and perl.

          If I were to code for a more commercial line of software, or much larger projects, I might use other options. Maybe.

        4. captain veg Silver badge

          How many COBAL programmers love COBAL?

          I imagine all of them, since it must be an extremely small niche.

          No idea how many COBOL programmers love it. Since, so far as I can tell COBAL is not, in fact, a programming language, I would venture that it's none of them.

          -A.

          1. jake Silver badge

            Re: How many COBAL programmers love COBAL?

            captain veg writes:

            "so far as I can tell COBAL is not, in fact, a programming language,"

            PDNFTT

            1. captain veg Silver badge

              Re: How many COBAL programmers love COBAL?

              > PDNFTT

              COBOLLERS.

              -A.

        5. bazza Silver badge

          Re: Nothing new...

          @ebyrob,

          "So, OK. These new languages are there and OK and all that, but I don't think any of them are "better" than C. (Even C++ really isn't better.) In some ways they have advantages that C can / should never have. Garbage collection, memory safety, true exceptions. But none of that would ever have been possible in the 1970s, and such features have significant overhead."

          C does have very simple garbage collection, driven entirely by variable scope and implemented at compile time by the compiler. The rest is up to the developer to work out.

          Rust is no different, it's just that it has a much more sophisticated and extensive syntax for describing a variable's scope and ownership which also pays dividends when it comes to multithreaded applications. It's more work for the compiler, less work for the developer.

          The only things that stopped something like Rust existing in the 1970s were 1) imagination, 2) having a machine big enough to run a compiler able to fully process the much more complex ideas of scope. But there's nothing about the resultant object code that would fundamentally have prevented execution on hardware. You don't get some mammoth GC thread running alongside your code.
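
          A minimal sketch of that scope-driven cleanup (the variable names are mine, purely for illustration):

          fn main() {
              let s = String::from("heap data");
              {
                  let t = s; // ownership moves to t - no copy, no GC bookkeeping
                  println!("{}", t);
              } // t leaves scope: the compiler inserts the free right here
              // println!("{}", s); // won't compile - ownership of s was moved away
          }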

          C really is a great language.

          I'm a C fan, always have been, but it really isn't a great language any longer. It's a bit like an old classic car - great in its day, fun, temperamental and needing a lot of old knowledge and expert care to properly look after it, but not the one to regularly take on the commute, nor the weekly shop. The modern car will go a lot further, faster, with less discomfort, using less fuel.

          Consider all the memory pitfalls that can catch you out in C, and then consider the productivity of an educational establishment teaching a language that is devoid of those problems, and the output of a team using it in real life.

          1. Roland6 Silver badge

            Re: Nothing new...

            >Consider all the memory pitfalls that can catch you out in C

            Remember C limits the extent to which you can 'abuse' the instruction set; for real tightrope code you need assembler.

            I remember having to squeeze some functionality into 19 bytes - that was all that was spare in the EPROM. I had got it down to 23 bytes and then discovered a flag that was barely documented in the Intel ASM documentation and wasn't used by PL/M; using this, my code dropped to 17 bytes... Whilst I left clear comments in the source about the feature I was using, I didn't envy the person who would have the task of maintaining that piece of code.

            1. david 136

              Re: Nothing new...

              My god, what feature was that?

              Why not encode the instruction as byte constants referring to processor documentation?

              "This code can't be expressed properly in the assembler, but per XXX.YYY, this encodes a <whatever> instruction that is supported by the architecture.

              1. Roland6 Silver badge

                Re: Nothing new...

                I think you misunderstood; the code could be expressed properly in the assembler, but not in PL/M, which was used for much of the system. However, even in assembler, the warning was: don't take the function the opcode is performing at face value, read the manual and understand what it is doing with this flag. Obviously, the comments against the assembler reinforced what behaviour was being triggered by this seemingly mismatched sequence of opcodes.

                Interestingly, because this code was a key part of a safety-critical infrastructure (people would die if it failed), I used it to explore C. A. R. Hoare's ideas on formally proving the correctness of programmes and thus gained first-hand experience of the assumptions and limitations of the method. So in addition to the code there were several dozen pages of formal working that explained in detail the multiple intertwined logic threads in the code.

        6. ssokolow

          Re: Nothing new...

          It's not inherently an API built around chaining. In fact, you probably wouldn't use it that way in anything but simple examples.

          io::stdin() returns a handle for stdin which you'd normally want to store in a variable, similar to something like this line that's burned into the brain of anyone who went to school after Java started to see uptake:

          BufferedReader in = new BufferedReader(new InputStreamReader(System.in));

          .read_line(&mut guess) is a method on the handle so it can use the implicit "self" argument rather than requiring you to manually pass it in like fread's FILE *stream. However, unlike in C++, that's just syntactic sugar and it could also be called as std::io::Stdin::read_line(&stdin_handle, &mut guess).

          read_line returns a Result<usize, std::io::Error>, which is a tagged union containing either Rust's equivalent to size_t (the number of bytes read) or an error type which wraps the output of ferror() or equivalent in a platform-agnostic representation to have portability-by-default on the 99% of cases that don't need to unwrap the raw response contained within.

          .expect("foo") is a method on Result that's basically "ASSERT that the union contains the Ok variant and return its contents". Like ASSERT, it's meant for "this expectation should never be broken" situations.

          In the real-world, it'd be more likely that you'd use .read_line(&mut guess)? where the "?" operator either returns the Ok variant or performs an early return of the Err variant, like a more explicit form of letting an exception unwind the call stack.

          ...so this...

          io::stdin()
              .read_line(&mut guess)
              .expect("Failed to read line");

          ...isn't fundamentally different from calling fopen, then fread, then ferror. They just decided to require you to call a function to get your stdin handle rather than making it a global, and to use methods rather than free functions. (Partly because it makes it easier to group things in the API docs.)

          If anything, I find more people coming from GCed languages are surprised that Rust made the C-like decision to use an out parameter for the retrieved line, so the buffer can be reused rather than returning a new one each time.
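
          For completeness, the real-world "?" style mentioned above would look something like this (read_guess is a made-up example function):

          use std::io;

          fn read_guess() -> Result<String, io::Error> {
              let mut guess = String::new();
              // "?" unwraps the Ok variant or early-returns the Err to the caller
              io::stdin().read_line(&mut guess)?;
              Ok(guess)
          }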

    3. Arthur the cat Silver badge

      Re: Nothing new...

      I think it's time to exit stage left and leave some space for the newcomers

      Considering that we use a time system that originated in Babylonia 2.5 millennia ago it may be a while before that happens.

      1. jake Silver badge

        Re: Nothing new...

        Considering the base units of the human concept of time (the day and year), I rather suspect it originated quite a few millennia before that ... Certainly by the advent of farming, some 12,000 years ago, it would have been something that most people paid attention to.

        The manufacture of bread (~14.5Kya), beer (~14Kya), cheese (~10Kya) and mead/wine (~8Kya) would have started the naming of multiple day-units and sub-day units, at least among the set of people producing these items. Yes, you can ferment things for human consumption without a clock or calendar, but getting the timing right nearly always helps.

    4. Anonymous Coward
      Anonymous Coward

      Re: Nothing new...

      It's a great language for generating programs with a guaranteed need for perpetual maintenance and for locking up understanding of an esoteric problem space in a cloud of obfuscation. C programmers may not be able to earn as much as programmers of higher level languages, but they can sure make up for that in self-created demand. It's perfectly feasible today for a professional programmer to retire without ever having so much as glanced over the fence at anything newer or nicer.

    5. Dagg Silver badge

      Re: Nothing new...

      Even worse is the realization that hardware has bent to the whims of almighty C.

      Wrong, C was based on the machine code and hardware of a PDP-11.

      Could be worse, could be based on the machine code and hardware from an old (stackless) IBM mainframe like COBOL.

    6. Anonymous Coward
      Anonymous Coward

      Re: Nothing new...

      There was a time when it was necessary for an OS to be CPU specific and written in machine code/assembler.

      This was when hardware was expensive and getting it to do something saleable required real optimisation.

      Acorn for one spurned the idea of using anything other than assembler for OS dev both for the reasons in the article but also because their developers could actually do a better job with assembler than using any alternative.

      Since those days hardware has become cheap in terms of performance per unit currency with the result that the "need" for developers of the quality of old has disappeared, beyond where c type languages just don't cut it, for example 3d gaming engines.

      So yet again here we are talking about what can be done to improve performance and security of computing when for decades the industry has been saying that the quality developers of old are unnecessary. I still find this ironic.

      The answer remains that you have to go back to move forward. Accept the concept that yet another high level language isn't going to address the elephant in the room: the 8086 was crap from day one, and even with all the money spent to up its performance it is still crap. If you want the best software performance then it requires programming at the lowest level; anything else is catering to the lie that programming is something that anyone can do well, and is doomed to bite you where it hurts later on.

    7. Steve Channell
      Boffin

      It's not C it's linkage conventions

      Every "foreign function interface" from Rust to Haskel and everything in between has to use "extern C" calling conventions (other than z/os), and pass values via the stack, the called function has little more than an obligation to respect the immutability of parameters and exception handling protocols.

      Strictly speaking, most "extern C" calling conventions are stdcall (Pascal) rather than cdecl (C calling with a variable number of parameters). fastcall and thiscall are C++ extensions that pass some parameters by register.
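
      For illustration, this is roughly what crossing that boundary looks like from the Rust side - a minimal sketch assuming the platform's C runtime provides strlen, as it does almost everywhere:

      use std::os::raw::c_char;

      extern "C" {
          // Rust promises to use the platform's C calling convention here;
          // strlen itself comes from the C runtime that is already linked in.
          fn strlen(s: *const c_char) -> usize;
      }

      fn main() {
          let s = b"hello\0";
          let n = unsafe { strlen(s.as_ptr() as *const c_char) };
          println!("{}", n); // prints 5
      }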

      The exception is microkernel operating systems that convert stack frames and exceptions to messages. The problem here is that the microkernel OSes {Windows NT, Darwin, MkLinux, Redox-OS, etc.} (hybrid kernels with OS services in ring-0) haven't stopped exploits, despite the runtime cost - it's the lack of immutability in application code rather than the calling convention that causes most exploits.

      If we re-write the OS in Rust like Redox-OS and applications in Rust or managed languages, the problem goes away - so they shouldn't bitch about C when it's not the problem; PASCAL is.

  2. Paul Crawford Silver badge

    "There is no problem that can't be solved by another layer of abstraction - except for the problem of too many layers of abstraction."

    1. b0llchit Silver badge
      Facepalm

      Just like XKCD's standards. We should write one new language to rule them all... and then we just have one more. We make one more abstraction to abstract it all... etc.

      Maybe we should teach the next generation to use the appropriate tool for the particular job and adapt to a heterogeneous landscape that should be embraced instead of cursed.

      1. R Soul Silver badge

        "We should write one new language to rule them all"

        We tried that in the 1980s. The result was Ada and it didn't have a happy ending.

        1. Arthur the cat Silver badge

          The result was Ada and it didn't have a happy ending.

          Especially not for the first Ariane 5.

          1. Red Ted
            Mushroom

            I've given you an upvote for humour, but the real issue was one of regression testing and input sanitisation.

            Ariane 5 reused a system from Ariane 4, but a parameter that it measured was greater on the new bigger/faster/etc. rocket, so when it type-converted a value before sending it to another system it eventually overflowed and generated an error message instead. The receiving system tried to parse the error message as if it was the correct output and came up with a random number, which was then fed into the flight control parameters and it all went horribly wrong!
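
            (In a language with checked conversions the narrowing at least has to be explicit - a rough Rust sketch, with a made-up value standing in for the sensor reading:)

            use std::convert::TryInto;

            fn main() {
                let horizontal_bias: i64 = 40_000; // hypothetical sensor value
                // try_into makes the overflow an explicit Err rather than a
                // silent truncation or an unhandled hardware trap
                let narrowed: Result<i16, _> = horizontal_bias.try_into();
                match narrowed {
                    Ok(v) => println!("ok: {}", v),
                    Err(e) => println!("conversion failed: {}", e),
                }
            }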

            1. yoganmahew

              Typing is a mistake. You should always be suspicious of external data. Mostly you should be suspicious of internal data too. If you don't understand the variable on the previous line of code, you should be suspicious of that too.

              1. adam 40 Silver badge

                I am suspicious of your typing...

        2. Vometia has insomnia. Again. Silver badge

          The optimism was enough that I'm still almost sold on the idea 35 years after first using it and keep intending to go back to it any day now. But until that day, I still mostly use C, or Bash or whevs.

        3. Someone Else Silver badge

          "We should write one new language to rule them all"

          We tried that in the 1980s. The result was Ada and it didn't have a happy ending.

          Especially since Ada was supposed to be the one true language to rule (depending on who you ask) C, or PL/1 (which was itself the one true language to rule them all before the next one true language to rule them all...)

          Whew! I got dizzy there for a second...

        4. jake Silver badge

          "The result was Ada and it didn't have a happy ending."

          Sadly, it hasn't ended yet.

          1. Anonymous Coward
            Anonymous Coward

            But you will be paid very very well if you know how to program in Ada.

    2. Arthur the cat Silver badge

      There is no problem that can't be solved by another layer of abstraction

      I always liked the bit in Vernor Vinge's A Deepness in the Sky where the Qeng Ho use "software archaeologists" to handle the multiple layers of software right down to the bottom which is "one of Humankind's first computer operating systems" with an epoch that's "actually about fifteen million seconds later" than the first time a human set foot on the Moon.

      1. gormful

        I wish I could upvote this comment multiple times for calling out the best SF novel ever written.

        1. Liam Proven (Written by Reg staff) Silver badge

          It's got to be one of them. :-)

        2. Eclectic Man Silver badge

          SF Novel

          Still on sale. Based on your recommendations I have just ordered a copy (from Waterstones) because for some reason I feel in need of some escapist fantasy Sci-Fi ishness right now*.

          *I have a cough, a suspicious new skin 'lump' and have been booked in for a blood test, a chest X-ray and a doctor's appointment, plus the news of Brexit, Covid, energy prices, global warming and the wars / Special Military Operation** in Ukraine, Yemen and the situation in Afghanistan are getting me down.***

          **Seriously, what is the difference?

          *** In the UK go to dec.org.uk to make a donation.

          1. fajensen

            Re: SF Novel

            ... All the misery of modern existence - accompanied by the screeching chorus of the covidiots about how lockdowns is wut made us depressed!

            I hope your doctors appointment turns out to be a waste of their time!

            1. Eclectic Man Silver badge

              Re: SF Novel

              fajensen: "I hope your doctors appointment turns out to be a waste of their time!"

              Thanks.

              The doctor did say it* was not dangerous and might drop off all by itself, so I think that was a good thing.

              Not heard back for the chest X-ray or the blood test yet though, so fingers crossed.

              *(Some sort of skin tag, probably. What were you thinking of?)

      2. Anonymous Coward
        Anonymous Coward

        Genuinely thanks

        That's on my reading list now.

        You have staved off the creeping ennui of these, the lost years, for at least another couple of weeks. (Amazon has been pissing me off lately so I will have to wait a bit for shipping).

        Also thanks to the Reg community for reminding me that I hadn't read Iain Banks' "Use of Weapons", which made the first lockdown pass a little easier.

        1. jake Silver badge

          Re: Genuinely thanks

          It's in the Barrister's bookcase over my left shoulder ... A couple years ago, the Wife etched the top glass panel with the words "In case of retirement, Break Glass" in a rather lovely embellished copperplate.

          1. Neil Barnes Silver badge
            Coat

            Re: Genuinely thanks

            Over your left shoulder? In my bookshelf, it's down in the bottom right with all the other Vs :)

            1. Dave 126 Silver badge

              Re: Genuinely thanks

              > Over your left shoulder? In my bookshelf, it's down in the bottom right with all the other Vs :)

              Then we shall explore the possibility that he was referring to Banks Iain M and not to Vinge Vernor! :)

              Of course nor can we ignore the possibility that he has only the one bookshelf.

              He might also be buying his science fiction novels alphabetically, and that Asimov fella was nothing if not prolific.

              ( Hmm, has anyone written a script to take their Kindle library, trawl the web for the appropriate book cover jpgs and use said images to generate a virtual bookshelf backdrop for zoom meetings? Ah, no, probably not - the spines of books aren't usually displayed on Amazon (other book and ebook vendors are available))

              1. Roland6 Silver badge

                Re: Genuinely thanks

                Cultural assumption that everything has to be A-Z left-to-right and top-to-bottom.

                I suspect given the number of 'interesting' authors with surnames in the later part of the alphabet, organising your books Z-A in a left-to-right and top-to-bottom structure, means that people scanning your bookshelf would see all these authors before they see the obvious ones: Asimov, Banks, Clarke etc.

                1. Eclectic Man Silver badge

                  Re: Genuinely thanks

                  There is a possibility that he actually has some non-fiction books on the same book case, and therefore his books are organised in sections, rather than alphabetically.

                  I picked my display name as Eclectic Man because I have a range of books including 'The Female Eunuch', 'Where the Wild Things Are', 'In Search of Lost Time', 'The Authoritarian Dynamic' (excellent), 'Peeling the Onion' (Günter Grass, also excellent) and many, many photography books*, as well as history, mathematics, philosophy, 'popular' science etc. and, of course, the excellent book 'Snoop' by Sam Gosling** about what your taste in books and music says about you.

                  *Mostly landscapes and wildlife, some reportage, and a few arty ones.

                  ** 'Snoop' by Sam Gosling, ISBN 978-1846680182

      3. david 136

        You just sold a few.

      4. Eclectic Man Silver badge
        Happy

        Thanks

        for recommending the book. Just finished reading it, and it is excellent.

    3. Flocke Kroes Silver badge

      I prefer ...

      Buffy: There is no problem that cannot be solved by chocolate.

      Willow: I think I am going to barf.

      1. Dave 126 Silver badge

        Re: I prefer ...

        Childhood obesity?

        1. david 136

          Re: I prefer ...

          rationed as a reward for the salad.

        2. claimed Bronze badge

          Re: I prefer ...

          Dangled on a string from a drone doing laps round a track, boom

          1. stiine Silver badge
            Holmes

            Re: I prefer ...

            I used to chase my college girlfriend around the track while she was doing laps. And I was 10 years older and, at the time, a 3-pack a day smoker.

  3. Anonymous Coward
    Anonymous Coward

    Bad definition

    "for almost any program to do anything useful or interesting, it has to run on an operating system"

    No, it doesn't. In the old days, I wrote plenty of programs in C which, as the first action after starting, deliberately disconnected MS-DOS from all the machine's interrupt vectors so that my program and the BIOS could work without MS-DOS getting in the way.

    1. CommonBloke

      Re: Bad definition

      Hence the -almost-. And a program or game taking over the whole hardware/shutting down the OS was already rare some 20 years ago

      1. Jonathon Green
        Boffin

        Re: Bad definition

        I’d suggest that you might have this the wrong way round. The world does not end at the door of the data centre, the edge of the desktop, or the vanishingly thin bezel of your tablet or phone. There is a whole lot of code running in washing machines, heating appliances, and in anonymous looking boxes in utility cupboards which either runs on bare metal or on an RTOS so vanishingly thin that you’d barely know it was there.

        C is still very popular for embedded stuff, and it’s not just because it annoys the younglings…

        1. Red Ted
          Go

          Re: Bad definition

          I realised a few years ago that the work I do in C on embedded systems is much more like "engineering" rather than "computer science".

          In the same way that the tools in a workshop will be familiar to an engineer of thirty years ago, the tools I use have barely changed in the last couple of decades. Sure, the IDEs and debug interfaces have improved in that time, just as there have been improvements to lathes and milling machines in the same time.

        2. Joe W Silver badge

          Re: Bad definition

          I write Fortran (not FORTRAN) to annoy the younglings...

          1. jake Silver badge

            Re: Bad definition

            I don't use COBOL, Fortran, C, perl and bash to annoy anybody ... I use them because they do the job I ask them to do quite nicely. Annoying the younglings is just gravy.

      2. jake Silver badge

        Re: Bad definition

        Rare? 20 years ago? Shit, it's not rare today!

        I can think of any number of bits & bobs that require their own special code running on bare metal in order to operate, from the clock on the wall (synced to the Master Clock downstairs), to the equatorial mount under the telescope in the corner, to the garage door opener and its remote, to the greenhouses in the distance and the gate to the main road beyond that. Need I go on?

        1. adam 40 Silver badge

          Re: Bad definition

          Downvoted because you didn't go on long enough!

    2. iron Silver badge

      Re: Bad definition

      I've written useful, interesting and more importantly code I was paid to write that ran directly on processors like the 6502, PIC processors, etc. None of them had even the slightest whiff of an operating system.

      1. Neil Barnes Silver badge

        Re: Bad definition

        Indeed. I have written code on bare processors - both in assembly and in C - not because of the language but simply because the processors could survive the physical environment in which the code had to run.

    3. Someone Else Silver badge

      @grizewald -- Re: Bad definition

      Good point. Let me add the numerous programs written in C (and C++!) that were literally written for and on a bare embedded system board.

      1. adam 40 Silver badge

        Re: @grizewald -- Bad definition

        Not to mention writing the BSP for a completely new board so you can bring it up and then run the O/S.

    4. Anonymous Coward
      Anonymous Coward

      Re: Bad definition

      While I think the others in this reply chain point out the very good point that there is a lot of good work being done sans OS in the embedded world, I think that is a stronger case than what we did in "Ye Olde Tymes" (and yes, I remember those days, and the only thing I really miss was being able to warm boot between DOS/Windows and a much older Linux kernel without waiting for the system to POST every time).

      You aren't going to see people running raw like that much past the microchip level anymore, partly because, with all the I/O hardware, storage, multi-threaded multi-core CPUs, etc., writing or inlining all the code to leverage a modern motherboard makes bypassing the OS an exercise in futility.

      So while the other poster's definition is incomplete, it's on the mark for the scope they discussed.

      1. jake Silver badge

        Re: Bad definition

        "You aren't going to see people running raw like that much past the microchip level anymore."

        We didn't run raw like that much past the microchip level back then, either. Simple I/O, to be sure (how else are you going to control a greenhouse?), but other than that ...

        Availability of multi-core CPUs and gigabytes of memory doesn't mean one has to make use of them, you know. Sometimes less is better ... and much, much less is much, much better. Why on Earth would I want (for example) Microsoft Windows on my fridge or coffee pot (or in my car)? Far too much to go wrong with what should be a simple user interface.

        1. Libertarian Voice

          Re: Bad definition

          That is the reason why I elected to write an interface using ncurses some 7 years ago that remains an ongoing project. Yes it sits on an operating system, but it is far more stable and more compatible than anything else I have seen. I even have a web interface written in C/C++ (cgicc) in order to reuse the libraries that I wrote for the back office. The maintenance benefits over using whatever is deemed the latest and greatest are huge.

    5. jake Silver badge

      Re: Bad definition

      When you consider that MS-DOS isn't actually an Operating System (it is just a glorified program loader), your comment makes even more sense.

    6. This post has been deleted by its author

  4. Electronics'R'Us
    Holmes

    Some incorrect assumptions

    The new argument goes something like this: for almost any program to do anything useful or interesting, it has to run on an operating system.

    That really isn't so; there are hundreds of thousands (quite possibly more) of microcontrollers running bare metal and doing very interesting things. Even where an OS is desirable, many microcontroller architectures (including ARM based devices) map very neatly to C. I know that from several dozens of projects where they were a mixture of bare metal / various OS and the code was written in C simply because it was the best choice for the task.

    That also debunks the myth that C hasn't matched (bare metal) processor architectures since the 1970s; it demonstrably does so to this day. I am not saying it maps to all processor architectures because it doesn't at the bare metal level.

    I would be the first to admit that C has its issues (as does every language) and there are parts of it to steer clear of especially if portability for either the target architecture or compiler is important, but the use of a decent set of rules avoids the vast majority of problems.

    There are places where C really is the best choice and others where a different language may be a better fit, but to try and argue that it is not a programming language is rather over the top and smacks of 'language elites'.

    There is no single programming language that is appropriate to every task.

    C is widely used for a lot of reasons; every language choice is a trade-off for a given project.

    1. Tom 7

      Re: Some incorrect assumptions

      Not seen the Barr rules before. Over the years I built up a much larger set of rules which I found useful and could write bits of code to check for. I was playing with RHide on FreeDOS the other day getting all nostalgic, and then found Motor, which I might just have a play with to backfill with the old things I used to use on C/C++ code that made it pretty much bulletproof - if you can save a piece of code with a potential error in it (and automatic idiot's guides) then you're at least somewhere a bit safer.

      1. Paul Crawford Silver badge

        Re: Some incorrect assumptions

        MISRA is the obvious one, based on automotive safety concerns, and then there are also the rules drawn up for the Joint Strike Fighter systems:

        https://www.stroustrup.com/JSF-AV-rules.pdf

        Another good overview and guide comes from the Numerical Recipes books, they cover many things but mostly it is about being consistent and readable. If only programmers could start with that!

    2. bombastic bob Silver badge
      Happy

      Re: Some incorrect assumptions

      I looked at the link in your post under 'decent set of rules' and gave it a quick glance. Not only did it advocate the use of Allman style indents and braces, it also said "The tab character (ASCII 0x09) shall never appear within any source code file.".

      It has my approval!

      (that way the editor or merge tool or viewing tool you use does not affect the appearance of the code)

      1. Electronics'R'Us
        Thumb Up

        Re: Some incorrect assumptions

        I have used Allman style indents and braces for over 30 years.

        It is less prone to bugs (ymmv) and in many editors the matching brace (being at the same indent level) can be clearly highlighted.

        Just one of the things I like about the Barr group coding standard.

        1. Will Godfrey Silver badge
          Thumb Up

          Re: Some incorrect assumptions

          One of the first things I did with the code I'm involved with. Previously it had just about every kind of brace/indent style you could think of - including the idiocy of multi-statement lines producing walls of dense text :(

    3. Anonymous Coward
      Anonymous Coward

      Re: Some incorrect assumptions

      Though later in the article it also makes clear that what they were saying was that C is MORE than just a compilable language, not that it failed to meet the bar to be considered one. Its ubiquity and scope made it the de facto architecture of the guts of the operating systems we have relied on for the last 50 years, for better and for worse.

      That said, the Venn diagram of "parts of an OS best written in OG C" is now a pretty small part of the much bigger circle of "OS". Too much parallelism, too many different inputs and interrupts, and WAY too much mixed memory I/O. People have been shooting themselves in the foot trying to wrangle this stuff cowboy style for too long, and I am sick of having to get up in the middle of the night and install the patches for their mistakes. I'm not sure if I'd even write a driver in plain C anymore, if I was starting from scratch.

    4. fg_swe Bronze badge

      C Too Widely Used

      We have loads of application-level systems such as web servers, browsers, email servers which should better be implemented in a memory-safe language. It would sterilize about 70% of exploitable bugs.

      See http://sappeur.ddnss.de/Sappeur_Cyber_Security.pdf, last page.

      1. ssokolow

        Re: C Too Widely Used

        ...or What science can tell us about C and C++'s security by Alex Gaynor, which gives a list of cited examples of that ~70% figure across a bunch of big companies that have put a lot of time, energy, and money into trying to move it.

        (eg. the Chrome team's Safer Usage Of C++ and Borrowing Trouble: The Difficulties Of A C++ Borrow-Checker if you want specific examples from after they concluded that their efforts for better training, hiring, and static analysis weren't moving the needle.)

    5. danmcb

      Re: Some incorrect assumptions

      Absolutely. That line in itself expresses the somewhat sheltered world this author inhabits, and the correspondingly limited mindset they embrace.

    6. adam 40 Silver badge

      you can't do (for i=1; i<10; i++)

      so they can knob off.

      1. ssokolow

        Re: you can't do (for i=1; i<10; i++)

        "for i in 1..10" isn't good enough?

  5. VoiceOfTruth Silver badge

    Aria Beingessner

    -> 'Beingessner should know. They've'

    She's more than one person?

    1. Anonymous Coward Silver badge
      Paris Hilton

      Re: Aria Beingessner

      We are not amused

      1. VoiceOfTruth Silver badge

        Re: Aria Beingessner

        Surely that should be 'we is not amused', or 'I are not amused'.

        1. Paul Crawford Silver badge

          Re: Aria Beingessner

          The royal 'we'

        2. Anonymous Coward
          Anonymous Coward

          Re: Aria Beingessner

          I are baboon.

    2. LionelB Silver badge

      Re: Aria Beingessner

      Or, as we might say in South London: "VoiceOfTruth, is you more than one person?"

      1. Doctor Syntax Silver badge

        Re: Aria Beingessner

        "is you more than one person?"

        Fair question as we don't use 2nd person singular pronouns any more.

        1. LionelB Silver badge

          Re: Aria Beingessner

          Don't you? Apparently I do, though it's hard to tell.

          Anyway, I was thinking more about the S. London conjugation of the verb "to be" (e.g., a twat)

          Singular:

          1. I is a twat

          2. You is a twat

          3. He/She/They is a twat

          Plural:

          1. We is a twat

          2. You(se*) is a twat

          3. They is a twat

          usually appended with the (gender-neutral) "... bruv".

          *Possibly an Irish/Northern-influenced variation.

          1. Eclectic Man Silver badge
            Headmaster

            Re: Aria Beingessner

            According to the poet Dylan Thomas, the greatest poem in the English language is the present tense of the verb 'to be'.

            I am.

            Thou art.

            She is.

            He is.

            We are.

            You are.

            They are.

            Richard Burton recites it superbly: https://www.youtube.com/watch?v=fDNCEp8Utjo

            (Pedant icon because of "Thou art" rather than the more frequent "You are" for the second person singular.)

    3. Mike 16
      Joke

      Oh, _that_ rabbit hole

      One of my favorite entries in the controversy is

      http://itre.cis.upenn.edu/~myl/languagelog/archives/003572.html

      but this is a mere sip from the well. I'm sure your [1] google-fu will be able to find many more, and some are well written and funny.

      In fact, Language Log is a well of information and (usually) friendly debate on matters of language. A respite from editor wars and pleas for "A computer language that will let me do anything but make a mistake".

      [1] All y'all's (just in case you're a hive mind)

      Joke icon "just in case", but the whole subject does sometimes remind me a bit too much of the Babelfish wars

      1. Someone Else Silver badge

        Re: Oh, _that_ rabbit hole

        My Tejano friend told me while trying to teach me Tejano:

        1) "Y'all": second person singular

        2) "All, y'all": second person plural

        3) "All Y'alls": second person plural possessive. Can be written with or without an apostrophe before the final 's'

        Don't get me started on the Chicago "youse"...

      2. Doctor Syntax Silver badge

        Re: Oh, _that_ rabbit hole

        I think the explanation is a little more complex than just third person usage. Instinctively, as someone who has been using English for more decades than I care to think about*, the axis appears to be about more than singular vs plural. It's also informal vs formal, intimate vs impersonal and definite vs indefinite.

        Referring to oneself, the usual pronoun is "I" but formally it can be "we". Hence the "royal we" for proclamations, although it can be used in non-royal legal usage. It can even be used in less grand situations than that: habitually on cooking programmes a chef will explain what "we" are going to do, although that may be a case of not adapting to working solo instead of with a team. It's also not unknown for someone caring for a sick child to explain that "we" have not been feeling very well.

        As regards 2nd person the rules for thou/you were (still are if you want to use them) exactly the same as tu/vous in French. The Yorkshire rule as said by a senior to a junior is "I can thou thee but don't thee thou me". I'm not a linguist but I gather German is even more complex.

        As to third person, I can't better the example someone gave on an earlier thread: "See who's at the door and find out what they want." I agree this new usage can be a bit jarring but, on the other hand, as a male it's good to have my pronouns back: females had she, etc. to themselves, but we blokes had to share our gendered pronouns with the general case.

        English has cut down the complexity it seems to have inherited from its Indo-European roots but don't let's lose all the subtlety.

        * and was brought up in a time and place where the 2nd person singular was in use.

        1. jake Silver badge

          Re: Oh, _that_ rabbit hole

          "It's also not unknown for someone caring for a sick child to explain that "we" have not been feeling very well."

          To be fair, as any parent dealing with under-the-weather sprog can tell you, the "we" is usually a fully justified plural.

    4. fg_swe Bronze badge

      "they"

      I assume this is the Left Coast madness which claims that "genders are a social construct".

      1. Dave 126 Silver badge

        Re: "they"

        How can genders possibly be independent of society? If I were the last human on Earth I wouldn't care which rock I pissed behind, or what colour clothes I wore - as long as they made me less visible to bears.

        1. Anonymous Coward
          Anonymous Coward

          Penises, Breasts And Vaginas

          ...are a very real thing. Motherhood is also very real, and different from Fatherhood. Men and Women have complementary strengths and weaknesses, in many different ways, from the physical to the soul.

          Only the victims of LUBJANKA and BEIJING will believe otherwise and be for the destruction of the word man, woman, mother and father. Outside of Russia and China, of course. Inside, both Russians and Chinese have no use for such craziness.

      2. Dave 126 Silver badge

        Re: "they"

        Oh, and don't go thinking any study of science, be it medical, zoological, genetic or evolutionary will support your view.

        We're all the result of billions of generations of societies, creatures and genes, all playing games of mating, cooperation and competition at multiple levels in non-constant and adaptive environments. How the hell do you expect *anything* that results from that to be clean cut and simple?

        The madness is in ever thinking otherwise. That so many people try to is partly a social construct itself, though of course not aided by our brains' capabilities (brains which are themselves the result of the above glorious mess).

        1. Eclectic Man Silver badge

          Re: "they". - Gender

          The association of linguistic 'gender' with biological sex is not universal. The Zulu language has many non-sexual genders:

          "Zulu, a Bantu language spoken mostly in South Africa, also has an agreement class system, usually called a noun class system, but that system is much more elaborate than its European counterparts and is based on neither sex nor animacy. Fourteen of this language’s sixteen noun classes have prototypical members which are non-human, making the term ‘neuter’ seem irrelevant.

          ...

          The following examples show some of the types of agreement a noun can trigger.

          (2) a.

          I-n-tombazane en-hle yami i-ya-cul-a.

          9det-9ncprx-girl 9-beautiful 9my 9sm-dj-sing-fs

          ‘My beautiful girl sings.’

          b.

          U-m-fana omu-hle wami u-ya-cul-a.

          1det-1ncprx-boy 1-beautiful 1my 1sm-dj-sing-fs

          ‘My beautiful boy sings.’

          Zulu has ten singular noun classes and six plural ones. In example (2), intombazane ‘girl’ has noun class 9 and umfana ‘boy’ is noun class 1, and the adjective and possessive modifying the noun and the verb in the example have different forms depending on the noun class. Most Zulu noun classes are characterized by a noun class prefix (glossed separately only in (2)), such as the prenasalization in intombazane and the moraic m of umfana in (2)."

          from: https://www.ingentaconnect.com/content/jbp/avt/2012/00000029/00000001/art00004?crawler=true

          <Thinks: and I was told that Finnish was difficult to learn.>

          1. LionelB Silver badge

            Re: "they". - Gender

            And that's not even the end of it. Zulu (along with Xhosa, Sesotho and some other Nguni languages) has a formal "respectful" spoken form called hlonipha - mostly for married women when addressing relatives.

            PS. The Basque language Euskara - a true European language isolate, unlike Finnish and Hungarian, which at least have Uralic cousins - is, I'm assured, even more grammatically byzantine than Finnish. My partner is a fluent, although not native, Euskara speaker. Personally, I have got as far as hello, goodbye, beer, wine, cider, snacks and "cheers".

            1. Eclectic Man Silver badge
              Joke

              Re: "they". - Gender

              "I have got as far as hello, goodbye, beer, wine, cider, snacks and "cheers""

              Reminds me of a lecture given by Saharon Shelah* I attended in the 1980s. I genuinely understood everything up to and including "Hello". The next thing I understood, after a bit of a blur lasting 45 minutes, was "Does anyone have any questions?"

              *A brilliant mathematician (logician) https://en.wikipedia.org/wiki/Saharon_Shelah

              (Joke alert icon, but a true story.)

              1. LionelB Silver badge

                Re: "they". - Gender

                Similar experience at a Stephen Hawking lecture, around 1978. At the time he could just about vocalise, and lectured via one of his grad students "translating" for the audience. I'd just started Part III at Cambridge DPMMS, and he was next door at DAMTP, so I thought I'd drop in to see the (already) great man at work. As I recall, he lectured on the entropy of black holes. Despite a reasonable knowledge of differential geometry and Riemannian manifolds, he lost me at "Good afternoon".

    5. ssokolow

      Re: Aria Beingessner

      Singular they has been around since the 14th century. It was mid 18th-century and 19th-century grammarians who tried to stamp it out in favour of gender-neutral "he".

  6. Tony Pott

    Stated another way

    All the issues discussed flow from C running on top of a modern OS, or on top of the hardware abstraction layer of an OS which is written in C, and a processor geared to servicing the needs of that OS. There it emulates a low-level language, and is extremely useful doing it: so good, in fact, that all modern OSes and their key libraries are written in C, and very few problems have arisen (the fact that it is possible to list named examples of problems illustrates how few they are). For every instance of an OS-based system, there are a hundred embedded devices too small to require an OS, and the code there is written in C and works on the bare metal.
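
    For a flavour of what that looks like - a minimal sketch, with a hypothetical register address - the whole "runtime" can be as small as:

        #include <stdint.h>

        /* Bare-metal C: no OS, no driver stack, just a memory-mapped
           register poked directly (the address is made up for
           illustration; a real one comes from the chip's datasheet). */
        #define GPIO_OUT (*(volatile uint32_t *)0x40020014u)

        void led_on(void) {
            GPIO_OUT |= 1u << 5;   /* set bit 5: drive the LED pin high */
        }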

    The article (and the ACM piece it draws from) could more accurately be rewritten as 'C is vital to modern computing, and does its job very well, but not perfectly.' Of course, this, while evidently true, is so evidently true as to be boring and unworthy of reporting. <troll> Much more interesting to report on the mitherings of the proponents of newer languages who don't want to do the work required to get their new toys to work well with existing infrastructure. </troll>

    1. Anonymous Coward
      Anonymous Coward

      Gotta disagree that

      "very few problems have arisen"

      The imperfect or inexpert use of C was at the heart of decades of root exploits and late nights patching systems. It leaked people's personal and financial data (not that other languages haven't or didn't, but there are whole classes of bugs we catalogued that existed because coders refused to use a platform that had fixes and protections baked in). They then (being human) also failed to catch their own preventable mistakes before inflicting them on the world at large. It held back the use of advanced threading, and held back the work of a bunch of C++ coders to make their code safer, because the maintainers of parts of the underlying OS didn't want to keep up with the times.
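
      To make "whole classes of bugs" concrete, a minimal sketch of the most notorious one (the function is illustrative, not from any real codebase):

          #include <string.h>

          /* strcpy() does no bounds checking, so a name longer than
             15 characters silently runs off the end of buf and
             corrupts whatever sits next to it in memory - the classic
             stack-smashing root exploit. */
          void greet(const char *name) {
              char buf[16];
              strcpy(buf, name);   /* the footgun; prefer snprintf() */
          }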

      Yes, C seems to have been nearly problem free, and didn't spawn a serious attempt to improve on it about every 5 years or so.

      That snark aside, I understand the scepticism, given the past trauma induced by Java, VisualBasic, Ada and other horrors that at one point or another tried to justify their existence by claiming they were a better, newer, and safer replacement for C. I really just mean to point out that C has had more than its share of bumps, and to encourage people to embrace the newer members of the C family, which are a little less like handing a monkey a loaded gun.

      1. jake Silver badge

        Re: Gotta disagree that

        "The imperfect or inexpert use of C was at the heart of decades of root exploits and late nights patching systems."

        The imperfect or inexpert use of screwdrivers was the heart of decades of broken cars and late nights and weekend work for expert mechanics.

        Do you blame the screwdriver, or the teenager wielding it, for the broken car?

        1. yetanotheraoc Silver badge

          Re: Gotta disagree that

          I'm a screwdriver salesman, and I think you already know what I'm going to say.

          1. jake Silver badge
            Pint

            Re: Gotta disagree that

            Well, what are you waiting for? Set 'em up, I'm buying :-)

          2. Dave 126 Silver badge

            Re: Gotta disagree that

            > I'm a screwdriver salesman, and I think you already know what I'm going to say.

            Yeah I do, because you're all torque

          3. adam 40 Silver badge

            Re: Gotta disagree that

            "Would you like that with ice, sir?"

    2. fg_swe Bronze badge

      FALSE

      HP MPE

      Algol mainframes from ICL, Unisys/Burroughs, Moscow

      Singularity OS from MSR

      Redox OS

      None of their kernels are written in C.

      1. Dan 55 Silver badge

        Re: FALSE

        Only one of those might have a chance of being usable for doing work on sometime in the next decade, and that's because they copied UNIX's homework.

        1. fg_swe Bronze badge

          Re: FALSE

          HP MPE was used by thousands of businesses doing commercial work. It was a semi-mainframe OS connecting thousands of end users to a single multiprocessor machine. Customers were very happy with it, as it was rock-solid and efficient. It got killed because in the late 1990s the MBAs of HP were more interested in pushing Oracle, SAP and Microsoft.

          The MPE kernel was implemented in a kind of Pascal.

          https://en.wikipedia.org/wiki/HP_Multi-Programming_Executive

          Here you can see how much users loved it: http://3kranger.com/OpenMPE/omaboutus.shtm. They begged HP to give them the source code, so that they could continue using it.

          1. Dan 55 Silver badge

            Re: FALSE

            And now it is not available or supported. None of them are, apart from the last one, which is a pre-alpha Rust copy of UNIX.

      2. Anonymous Coward
        Anonymous Coward

        Re: FALSE

        ... and all of them deader than Elvis.

        1. jake Silver badge

          Re: FALSE

          Elvis is not dead, elvis is merely not currently in need of updating.

        2. fg_swe Bronze badge

          Also FALSE

          Both Fujitsu and Unisys still sell Algol mainframes.

  7. karlkarl Silver badge

    Arguably C is one of the very few programming languages.

    For example, most languages such as Python, Lua and CSharp.NET/Java are just C programs that interpret glorified text files and do different things depending on what's in them.

    Rust, Go and Swift are languages that finally started to appear once people got bored of the old Limbo-style VM languages like .NET and Java.

    However, much of a developer's life using them is spent writing wrappers around C libraries! Bindings suck, but you either write them yourself or you rack up technical debt dragging in thousands of packages from NPM, crates.io, PIP, CPAN, etc.

    1. Arthur the cat Silver badge

      A tweet from a couple of days back by Colin Percival on the advantages that working in C can have, even where most of us might not think of using it.

    2. Anonymous Coward
      Anonymous Coward

      Almost

      While I mostly agree with you, the compiler/interpreter thing is less a part of the language than you make it seem. There have been C interpreters (Turbo C and QuickC, if memory serves) and people have written compilers for most of the languages on that list, at least on some architectures (though most were an academic exercise). A programming language is not just a compiled language or a compiler for same.

      As for the VM-based languages, .NET is still chugging along and healthier than ever, though if I could pound a blessed stake through the heart of the JVM I would have done so years ago.

    3. fg_swe Bronze badge

      In this language, you can simply inline C++ code, including the calling of C functions:

      http://sappeur.ddnss.de/SAPPEUR.pdf

      (see inline_cpp[[ ]])

      Of course inline_cpp sections should be few, small and written by a senior engineer with serious C++ experience. They are not memory safe.

      1. Dan 55 Silver badge
        Meh

        RAII and the STL couldn't disagree with you more.

        1. fg_swe Bronze badge

          RAII in Sappeur

          Of course you can use RAII in Sappeur and you can also do generic programming. Either using the m4 macro processor (now preferred) or using Sappeur generic code.

          You can think of Sappeur as a memory-safe version of C++. A C++ without built-in landmines.

          Finally, inside inline_cpp[[]] you can use C++ STL code and your existing classes. No longer guaranteed memory safe, of course !

          RAII in C++ can still easily create memory bugs, especially if your code does multithreading.

  8. Mike 125

    "My problem is that C was elevated to a role of prestige and power, its reign so absolute and eternal that it has completely distorted the way we speak to each other."

    No. Your problem is with History. Just grow up and damn well deal with it.

    Really getting sick of these people whining about a revolution that happened 50 years ago, and blaming all their current problems on that.

    If you can't take it, hand over to those who can.

    1. fg_swe Bronze badge

      "revolution"

      More a Race To Cheapness.

      See Algol Mainframes.

  9. mobailey

    Not a Language?

    re: "it's very difficult to parse; there are multiple competing and subtly incompatible implementations; and then there are the complex ways that C defines and handles integers."

    You can say the same about Welsh, and that's still considered a language.

    -mobailey

    1. Doctor Syntax Silver badge

      Re: Not a Language?

      "You can say the same about Welsh"

      Even the bit about integers?

      1. Ken Hagan Gold badge

        Re: Not a Language?

        Yes. Look it up on wikipedia.

        1. Dan 55 Silver badge

          Re: Not a Language?

          No worse than French. And we all like French don't we?

          1. fg_swe Bronze badge

            Re: Not a Language?

            English speakers use French for every third word when they are not using Germanic or Viking words. If the speaker wants to appear well-educated, he will use even more French.

            1. mobailey

              Re: Not a Language?

              re: "If the speaker wants to appear well-educated, he will use even more French."

              Bon point.

              -mobailey

            2. adam 40 Silver badge
              Headmaster

              Re: Not a Language?

              Quod Est Demonstrandum

              1. Eclectic Man Silver badge
                Headmaster

                Re: Not a Language?

                Errr, should that be "Quod erat demonstrandum"?

                1. Dan 55 Silver badge
                  Happy

                  Re: Not a Language?

                  No, the middle word is French.

                  1. Eclectic Man Silver badge
                    Unhappy

                    Re: Not a Language?

                    Sacre Bleu!

                    Quelle idiote! I've been and gorn and made a fool of myself, again.

                    How humiliating.

                    Well, after learning 'C', maybe I should try learning a human language.

    2. Someone Else Silver badge

      Re: Not a Language?

      You can also say that about English...See my post above about the Tejano dialect...

    3. Dave 126 Silver badge

      Re: Not a Language?

      > You can say the same about Welsh,

      I could, but I couldn't say it *in* Welsh.

  10. Anonymous Coward
    Facepalm

    Oh, boy ...

    > [ ... ] C isn't a programming language anymore.

    After 50 years - and counting - of software written in C, that's a pretty bold statement.

    > There are many problems with the C language.

    Yes, indeed. C requires the programmer to think. As opposed to copy-pasta from StackOverflow. In 2022 that may be a bridge too far.

    > it's very difficult to parse.

    No, it's not. Actually, in terms of parsing and AST generation, it's one of the cleanest programming languages ever invented.

    Wanna talk about hard to parse languages? Try C++ or Java.

    In fact, there exists at least one freely available implementation of a C11 LALR(1) grammar, written for Lex/Yacc. Obviously it works with Flex and Bison as well:

    http://www.quut.com/c/ANSI-C-grammar-l-2011.html

    http://www.quut.com/c/ANSI-C-grammar-y-2011.html

    It's been around since 1985 or thereabouts, for various iterations of the C Standard.

    Yes, http. I don't know why it's not https, but such is life.

    And then there's this:

    https://hal.archives-ouvertes.fr/hal-01633123/document

    Seriously? Stuck figuring out the ambiguity between the * pointer operator and the * multiplication operator because C is not a context-free grammar?

    I confess: I stopped reading after that.

    Free hint: there are three possibilities: a binary operation, a unary operation or a declaration. Figure it out.

    If you haven't figured out that this won't be resolved during the initial AST generation pass, and you need Sema, stop trying to write parsers. Go back to copying and pasting Python, Go or Rust.
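
    To make the hint concrete, a minimal sketch (with illustrative names) of why the parser needs semantic feedback before it can classify "a * b;":

        typedef int a;

        void declares_a_pointer(void) {
            a * b;          /* declaration: b has type int*              */
            (void)b;        /* silence the unused-variable warning       */
        }

        void multiplies(int a, int b) {
            a * b;          /* expression statement: a times b, result
                               discarded (the parameter a shadows the
                               typedef, so * is multiplication here)     */
        }

    The token stream is identical in both functions; only a symbol table lookup on `a` can tell the two readings apart.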

    1. naive

      Re: Oh, boy ...

      Your comment couldn't be more spot on. With C, Brian Kernighan and Dennis Ritchie found the goldilocks zone between unforgiving hardware assembly programming and the inefficient ideas of dreamers.

      C is humble, easy to learn and powerful.

      Those who know the pdp-11 would agree that comparing its instruction set to the C programming language is actually a great compliment for C, since the pdp-11 is an example of a beautiful and clean system architecture, contrary to, for instance, Intel, which is a total mess of garbled and mostly failed ideas.

      C brought us to where we are today; it is relatively easy to build C compilers for various hardware, hence the unparalleled success of Unix on countless processor types and of applications running directly on microcontrollers.

      The "its from the 70's !!" argument the author of the article highlights so triumphantly, gets old as well, in the 70's great accomplishments and advances in technology were realized, which are the basis of the IT technology we use today.

      1. Roland6 Silver badge

        Re: Oh, boy ...

        > since pdp-11 is an example of a beautiful and clean system architecture, contrary to for instance Intel which is a total mess of garbled and mostly failed ideas

        The i86 segmentation model did my head in when developing the code generator for a C compiler...

        1. Gary Stewart

          Re: Oh, boy ...

          The DJGPP compiler was a godsend as a free DOS 386 and above protected mode C compiler. After it came out I never used a segmentation mode compiler again. It is still available in a C/C++ version. Thank you DJ Delorie!

          I also had occasion to use the Phar Lap protected mode DOS compiler while writing SCSI drivers for a telephone call center database server. Both of the SCSI interface boards we used had first-party DMA on the ISA bus, which allowed them to supply the address/data/control for reads and writes to memory. This of course required getting the right address to use. The original code I wrote used the segmented Microsoft C compiler, which had a particularly nasty and not well documented bug that would select the wrong segment pointer, resulting in the inevitable DOS crash. There was a workaround that, luckily for me, another programmer in the company knew about, so it did get fixed once he was told about it. This was not a problem with Phar Lap, although there were a couple of well documented hoops to jump through to make it work.

          1. jake Silver badge

            Re: Oh, boy ...

            "protected mode DOS compiler while writing SCSI drivers for a telephone call center data base server."

            I think I see your problem.

      2. fg_swe Bronze badge

        C = Hamburger Of Software Engineering

        First and foremost C is cheap. It was "given away" and it is cheap to learn.

        But the resulting exploitable bugs are extremely expensive to fix and mitigate.

        That is why reasonable people don't eat burgers every day. Such eating is only cheap in the short run.

    2. Hi Wreck

      Re: Oh, boy ...

      I chuckled quite a bit about the "difficult to parse" bit, especially since Aho literally wrote the (Dragon) book on parsing.

      1. Roland6 Silver badge

        Re: Oh, boy ...

        Interesting, a computing book that seems to have aged well: first edition 1977, second edition 2006.

        I suspect many joining the profession today would not give an IT book written in the late 1970's~early 1980's a second glance, thinking the information it contained to be outdated, but they would be wrong.

        Whilst I'm sure the new edition does contain much useful new material, I suspect for an undergraduate course in compiler techniques the first edition is probably still more than sufficient and much easier on the pocket.

        1. jake Silver badge

          Re: Oh, boy ...

          About 20 years ago, I tried to shake up my sysadmin/syssecurity class by bringing in an ancient DEC machine running TOPS-10 and a bunch of dumb terminals[0]. My texts were mostly Tanenbaum. Several of the class complained to the administration, and a ruling came down that my "teaching platform was archaic and irrelevant in the modern world". Keep in mind I was trying to teach CONCEPTS, not applications.

          I replaced the DEC with donated/salvaged PCs running BSD (servers & routers) and Slackware/KDE (desktops). Didn't cost the school a dime. The same group of students complained. Out went the free system, and in came a Windows based network. The mind absolutely boggles.

          [0] IMO, DEC kit is hands-down the best platform for teaching computers and networking ever invented.

          1. Roland6 Silver badge

            Re: Oh, boy ...

            > DEC kit is hands-down the best platform for teaching computers and networking ever invented.

            Wrt networking, I suggest it needs to be DECnet Phase IV or earlier; whilst I approved of the Phase V shift to OSI, Phase IV was probably simpler from a teaching perspective.

            > My texts were mostly Tanenbaum.

            Funnily enough, I think Tanenbaum's first edition of Computer Networks gives a better introduction to networking than the later editions, which became focused firstly on OSI and then on TCP/IP to the exclusion of practically everything else.

      2. adam 40 Silver badge

        Re: Oh, boy ...

        Yacc and Lex do that.

        Oh, and they are also written in C.

        So it pretty much parses itself.

    3. Fifth Horseman

      Re: Oh, boy ...

      The only real 'gotcha' with parsing C that I can think of arises from the pre/post increment/decrement operators. The expression:

      x = a+++b;

      could plausibly be read two ways: 1) x = (a++) + b, or 2) x = a + (++b). The lexer settles it by always consuming the longest possible token ('maximal munch', if memory serves), so it tokenises as a, ++, +, b and meaning 1 wins. Push it further:

      x = a+++++b;

      and maximal munch yields a, ++, ++, +, b, i.e. x = ((a++)++) + b, which is rejected because a++ isn't an lvalue - even though the reading x = (a++) + (++b) would have been perfectly valid. Admittedly a fringe case, and only a deliberately obtuse programmer would ever write anything like that.
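
      A quick check anyone can compile (illustrative snippet):

          int a = 1, b = 2, x;

          void demo(void) {
              x = a+++b;        /* lexes as a ++ + b: x = (a++) + b      */
           /* x = a+++++b; */   /* lexes as a ++ ++ + b and is rejected:
                                   "lvalue required as increment
                                   operand" from gcc and clang alike     */
          }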

    4. ssokolow

      Re: Oh, boy ...

      > No, it's not. Actually, in terms of parsing and AST generation, it's one of the cleanest programming languages ever invented.

      It's a context-sensitive grammar. That's why the lexer hack is needed.

      Context-free grammars are unarguably easier to parse.
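
      For the curious, a minimal sketch of that lexer hack (hypothetical names, not from any particular compiler): the lexer consults the parser's typedef table so it can emit TYPE_NAME instead of IDENTIFIER, restoring enough context for a conventional LALR parser.

          #include <string.h>

          enum token_kind { IDENTIFIER, TYPE_NAME };

          #define MAX_TYPEDEFS 1024
          static const char *typedef_table[MAX_TYPEDEFS];
          static int typedef_count;

          /* called by the parser when it reduces a typedef declaration */
          void register_typedef(const char *name) {
              if (typedef_count < MAX_TYPEDEFS)
                  typedef_table[typedef_count++] = name;
          }

          /* called by the lexer for every identifier it scans */
          enum token_kind classify_identifier(const char *name) {
              for (int i = 0; i < typedef_count; i++)
                  if (strcmp(typedef_table[i], name) == 0)
                      return TYPE_NAME;    /* the context-sensitive bit */
              return IDENTIFIER;
          }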

  11. kibblewhite

    So if C isn't a programming language, what would you say is a programming language?

    Am I right in stating, from your article, that any language that needs an OS is "not" a programming language, or have I misunderstood the point here?

    The reason I felt compelled to respond is that I don't believe what you are saying is correct; actually, I feel it is very wrong. Anyways, moving on with life, I'm going back to programming... ermm, actually maybe I shouldn't be calling it programming anymore, ermm, developing perhaps, so confused by this article now...

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: So if C isn't a programming language, what would you say is a programming language?

      [Article author here]

      Yes, you have misunderstood the point. Completely, in fact.

      I suggest re-reading it *and* following the links and reading the original author's explanation closely, too.

      1. Anonymous Coward
        Anonymous Coward

        Re: So if C isn't a programming language, what would you say is a programming language?

        Seriously, the point has been made several times in this thread: people missed the part where the claim was that it is NOT JUST a programming language, it is MORE THAN A PROGRAMMING LANGUAGE.

        Ease off that downvote trigger; it's aimed at your foot.

      2. Liam Proven (Written by Reg staff) Silver badge

        Re: So if C isn't a programming language, what would you say is a programming language?

        10 downvotes. A new personal best.

        All right. Let me try again.

        > Am I right in stating, from your article, that any language that needs an OS is "not"

        > a programming language, or have I misunderstood the point here?

        No, this is not the point. It is not a claim made by me or by Aria B, and it is not even hinted at by any line in my article or their blog post.

        Honestly I do not see where you got that from so I can't really respond to it.

        The primary points of Aria B's blog post are:

        [1] It is *impossible* to escape the problems of C by working in another, safer language, *even if that language is not implemented in C*, if you need to run on (and therefore talk to) an OS that is written in C and lacks any other form of API.

        If the only available API of your underlying OS is C-based, then a program that is written in something else and has absolutely no C code anywhere in the entire toolchain _will still suffer from C vulnerabilities._

        [2] And it's worse than even that. If your program in some safe new language P does not need to do much with the OS but it needs to talk to a separate program in a totally separate language Q which *also* has no C in it anywhere including not a line of C in its implementation, you will *still* suffer from C-related problems because the only interface between language P and language Q *is specified with C structures and C pointers and C calls*.

        From 1 and 2 the conclusion is:

        [3] C is not just a programming language. It is an IDL.

        https://en.wikipedia.org/wiki/Interface_description_language

        And it is even worse at being an IDL than it is at being a programming language, and it's bad at that.

        From 1, 2 & 3, the unstated conclusion, it seems to me, is:

        [4] For reliability, OSes need to have separately-specified APIs that are not C-based and not implemented in C, because if they don't, all programs for that OS will inherit the problems of C.

        > The reason I felt compelled to respond is that I don't believe what you are saying is correct

        You are responding to something I didn't say and neither did Aria B.

        > So if C isn't a programming language, what would you say is a programming language?

        I didn't say that and neither did Aria B.

        What they said and I described is *not* "C is not a programming language".

        We said: "C is not *JUST* a programming language."

        It is *ALSO* an IDL as well.

        And it is a very bad IDL.
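
        As a minimal sketch of what that means in practice (the header and all names in it are hypothetical): even if neither side of this interface contains a line of C, both must reproduce the struct layout, integer widths and calling convention exactly as the platform's C compiler defines them.

            /* plugin.h - hypothetical interface between a host written
               in language P and a plugin written in language Q */
            #include <stddef.h>
            #include <stdint.h>

            typedef struct {
                uint32_t    abi_version;  /* size/alignment: per the
                                             platform's C ABI          */
                const char *name;         /* NUL-terminated C string   */
            } plugin_info;

            /* both sides must agree on the C calling convention here */
            int32_t plugin_init(const plugin_info *info, size_t name_len);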

        1. Anonymous Coward
          Stop

          Re: So if C isn't a programming language, what would you say is a programming language?

          > It is *ALSO* an IDL as well.

          > And it is a very bad IDL.

          And this is where both you, and Aria, are fundamentally confused.

          You are confusing an Interface -- the 'I' in 'IDL' -- with an implementation.

          What the Python/Rust/Language-Of-The-Month crowd are saying is that the ABI and the C Calling Convention have permeated throughout the entire operating system, imposing a set of constraints, and that it is impossible for any software, of any kind, to get away from these constraints.

          First: neither an ABI nor a calling convention is an IDL. These are implementations, not interfaces.

          Second: an ABI, and consequently a Calling Convention, are determined primarily by the hardware, and secondarily by choices made by the ABI implementors.

          If we're talking about Linux, or the BSD's, or commercial UNIXes, we're talking about the SVR4 ABI - with minor differences between implementations.

          The C Calling Convention - which is the calling convention found in all UNIX-like OS's - is a consequence of the ABI's constraints. Once the ABI has been defined, there isn't much wiggle room in implementing a calling convention.

          Want to complain about the ABI? Complain to the chip manufacturers. That's where all the constraints about the stack execution model, function argument passing and process memory image come from.

          In terms of numerics: you may have had a point about C integers before C99. After C99, there is nothing to complain about. C99 and later provide a set of known, fixed-size integers with the intN_t and uintN_t families in <stdint.h>.
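
          For reference, a short sketch of those <stdint.h> families (the exact-width types are technically optional in the standard, but present on every mainstream platform):

              #include <stdint.h>

              int32_t       a;  /* exactly 32 bits, two's complement    */
              uint64_t      b;  /* exactly 64 bits, unsigned            */
              int_least8_t  c;  /* smallest type with at least 8 bits   */
              uint_fast16_t d;  /* "fastest" type with at least 16 bits */
              intptr_t      p;  /* wide enough to round-trip a pointer  */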

          If you want to complain about fractional values, the IEEE-754 Standard - which is what all UNIX-like flavors implement - is an independent specification from the C Standard. The C Standard mentions it in one of its annexes, but it is not normative to Standard C.

          And IEEE-754 has nothing to do with programming languages anyway. It's a hardware encoding specification, not a programming language specification. FORTRAN has IEEE-754 fractional values, and FORTRAN doesn't necessarily follow the C Calling Convention. In practice it does, because of interoperability between C and FORTRAN programs.

          IBM mainframes use a fundamentally different representation for fractional values. And you can still compile C programs and run them on IBM mainframes. Same for DEC VMS.

          Lastly: FFI - The Foreign Function Interface - does not belong in the kernel. It's a crude hack to begin with.

          If Rust has problems inserting itself in the Linux kernel because of FFI, my only thought is: Great!

          Either you implement a proper and compatible ABI and a proper and compatible Calling Convention in your flavor-of-the-month language, or you don't belong in the kernel.

          1. ssokolow

            Re: So if C isn't a programming language, what would you say is a programming language?

            > What the Python/Rust/Language-Of-The-Month crowd are saying is that the ABI and the C Calling Convention have permeated throughout the entire operating system, imposing a set of constraints, and that it is impossible for any software, of any kind, to get away from these constraints.

            No, what Aria is saying is that "the C ABI" is so implementation-defined that people resort to baking in a dependency on a C compiler when all they need is for two non-C languages to interoperate with each other without talking to any C libraries... in other words, for want of an IDL, C has been forced into being used as an IDL despite being ill-suited for it.

            > If we're talking about Linux, or the BSD's, or commercial UNIXes, we're talking about the SVR4 ABI - with minor differences between implementations.

            Aria specifically points out an example where GCC and LLVM on the same target triple (eg. GCC on Linux on x86_64) don't agree on the representation of a type specified by the relevant documents.

            > Yeah!

            >

            > clang and gcc can’t even agree on the ABI of __int128 on x64 linux. That type is a gcc extension but it’s also explicitly defined and specified by the AMD64 SysV ABI in one of those nice human-readable PDFs!

            >

            > I wrote this dang thing to check for mistakes in rustc, I didn’t expect to find inconsistencies between the two major C compilers on one of the most important and well-trodden ABIs!

            ---

            > Lastly: FFI - The Foreign Function Interface - does not belong in the kernel. It's a crude hack to begin with.

            I get the impression you have an artificially narrow perspective of what FFI actually is.

            In a language like Rust, it's really just a promise that the language's support for specifying calling conventions and memory representations will include options matching those used by C on the target platform.

            Inline assembly language is effectively a form of FFI... Hell, the initial bring-up on a pre-UEFI PC requires assembly language to talk to the BIOS to get things into a shape C can work with.

            > If Rust has problems inserting itself in the Linux kernel because of FFI, my only thought is: Great!

            Rust supports the C ABI just fine. The problem is that the *way* non-C languages are forced to support the C ABI is made needlessly onerous and makes fixing mistakes in C more painful than necessary because so many applications and libraries have hard-coded implementation details from their platform's C compiler/libraries for lack of a proper interface definition.

            One more quote from the post:

            > A lot of code has completely given up on keeping C in the loop and has started hardcoding the definitions of core types. After all, they’re clearly just part of the platform’s ABI! What are they going to do, change the size of intmax_t!? That’s obviously an ABI-breaking change!

            >

            > Oh yeah what was that thing phantomderp was working on again?

            1. Anonymous Coward
              Facepalm

              Re: So if C isn't a programming language, what would you say is a programming language?

              > No, what Aria is saying is that "the C ABI" [ ... ]

              I wish she/he hadn't said that.

              There is no such thing as a "C ABI". There are various ABIs implemented by various Operating Systems: SVR4 with all its minor variants, Windows, etc.

              There is, however, a C Calling Convention. The Calling Convention is subordinate to the ABI. Please learn the difference between the two.

              > clang and gcc can’t even agree on the ABI of __int128 on x64 linux

              Good. That's because __int128 is not a C Type. It does not exist in the C Standard. Hence the double underscore preceding the actual type name.

              __int128 is a compiler extension. Which means: every compiler is free and welcome to do whatever it wants to do with __int128. There is no compatibility guarantee.

              For someone such as Aria - who claims to be a major Rust and/or Swift implementor - this amount of confusion and ignorance is a showstopper. It does, however, explain quite a bit about Rust.

              If you were to make this statement about __int128 in a first-round technical interview for a compiler job, this would be the terminating statement. As in: the interview is over. Yes, you are expected to know the normative C Types for a compiler job.

              But this doesn't surprise me a bit. I once interviewed a guy who had no clue what xor did. His resume claimed that he had extensive experience writing compiler backends and codegens.

              1. ITMA Silver badge

                Re: So if C isn't a programming language, what would you say is a programming language?

                "I once interviewed a guy who had no clue what xor did. His Resume claimed that he had extensive experience writing compiler backends and codegen's."

                That reminds me of an MCSE back in the early part of this millennium who asked us "non MCSEs" -

                "Which one is the serial port?"

                Didn't know what the serial port on a laptop looked like (back when most laptops had a 9 pin D-Sub serial port).

              2. ssokolow

                Re: So if C isn't a programming language, what would you say is a programming language?

                > There is, however, a C Calling Convention. The Calling Convention is subordinate to the ABI. Please learn the difference between the two.

                That's fair... but it also comes across as seizing on terminology, when you could understand the point that was intended, as an excuse to avoid addressing that point... which makes you look like a bad-faith actor.

                (Had you addressed both the terminology and the point, that wouldn't be an issue.)

                > Good. That's because __int128 is not a C Type. It does not exist in the C Standard. Hence the double underscore preceding the actual type name.

                The sentence you omitted, immediately following what you quoted, is "That type is a gcc extension but it’s also explicitly defined and specified by the AMD64 SysV ABI in one of those nice human-readable PDFs!"

                That is, it began as a GCC extension but is now specified by the platform ABI document and, even if it weren't, LLVM has put a lot of work into matching GCC on Linux and MSVC on Windows for interoperability's sake.

                As such, it's not a statement about "What is the standard?" but "How effective are attempts to achieve interoperability, regardless of what's standard?"

                Seen in that light, this comes across as another case of "bad faith actor arguing terminology to avoid addressing the point".

                1. Anonymous Coward
                  Facepalm

                  Re: So if C isn't a programming language, what would you say is a programming language?

                  > AMD64 SysV ABI in one of those nice human-readable PDFs

                  That has nothing to do with the C Standard.

                  You have no clue what you are talking about. It's not my problem that you don't understand established terminology and you confuse terms.

                  Save yourself and stop typing.

                  1. ssokolow

                    Re: So if C isn't a programming language, what would you say is a programming language?

                    I never said "The C Standard" and Aria didn't either. You just assumed that. (Feel free to go back and look through my comments.)

                    The closest Aria said was "There are standardized calling conventions and ABIs for each platform!" as in things like the aforementioned AMD64 SysV ABI.

                    We've both been talking about "the de facto standard ABI all things on a given platform expect in order to interoperate, which people colloquially refer to as 'the C ABI'", and Aria's point was that it's not as standard/consistent within a platform as everyone assumes.

                2. Roland6 Silver badge

                  Re: So if C isn't a programming language, what would you say is a programming language?

                  >That is, it began as a GCC extension but is now a standard and, even if it weren't, LLVM has put a lot of work into matching GCC on Linux and MSVC on Windows for interoperability's sake.

                  All this is in Rust's and Swift's future, take note. It's relatively easy to create a (new) standard; it's harder to maintain it over the decades as the capabilities of system platforms improve and people use the language in ways not envisaged by the original language creators...

  12. Anonymous Coward
    Anonymous Coward

    Programme in C, think in C -- sad but true (for any language, actually)

    What was it Wirth said about the choice of programming language affecting the way you think about a problem? (I'll find it, I'll find it.)

    1. Dan 55 Silver badge

      Re: Programme in C, think in C -- sad but true (for any language, actually)

      And yet whither Pascal?

      By the way, the calling conventions that were lost to the mists of time were Pascal-based and those didn't allow for function overloading either. In fact they didn't even allow for variable arguments, unlike C-based calling conventions.

      LLP64 is because MS wanted to keep longs at 32 bits alongside 64-bit addresses. That was yet another MS "backwards compatibility above all else" choice, not a C choice, and it's unfair to blame C for that.

      So in other words C is getting the blame for being the successful language. If Rust ends up beating C a decade from now, then its calling convention and linker method will become the standard. Or perhaps in some future utopia C++ and Rust compiler and language gurus will sit down and work together to come up with a standard way to mangle names that works for all languages. However, complaining right at the start that everyone else is wrong, when Rust hasn't even proved itself yet, is a bit too much.
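
      For reference, a small sketch of the data models in question (the table reflects the common platform choices):

          #include <stdio.h>

          /* Sizes in bits:  int  long  long long  pointer
             ILP32            32    32         64       32  (32-bit Unix, Win32)
             LP64             32    64         64       64  (64-bit Unix-likes)
             LLP64            32    32         64       64  (64-bit Windows)   */
          int main(void) {
              printf("int=%zu long=%zu long long=%zu ptr=%zu (bytes)\n",
                     sizeof(int), sizeof(long), sizeof(long long),
                     sizeof(void *));
              return 0;
          }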

    2. dafe

      Re: Programme in C, think in C -- sad but true (for any language, actually)

      That wasn't Wirth, that was Stroustrup.

  13. Anonymous Coward
    Anonymous Coward

    Other languages....

    The problem with other languages is that they aren't a lot better.

    Either there are performance issues (Java etc), verbosity (Pascal etc), or you end up looking at rust or golang, which pretty much look the same.

    Actually the one I can't really knock is object pascal, but it's really just C with lots of typing and probably quiche.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Other languages....

      You're missing the point. The core argument here *only* applies to C, nothing else.

      I suspect you haven't read the article closely enough, or the original post either.

      It makes a valid and important point, which is why I wrote this article. I suggest reading it again, closely. It's worth your time.

      1. Anonymous Coward
        Anonymous Coward

        Re: Other languages....

        I guess the article is trying to say that C has transcended being a mere programming language to become a fundamental and irreplaceable part of the software ecosystem, without which nothing else can exist.

        But the article is so nonsensical that it's hard to tell. If it was code I'd have to load it into a debugger and step through it to figure out what it does.

        1. Jonathon Green
          Boffin

          Re: Other languages....

          Mostly I think it's saying that writing portable code is hard[1], and that using ABIs which were first defined 50 years or so ago - and have been subject to constant revision/extension while trying to retain something which, if you squint a bit and don't get too close, looks a bit like backwards compatibility - doesn't help[1].

          What any of the authors expect to achieve by these observations is somewhat unclear though. I’m sure that with the benefit of hindsight we can come up with better ways of describing and implementing ABIs but making the case for one of them compelling enough to rip up huge bodies of code to implement the newer, better, shinier models while retaining compatibility with other huge bodies of work which depend on the old ABIs could prove somewhat of a challenge. It’s a classic example of “Well, I wouldn’t start from here…”

          [1] Which isn’t the most controversial or profound statement I’ve ever heard by a long chalk.

        2. Liam Proven (Written by Reg staff) Silver badge

          Re: Other languages....

          No, it is not saying that.

          It is saying that C is now an IDL.

          And that it is not only an IDL between programs and the OS, but between different programming languages.

          1. bazza Silver badge

            Re: Other languages....

            I was interested in the view of C as an IDL. It is, of course, not the only one. DBus is quite successful with its IDL.

            I think that the reason why C has entered into this position is that, being the natural lowest common denominator, it's the only one that doesn't need world-wide agreement. Getting widespread agreement on an alternative would be a futile exercise; anyone who knows developers knows that they can't agree (e.g. the 64-bit int fiasco, the heated discussion on these forum pages...)

            My 2p's worth is that the best IDL we currently have is ASN.1, made very universal with its myriad encoding rules to suit all sorts of purposes. JSON schema and XML XSD schema are equally sophisticated, but lack the compact binary encoding rules provided by a good ASN.1 implementation. XML's tools are often terrible. There remains the matter of how ASN.1 messages are exchanged - ZeroMQ wouldn't be a bad bet.

            DBus is kind of halfway there; it has a simpler schema language that's feature-poor (no constraints), but does have a message-orientated protocol between client and server.

            Over the years, there have been plenty of nasty bugs (e.g. Heartbleed) that would have been more easily avoided had some thought been given to interfacing.

            1. Liam Proven (Written by Reg staff) Silver badge

              Re: Other languages....

              Thank you for that.

              I am so very glad that someone got it. :-)

              1. Mike 125

                Re: Other languages....

                Seems to me reality has finally hit home. And the new language people are kicking their toys out of the pram, blaming history, because reality is harder than they thought.

                No sensible person is arguing against a C replacement, for the appropriate use case. But to be accepted, it must be shown to solve real problems, in the real world.

                As for 'getting it', I think most people get it. But if an abstract IDL is required, then maybe the new language designers should have considered that a teensy bit earlier! Then demonstrate why it's needed by their new language, why it's better, and how that solves real world safety problems.

                We're (hopefully) in a transition phase. And during that phase, it will get messy. (Traditionally, this is known as 'wrapper hell !')

                But it helps nobody to just rail against the current state of things, just listing endless new examples of how reality is hard.

                1. yetanotheraoc Silver badge

                  Re: Other languages....

                  Great comment.

                  "wrapper hell"

                  Exactly. Another intermediary so the blog author can do a memory- and type-safe round trip Rust -> wrapperRC -> C -> wrapperSC -> Swift. Then they can swap out for another IDL if their system isn't written in C (which they are hoping will be sooner rather than later). The result then looks like Rust -> wrapperRC -> wrapperCZ -> Z -> wrapperCZ -> wrapperSC -> Swift.

                  Paul Crawford's quote about abstraction was spot on.

                2. ssokolow

                  Re: Other languages....

                  I think this quote from Aria's original post is more to the thrust of things:

                  > Yeah!

                  > clang and gcc can’t even agree on the ABI of __int128 on x64 linux. That type is a gcc extension but it’s also explicitly defined and specified by the AMD64 SysV ABI in one of those nice human-readable PDFs!

                  > I wrote this dang thing to check for mistakes in rustc, I didn’t expect to find inconsistencies between the two major C compilers on one of the most important and well-trodden ABIs!

                  1. Mike 125

                    Re: Other languages....

                    Just read back through your comments and now have a much better perspective on this. Thanks.

                    >I wouldn't code in C++, but that's because, as a responsible person, I recognize my own limitations.

                    "Man's got to know his limitations" - Harry Callahan

                    >As for comfort level, I'd say that I *could* get more comfortable with C++, but I just generally don't feel comfortable being responsible for correctly using languages that make accidental memory-unsafety so easy.

                    That sums up why things *must* change.

                    C's legacy, good and bad, is vast. It's spread into use cases for which there is no justification whatsoever - a bit like Windows on warships.

                    Hence the painful transition, probably waiting on a new generation of programmers. But from the perspective of the old one, it's damn fascinating to sit back and watch!

          2. Roland6 Silver badge

            Re: Other languages....

            >It is saying that C is now an IDL

            And that is because of the success of Unix and its APIs.

            To displace C, you need to develop and bundle with Unix/Linux native APIs for other languages, specifically application-orientated languages such as COBOL, FORTRAN etc. Naturally, in developing native APIs for other languages, there will be ramifications back into the Unix/Linux kernel and the assumptions baked into it by the expectation that applications would be written in C; this may fundamentally change Unix/Linux and break backward compatibility.

            1. dafe

              Text is a universal interface

              Doug McIlroy said that (in his summary of the Unix philosophy), and I think he knows a bit about inter-process communication.

          3. Anonymous Coward
            Anonymous Coward

            Re: Other languages....

            "It is saying that C is now an IDL."

            Yes, the operating system is written in C, so the interface to the operating system, which is function calls, is defined by C. Had the OS been written in Pascal then Pascal would define the interface.

            Obvs. Innit, we're discussing other things because there's not much to discuss on it.

            You could define an interface in other ways that isn't C- or Pascal-compatible or whatever, but it would require a translation layer, which would in turn have a C interface to it.

            You could view the OS as simply a library - on embedded systems it often still is - and then your argument is about a C library having an interface defined in C.

        3. Liam Proven (Written by Reg staff) Silver badge

          Re: Other languages....

          No, the article is not trying to say that, and neither was the blog post.

          What it was trying to say is:

          Nowadays C is an IDL as well as a programming language.

          And when it is used as an IDL its problems are even worse.

      2. thames

        Re: Other languages....

        I read Beingessner's original post. There is no core argument and Beingessner doesn't really have a point. It's just a long meandering rant about some of the problems which are inherent in writing a compiler and some of the possible ways of getting around them.

        From the context I gather the author of the original post was working on a number of issues relating to trying to auto-generate Rust bindings for C libraries, how to handle differences in language calling conventions, problems with different underlying hardware having different native integer sizes, and problems with less than ideal user programs having hardware related assumptions hard coded into them.

        It would be nice if all of these were simple to handle, but they inherently aren't. There is and likely never will be either a one-size-fits-all programming language or an ultimate programming language that solves all problems for all time.

        The original post author's attempts to redefine the English language do not make the argument any more profound or more correct. Instead they imply that there really isn't a coherent argument to be made.

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: Other languages....

          The points of the blog post are:

          [1] C is now an IDL as well as a programming language.

          [2] Even if you implement a new language without any use of C at all anywhere in its toolchain, if the only IDL to the OS is specified in C, then the language will inherit some C issues.

          [3] This also applies if your product uses 2 (or more) completely non-C-based languages because the interface between them will be specified in C.

    2. bombastic bob Silver badge
      Devil

      Re: Other languages....

      I think you could justifiably categorize programming lingos (that aren't assembler) into a few basic categories:

      Procedural: This would be things like FORTRAN and COBOL and old-style BASIC

      Algol-like: C, Pascal, and all of the derivatives from C++ (and maybe Delphi) to Java and (possibly) Rust

      Scripted: shell/Bash, Perl, PHP, and probably Python (though some of this may be debatable)

      and maybe another category for object-oriented, but that's debatable, too.

      And so the difference between C/C++ and Rust might simply be semantics, unless you are a fan of garbage collection memory management (which should be drawn and quartered and embedded on pikes at the edges of the realm as a warning to those who might DARE to try that AGAIN...)

      the real horror began with trying to make scripting lingos "object oriented" though I admit a Perl module can often solve problems that are otherwise VERY difficult, simply because someone else did a LOT of work to make it possible...

      1. doublelayer Silver badge

        Re: Other languages....

        I think this structure doesn't work very well. For one thing, what's the difference between a scripting language and a non-scripting language? Usually, the argument comes down to "I want to say that this one isn't good enough, so it's just for scripts". Lots of small maintenance scripts have been written in C. Many large applications have been written in Python. There's no real distinction.

        Also, procedural is usually defined to include C, because each operation has effects which are present for the next operation. This is often as opposed to functional languages that attempt to avoid side-effects, languages that don't appear to be in your categorization system. This is without considering languages that manage to have both features. And even if you want to use the categories, you need a better name than "Algol-like", because some of those are not like Algol and you can argue for hours about how much alike they can be before the category applies.

        This is like dividing operating systems into program loader, Unix-like (including Windows), RTOS, and just expecting everything to neatly fit in.

        1. jake Silver badge

          Re: Other languages....

          "For one thing, what's the difference between a scripting language and a non-scripting language."

          Conventionally, the former is interpreted and usually quite slow, and the latter is compiled and comparatively fast. This is not a hard and fast rule, there are plenty which straddle the line to one degree or another.

          1. doublelayer Silver badge

            Re: Other languages....

            The examples listed do not function under this rule. Java is listed under Algol-like, not scripting, but it is interpreted. So even if you choose to use this distinction, the original suggestion wasn't using it. This is why that categorization doesn't work very well.

        2. jake Silver badge

          Re: Other languages....

          A program loader (ex. MS-DOS) is not an operating system (ex. TOPS-20). In a program loader, the user can fully control all of the hardware without requiring permissions to be set. An OS has complete control of the hardware (pace root/administrative access capabilities, of course).

          Both have their place in computing.

          1. doublelayer Silver badge

            Re: Other languages....

            But they do serve a similar purpose and so it makes sense to compare them. However, it does not make sense to compare them in the simplistic way I did, hence my analogy to the simplistic way to compare programming languages that also doesn't make sense.

        3. dafe

          Re: Other languages....

          There are fundamentally two kinds of languages: Those that can be interpreted, and those that need to be compiled.

          Of course those that can be interpreted can also be compiled.

          Machine language is interpreted, and so are BASIC and FORTRAN and bash and python.

          Virtual machines like the JVM or .NET or the Erlang VM interpret virtual machine code. Which can be text.

          Higher level languages are compiled into languages that can be interpreted.

  14. Roger Kynaston
    Joke

    I think I have the problem

    Other languages have names like java (Eurghhh), Python, Go and so on. What they should really do is write one called D. Then it would rule the world for 50 years and our successors could argue for the creation of a new language - E.

    1. Anonymous IV
      Facepalm

      Re: I think I have the problem

      What they should really do is write one called D. Then it would rule the world for 50 years and our successors could argue for the creation of a new language - E.

      'E' is already in existence in Yorkshire, as in "E bai goom".

      Ba-boom, tish!

    2. Doctor Syntax Silver badge

      Re: I think I have the problem

      The great debate aeons ago was whether the next language would be D or P. It depends on which character sequence you're using in which C follows B.

    3. katrinab Silver badge
      Alert

      Re: I think I have the problem

      D was released in 2001. It does appear to have a dedicated fan-base, though it isn't that much used.

      C# is of course the next note on the musical scale after C, and is pretty well known.

      In terms of letter-of-the-alphabet programming languages, we seem to be up to at least R. That one is quite widely used in statistics.

      1. Someone Else Silver badge

        Re: I think I have the problem

        No, the next note on the keyboard is Db...

        1. Joe W Silver badge

          Re: I think I have the problem

          Which is lower in pitch than C#

          1. katrinab Silver badge
            Boffin

            Re: I think I have the problem

            On a keyboard, C♯ and D♭ are the same key, but yes, they are completely different notes, and in tuning systems other than equal temperament, they do have different frequencies.

            The Moonlight Sonata, for example, is very definitely in C♯ minor and not D♭ minor.

      2. tiggity Silver badge

        Re: I think I have the problem

        @katrinab

        R is based on S and so makes a nice example of naming your language in a preceding character style.

    4. Missing Semicolon Silver badge

      Re: I think I have the problem

      D exists.

    5. Liam Proven (Written by Reg staff) Silver badge

      Re: I think I have the problem

      > What they should really do is write one called D.

      https://dlang.org/

    6. David 132 Silver badge

      Re: I think I have the problem

      E exists - I was writing programs for my Amiga in it almost 30 years ago. And apparently, looking at the Wikipedia article, there's at least one other programming language that uses the same name.

      Which is great, or as those well-known advocates for programming language plurality the Shamen put it: E's are good.

    7. Anonymous Coward
      Anonymous Coward

      Re: I think I have the problem

      and F is just Fortran but newer and shorter and we passed F# 15 years ago.

      Might as well cut straight to Zed and be done with it :)

      1. David 132 Silver badge
        Gimp

        Re: I think I have the problem

        Zed's dead, baby, Zed's dead.

        Obvious icon -->

      2. jake Silver badge

        Re: I think I have the problem

        I've been using the z shell (zsh) in appropriate places since ... what ... 1990?

        1. Tim99 Silver badge
          Gimp

          Re: I think I have the problem

          I'm retired and have given up computing, except with an Apple Mac. The "Terminal" is zsh. I use it a lot...

      3. Liam Proven (Written by Reg staff) Silver badge

        Re: I think I have the problem

        > Might as well cut straight to Zed and be done with it :)

        https://en.wikipedia.org/wiki/Z_notation

      4. adam 40 Silver badge
        Alert

        Re: I think I have the problem

        I have recently been programming with RNA, and I started with Alpha.....

        I'm up to Omicron now.

  15. Eclectic Man Silver badge

    Umm

    Well, ok, I guess that I am one of the several people who read the article without properly understanding it, unless it is merely a complaint that as most OSes are written in C, every other language needs to interface with C to do anything useful in the real world, and therefore needs to 'speak C to C', as it were.

    The fact is that whatever language is most prevalent for writing OSes will require any other 'proper' programming language to have some sort of interface to enable system calls.

    At least C did away with the execrable 'GOTO <line number>' which caused me so much heartache as a student of BASIC. C may no longer be 'just a programming language', but I find it extremely useful, and would never have made the most amazing mathematical discovery of my life* without being able to program in it.

    *An amazingly intricate number sequence that still blows my mind when I try to understand it.

    1. bombastic bob Silver badge
      Devil

      Re: Umm

      At least C did away with the execrable 'GOTO <line number>'

      Now it is 'goto code_label;'

      this construct is highly useful for error handling where 'error_exit' is the label after the main 'return' path, and you clean up resources and display error messages or whatever. VERY useful when doing systems programming, and you'll see it used that way in kernel code

      Example on FreeBSD: grep -r goto /usr/src/sys

      (over 30,000 lines were generated by this)

      I used to be an anti-goto purist. Made a few convoluted code constructs to avoid using 'goto' even. Then I saw it used a lot in kernel code, realized the error of my thinking, and started making things tight and efficient instead.

      (did come up with a cheat, to use 'do {' and '} while(0)' as macros, where I could use 'break;' to escape it - effectively a 'goto' to the line following the end of that section of code)
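
      In code, the cleanup pattern looks something like this minimal sketch (the function, file and buffer are invented for illustration):

      #include <stdio.h>
      #include <stdlib.h>

      /* Kernel-style cleanup: acquire resources in order, jump to a
         single exit path on any failure, release in reverse order. */
      int process_file(const char *path)
      {
          int rc = -1;                 /* assume failure */
          FILE *f = NULL;
          char *buf = NULL;

          f = fopen(path, "rb");
          if (f == NULL)
              goto error_exit;

          buf = malloc(4096);
          if (buf == NULL)
              goto error_exit;

          /* ... the real work with f and buf goes here ... */
          rc = 0;                      /* success */

      error_exit:
          free(buf);                   /* free(NULL) is a safe no-op */
          if (f != NULL)
              fclose(f);
          return rc;
      }

      int main(int argc, char **argv)
      {
          return argc > 1 ? -process_file(argv[1]) : 0;
      }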

      1. Eclectic Man Silver badge
        Joke

        Re: Umm

        bombastic bob: "At least C did away with the execrable 'GOTO <line number>'

        Now it is 'goto code_label;'"

        Noooooooooooooooooooo!!!!!

        Runs and hides.*

        (My days of kernel hacking are long gone, I hope.)

        *Checking in the O'Reilly 'C in a Nutshell', it appears on pages 110-112 (ISBN 978-1-491-90475-6)

        Another illusion shattered.

      2. Gary Stewart

        Re: Umm

        I've used goto on many occasions to reprint the text prompt when there are input errors in CLI programs. It is by far the easiest and fastest way to do it. As a (very) long time assembly language programmer the jump and relative jump instructions which are the assembly equivalents to goto are irreplaceable.
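
        A minimal sketch of that prompt-retry pattern (the prompt text and the bounds are invented):

        #include <stdio.h>

        int main(void)
        {
            int n;

        prompt:
            printf("Enter a number from 1 to 10: ");
            if (scanf("%d", &n) != 1) {
                if (feof(stdin))
                    return 1;           /* give up on EOF */
                (void)scanf("%*s");     /* discard the bad token */
                goto prompt;            /* reprint the prompt */
            }
            if (n < 1 || n > 10)
                goto prompt;

            printf("Got %d\n", n);
            return 0;
        }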

    2. doublelayer Silver badge

      Re: Umm

      "The fact is that whatever language is most prevalent for writing OSes will require any other 'proper' programming language to have some sort of interface to enable system calls."

      Yes, but that interface doesn't have to be the same language. You can write something in C and provide a different interface. This can have advantages in application portability and language usage. I think that's the main suggestion (or more likely lamentation) of the person quoted in the article. Many systems create an interface that doesn't require all applications to link with them, but most common OSes have taken a different view.

      1. yetanotheraoc Silver badge

        Re: Umm

        "Yes, but that interface doesn't have to be the same language."

        What language were they supposed to pick for the interface? It sure couldn't have been Rust, as that didn't exist yet. It seems weird to me that C provides a way to call into C, *and* a way to call out of C, and gets criticized because the language it uses is C. Oh, no! It undulates!

        1. doublelayer Silver badge

          Re: Umm

          They could create one. I am not saying they have to, but even if I argue for the point they made, your objections don't apply. They did not ask for Rust to be the IDL used by operating systems. They said that C wasn't good at that purpose and could be improved. Since Rust takes a lot of its influence from C, it's likely it wouldn't be great at it either. It's not new to use a language for interfacing that wasn't used in writing the system, and other operating systems have done it.

          "It seems weird to me that C provides a way to call into C, *and* a way to call out of C, and gets criticized because the language it uses is C."

          C doesn't have ways to call out of C. Other languages link to C stuff and call in, but you never see a C program directly linking to Java or Python. It can link to libraries written in other languages that have chosen to have a compatible format, but in each case, the other language has to change itself to match C. If a C program contains a component in something that doesn't follow the standard C linking structure, a different interface language is used for that connection. This doesn't make C bad. It makes your compliment wrong.

    3. vincent himpe

      Re: Umm

      What's wrong with Goto? Your nicely crafted C code gets translated into assembler.... which is full of 'goto'. JMP operations in all their flavors and variants are nothing but a goto.

      Your nice do loops and while loops and switch cases all get converted to JMP operations.

      Any complex case statement gets converted to a jump table.

      There are even specific instructions like SZ: skip next opcode if the A register is zero.

      SZ
      JMP true_label
      JMP false_label

      If the register is zero, the next instruction (JMP true_label) is skipped, so JMP false_label is executed.

      If it is non-zero, the next instruction is executed, so it jumps to true_label.

      There is your if-then-else.... nothing but a conditional goto.

      All your subroutines and calls and any other stuff is translated into nothing but Goto, simply because that is how processors work!

      C fits like pliers on a pig for most processors. The concept of heap and stack and streams works on a PDP-11. Not so much on any other architecture. Tape and punch card streams have gone the way of the dodo. It's all block access these days.

      1. Roland6 Silver badge

        Re: Umm

        >What's wrong with Goto ?

        Whilst there are many good texts on this, fundamentally it is about structured programming and good practice. If you are using a high-level language you should be taking into consideration all possible parameters and paths through your code, and using the gotos implicit in the While, For/Do, If, Case and '}' language constructs. By doing this you are using the compiler to stop the resulting code exhibiting the behaviours associated with missing control-flow instructions.
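
        To illustrate, a minimal sketch (consume() is an invented stand-in for real work): both functions below have the same control flow, but in the first the compiler generates and checks the jumps for you.

        static void consume(void) { /* stand-in for real work */ }

        void structured(int count)
        {
            while (count > 0) {         /* the compiler emits the jumps */
                consume();
                count--;
            }
        }

        void by_hand(int count)         /* the same flow, gotos spelt out */
        {
        loop_top:
            if (count <= 0)
                goto loop_end;
            consume();
            count--;
            goto loop_top;
        loop_end:
            return;
        }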

        1. Eclectic Man Silver badge

          Re: Umm

          "Whats wrong with GOTO? "

          I agree with Roland6 (have an upvote on me).

          Well, I suggest using GOTO <line number> for all the navigation in a decent sized (let's say 10,000 line) program in BASIC (not Visual BASIC, good old original BASIC). And then try debugging it and having to insert lines to cope with any issues. (I hope you had the discipline to increment your original line numbers by 10 instead of 1, as you'll be editing it until the cows have come home, chewed the cud, slept, been milked, been out and come home again.)

          It is fine for the source code to be compiled into assembler using GOTO's, as that is an automatic process and hopefully the people who wrote the compiler ensured that it can count properly, but doing it all by hand? Been there, done that (on only short programs), NEVER AGAIN.

          1. yetanotheraoc Silver badge

            Re: Umm

            Can't you just write a script that tells you when the line number is not equal to the line number?

            1. Roland6 Silver badge

              Re: Umm

              Goto line number was a feature of Basic.

              However, on the i86 architecture you wouldn't actually know the final location until the code had been linked and located, and thus whether pointers etc. had been resolved to 8, 16 or 32 bits and the opcodes likewise selected to support 8, 16 or 32 bit operations...

              Fortunately, the Intel ASM did implement labels so you could Goto <label> and the linker and locator would do the math and add the missing bits.

      2. dafe

        GOTO considered harmful

        Dijkstra observed that people who use goto often make a mess of code. The kind of mess for which the term "spaghetti code" was invented, the kind that breaks flowcharts.

        The title of Dijkstra's article was Wirth's idea, I'm told.

        Anyway, Dijkstra formally proved that gotos aren't needed for code. Informally, Hoare and McCarthy had done the same. And experience confirms that code without gotos is cleaner.

        C is the only reason why gotos are still in use. Even though tail recursion optimisation and function inlining can produce efficient code from small functions, C compilers shy away from refactoring the input just so that the debugger can step through the code line by line, because apparently that is better than proper debug output.

        So gotos in C are more efficient only because C compilers are not.

    4. This post has been deleted by its author

    5. jake Silver badge

      Re: Umm

      A computer language with GOTOs is totally Wirthless.

      1. adam 40 Silver badge

        Re: Umm

        I used GOTO's occasionally when contracting, and I bet it made me considerably Ritchier than yawwww

    6. Anonymous Coward
      Anonymous Coward

      Re: Umm

      And it seems that the real problem, like the linked articles detail, is that you can't "speak C to C" - and, originally more importantly, "speak C to POSIX system interfaces" by following a standard, because there are so many incompatibility points in the ABI. Therefore, they have to support 176 different targets (x86_64-uwp-windows-gnu, x86_64-uwp-windows-msvc, ...) just to "speak C to C".

      But every non-C language that has traction in the real world has had to overcome this big barrier to entry. So they can, somehow, talk to each other over "C" interfaces, even if there is no actual C code anywhere. And things generally work... until something changes and mysteriously things no longer work, and then there is hell to pay. As a consequence, C standards can no longer change in any direction - basically cannot make any changes at all - even to shore up language specification errors.

      So the use of C as a programming language is never going away. But as an interoperability layer... maybe there will be a tipping point when the C-style interfaces become second-class citizens. If something is unsustainable, at some point it will not be sustained.

      1. Roland6 Silver badge

        Re: Umm

        >Therefore, they have to support 176 different targets (x86_64-uwp-windows-gnu, x86_64-uwp-windows-msvc, ...) just to "speak C to C"

        It will be the same for Rust et al unless the language is only intended for a single CPU architecture, say x86-64, in which case they won't replace C on other platforms such as x86-32 and ARM.

        Looking back, it was probably a mistake to define the POSIX APIs using C syntax, as per the Unix reference manual...

        1. ssokolow

          Re: Umm

          176 is the line count from `rustc --print target-list`, which is why they delegate as much as possible of it to LLVM.

          In essence, the argument is that the underspecified nature of C has turned either LLVM or GCC or MSVC implementation details, depending on the platform, into the new "Whatever Internet Explorer 6.0 does is correct".

  16. Doctor Syntax Silver badge

    "My problem is that C was elevated to a role of prestige and power"

    I think "achieved" fits better than "elevated to".

  17. Electronics'R'Us
    Flame

    As we used to say...

    "Rust and Swift cannot simply speak their native and comfortable tongues – they must instead wrap themselves in a grotesque simulacra of C's skin and make their flesh undulate in the same ways it does."

    <sarcasm>

    Would you like some cheese with that w(h)ine?

    </sarcasm>

    1. Dave559 Silver badge

      Re: As we used to say...

      That is quite an exquisitely worded whine, mind you, I'll certainly grant it that!

  18. Anonymous Coward
    Anonymous Coward

    Mmmm. Undulating flesh. What an apt description of some of the code I've seen over the years! :)

  19. Pete 2 Silver badge

    Meanwhile in 2040 ...

    ... there will still be the same old my language is better than your language (grandad!) arguments. Ones that are criticising Rust, Go and all the other trendy stuff. Pulling them apart and inflating their inevitable weaknesses into major fashion faux pas.

    Even though almost all of the platforms those quasi-religious arguments will be taking place on will still be written in C.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Meanwhile in 2040 ...

      > the platforms those quasi-religious arguments will be taking place on will still be written in C

      Which is the actual point of the article.

      It is not about C as a programming language. It is that C is no longer just a programming language.

      In addition C is now an IDL, not only for programs to interface with the OS, but for programming languages to interoperate with each other.

      1. JoeCool Bronze badge

        Re: Meanwhile in 2040 ...

        "C is no longer _just_ a programming language"

        How many of these comments would have been obviated had that been the headline? About 80% by my count.

        Not to mention that the subtitle could easily be better stated:

        "as a defacto IDL it has distorted <x> "

        So let's go see what the article author has to say :

        "Now you might reasonably ask: “what the fuck does your problem have to do with C?”

        It wouldn’t if C was actually a programming language. Unfortunately, it’s not, and it hasn’t been for a long time.

        Let's ignore the meaningless, gratuitous use of fuck, and note carefully that the author REPEATS THE THING THAT EVERYONE AGREES IS DELUSIONAL. Good start! I get trying to grab attention, but when that completely detracts from your other points, maybe not a good choice.

        I will skip past Aria's meandering list of exaggerations, uninformed rants, mis-allocated issues and off-topic whines, and go straight to the IDL concerns.

        "C Doesn’t Actually Have An ABI"

        Weird, no mention of IDL responsibilities. Is that because that's ALSO not the job of an IDL? I wonder what the CORBA view is? ...

        IDL makes a strong separation between the specification of an object and the implementation of that object. Programmers who use the interface of an object have no idea whatsoever how that object is implemented. For example, the object doesn't even need to be implemented using an OO programming language.

        This is called “Separation of Interface from Implementation,” and is a very important concept in OO in general and in CORBA in particular.

        So the reductive thesis is: when Unix was rewritten in C, its APIs were documented in C - its implementation language - and, because CORBA IDL was only available 15 or so years in the future, C became the de facto IDL.

        There's not much meat on those bones. The "ABI" issue is not resolved by the language or the IDL. I don't see what's stopping the Rust group from creating an IDL and ABI that performs as needed, and then they can support that SPEC on all of their chosen platforms. Only the most myopic allow their understanding of interface _concepts_ to be determined by the spec language.

        Honestly, Aria's argument is so poorly constructed and elucidated (dare I say, intellectually dishonest) that my fake-intellectual-spouting-BS filter goes off, and my time-is-valuable self-protection relays blow pretty close to the start of the article.

    2. doublelayer Silver badge

      Re: Meanwhile in 2040 ...

      "... there will still be the same old my language is better than your language (grandad!) arguments."

      Yes, but there will also be the "My language is real and because you use something else, you're pathetic" arguments, often from people who use something else but don't admit it. Both are cringe-worthy. There are cases for most languages that exist today, and just because someone uses one for speed, easiness, portability, or even familiarity, they probably are doing that for a reason. The argument that you must use Rust because you are likely to do your memory management wrong eventually is annoying, but so is being told that if you don't use C, you must not understand how computers work and you're thinking wrong.

  20. Flocke Kroes Silver badge

    C int

    C has to deal with lots of different hardware.

    example 1: C requires sizeof(char)==1 but it does not require char to be 8 bits. I have used CPUs with 16-bit chars because adding 1 to a pointer moved forward 16 bits in memory.

    example 2: In C, int is the size of a register. Put a char or short in a register and it will be sign/zero extended to the width of the register. Any subsequent arithmetic will be done using the width of a register. C makes it clear that if you ask for an int, the number of bits you get will be CPU dependent because that is a fact of life. Worse than that, the width of a register is somewhat subjective and some CPUs give the OS the opportunity to decide. If you want a particular size, #include <stdint.h> and use uint32_t, int64_t or whatever.

    "Fixing" the unknown size of int in new languages requires a layer of abstraction between the program and the hardware. That layer of abstraction eases programming at the cost of performance (or portability problems). C's unpredictable size of int is not wrong or bad in absolute terms but may be problematic or beneficial depending on the use case.

    It would be nice to believe we all know about: printf("%d\n", -9223372036854775808); If you don't, TRY IT NOW. Yes, in theory it could be fixed, but modular arithmetic is a massive performance boost over "infinite" precision, and programmers are supposed to be clever enough to keep these things in mind.
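
    A minimal sketch of the fixed-width alternative, assuming a hosted C99 compiler. (That printf is broken twice over: %d applied to a 64-bit value is undefined behaviour, and 9223372036854775808 itself doesn't fit in long long before the minus is applied.)

    #include <stdio.h>
    #include <stdint.h>
    #include <inttypes.h>
    #include <limits.h>

    int main(void)
    {
        /* int and char are whatever the platform says they are... */
        printf("CHAR_BIT = %d, sizeof(int) = %zu\n", CHAR_BIT, sizeof(int));

        /* ...but the <stdint.h> types are exact wherever they exist. */
        int64_t big = INT64_MIN;
        printf("big = %" PRId64 "\n", big);   /* not "%d" - that is UB */
        return 0;
    }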

    1. bombastic bob Silver badge
      Devil

      Re: C int

      don't forget that on ARM the char type is unsigned, but signed nearly everyplace else.

      For this reason I have been using the various sys/stdint.h definitions like 'int32_t' etc. for a while now, to avoid the kinds of ambiguity that can arise from various CPU architectures and implementations.
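
      A minimal sketch of the ambiguity (the printed values reflect common ABI conventions, not a language guarantee):

      #include <stdio.h>

      int main(void)
      {
          char c = (char)0x80;   /* bit pattern 1000 0000 */

          /* Typically prints -128 where plain char is signed (most x86
             ABIs) and 128 where it is unsigned (typical ARM ABIs). */
          printf("%d\n", c);

          /* int8_t / uint8_t from <stdint.h> say what you mean. */
          return 0;
      }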

      1. Dan 55 Silver badge

        Re: C int

        Then again, if you're dealing with code which used time_t you got a 64-bit upgrade for free, whereas if you're for some reason dealing with code which used int32_t, you're screwed.

        I say "for some reason" but it's always because the genius that originally programmed it didn't separate types for data external to the software from types for internal variables, so instead of just changing the bit which deals with file or network format you have to go through the whole lot.

      2. Mike 16

        Re: C int (or char)

        "signed near everywhere else"?

        I have run into unsigned char on IBM 360 (and descendants down to now) and on some game platform (maybe Sony, long time ago). If you ponder why, consider

        1) C has historically been bad at signed arithmetic, but we might give it a pass because it is not alone. A _lot_ of computers and languages have been bad at signed arithmetic. Stuff like the ones versus twos complement wars, which included the question of whether -0 was negative, or zero, or both (CDC 6x00, IIRC). Oh, yeah, and signed magnitude [1], which was so popular during the dawn of compilers that a lot of languages/compilers went out of their way to try to provide the illusion of being signed-magnitude. There's a reason that C shoveled a number of things (e.g. rounding of division of a negative number by a power of two) into the "implementation defined" heap, until C89 (IIRC) decided that "it shall be the way we say, for all machines, and if your machine divide does not give this answer, neener-neener".

        2) Believe it or not, there are 8-bit character sets that actually use all 8 bits. Imagine that!

        [1] one of my GoTo examples of how weird architects can make arithmetic is the Univac 7900, which has both +0 and -0, and _many_ values in between, as it is a bi-quinary (5421) machine that collates half the "un-digits" (4-bit digits that are not 0-9) below zero and half above 9. The IBM 650 avoided that fate by allowing only ten valid values to be encoded in each of its 7-bit digits.

    2. Someone Else Silver badge

      Re: C int

      In C, int is the size of a register.

      True dat. But understand that there are so-called "programmers" out there who don't know what a "register" is. Such "practitioners" couldn't possibly understand, then, why C can't support an int of infinite length and precision.

      [...] and programmers are supposed to be clever enough to keep these things in mind.

      Methinks you assign too much capability to the current crop of copy-pasta artistes who call themselves "programmers".

      Besides, thinking itself is close to being outlawed in the 21st century...

      1. jake Silver badge

        Re: C int

        "there are so-called "programmers" out there who don't know what a "register" is."

        Round about 2000 I started interviewing "programmers" fresh out of school who didn't know what the heap and the stack are (much less how the compiler uses them) on a fairly regular basis. Nowadays it's normal for the youngsters to have many gaps of that nature in their education. I fear we are losing something very important that is going to prove to be almost impossible to get back.

        1. adam 40 Silver badge

          Re: C uint

          is what I call them

        2. Eclectic Man Silver badge
          Coat

          Re: C int

          A 'register' is what the teacher takes at the start of every lesson, isn't it?

          (I'll get my coat, its the one with a catapult in the pocket.)

      2. fajensen
        Coat

        Re: C int

        OTOH - there is an opportunity in everything.

        I have occasionally made windfall money helping get a Board Support Package (BSP) running on a new system, because many programmers, very good in other ways, do not understand "ancient" stuff. Like how to tell the linker not to shit over the interrupt vectors, the embedded OS shadowing of important library functions, not-implemented interrupt handlers and so on.

        1. Someone Else Silver badge

          Re: C int

          Lord help us when people like you die (or retire...preferably the latter comes first!)

    3. ssokolow

      Re: C int

      > example 2: In C, int is the size of a register.

      Not on 64-bit systems, generally.

      To ease porting software which picked up assumptions about the size of ints, 64-bit Windows uses the LLP64 data model and all major Unixy OSes (including Linux, the BSDs, and macOS) use LP64 when built for 64-bit... both of which specify int to be 32-bit despite the CPU registers being 64 bits wide.
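
      A quick way to check which data model you are on (a minimal sketch):

      #include <stdio.h>

      int main(void)
      {
          /* LP64 (64-bit Linux/macOS/BSD): 4, 8, 8.
             LLP64 (64-bit Windows):        4, 4, 8.
             int stays 32-bit on both, despite the 64-bit registers. */
          printf("int=%zu long=%zu void*=%zu\n",
                 sizeof(int), sizeof(long), sizeof(void *));
          return 0;
      }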

  21. Someone Else Silver badge

    Somebody call a Waaaahmbulance!

    So, some self-important ivory-tower mucky-muck wailing about how their favorite panacea-du-jour language hasn't received the respect, adulation and market penetration they think it deserves. Wow. How quaint. Now, take your self-righteous blubbering and get offa my lawn! I got real work to do.

    And you know what? That real work may likely require C...it probably will not require Swift or Rust. Get over it.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Somebody call a Waaaahmbulance!

      The author of the blog post is a member of two implementation teams for one wildly-popular programming language that emerged from the Mozilla project, and the other of which is Apple's replacmenent for Objective-C.

      As far as I know, they have nothing to do with any kind of academic research project and are working coders in widely-used public-facing projects.

      1. Someone Else Silver badge

        Re: Somebody call a Waaaahmbulance!

        The author of the blog post is a member of two implementation teams for one wildly-popular programming language that emerged from the Mozilla project, and the other of which is Apple's replacmenent [sic] for Objective-C.

        You say that as if it's supposed to be impressive or something.

        Perhaps it's not as "wildly popular" as you (as in "all y'all") would have one believe. (If it were, the blog whine post wouldn't be necessary, now would it?) And WRT Objective-C, two comments:

        1) The same claim has been made about JavaScript -- we all see how well that worked out; and

        2) What took you so long?

  22. Boris the Cockroach Silver badge
    Gimp

    C is a tool

    much like a spanner, or a hammer, or an Excel macro designed to forecast profits for the next 3 months.

    So if I want to undo a nut, I don't hit it with a hammer or try putting different parameters into Excel, I use the damn spanner.

    And if the boss wants to see how much money we'll make I don't ask the spanner.

    Until people accept that ALL programming languages are tools and you pick the right tool for the job, we'll always end up with "my language is better than yours".

    And forcing the wrong tool for the job into doing the job just makes things worse in delivery, in understanding the problem, and finally in time spent forcing the tool to do something it's not designed to.

    Icon.. because some people prefer S&M .....

    1. Someone Else Silver badge

      Re: C is a tool

      Hey, I know a guy who wrote the code for the embedded controller for an industrial laser printer in COBOL...

      True story! To be fair, this was prototype code, and the guy was a sales guy who fancied himself as a systems engineer. The code was soon replaced by a proper driver (written in C, so, raspberries to you, Liam and Aria) before the printer was put into production.

      1. Mike 16

        Re: Embedded COBOL

        That's impressive.

        My own foray into COBOL was an attempt to "port" Crowther and Woods Colossal Cave into COBOL. An effort abandoned after the department that had a machine with COBOL found out about it.

    2. fg_swe Bronze badge

      Disingenuous

      There are vast domains where C could be replaced with memory-safe languages. It should be done for the sake of shoring up security of cybernetic systems.

      1. fajensen
        Flame

        Re: Disingenious

        Nah. It should be done to demonstrate the merits of doing it, instead of the boring whining and waffling about the sorry state of world affairs while waiting for someone else to do the work!

  23. martinusher Silver badge

    C is a systems programming language

    I keep telling people this but most people are now too young to have any idea what this concept means so I just describe it as a 'sort of macro assembler'. This type of language is designed to write the innermost layers of a system's software, the part where the software/hardware interface is really a matter of architectural choice. The fact that you could write complex applications in it is irrelevant; you could also write them in assembler (but you'd be daft to do so).

    Once you've got the processing hardware wrapped up then you can hand the entire thing along with a nicely polished language or two for the applications writers to use.

    To think in these terms requires you to have some idea about how hardware and software work together -- how you partition things between the various elements to give the best mix of price (complexity) and performance for the target applications. There's no mystery to this; unfortunately, though, pragmatism and the ability to make a huge pile of money got in the way with the mass production of systems based on a wholly unsuitable processor and bus architecture and a software environment that was built around the limitations of that environment. (The PC was both a supreme moneymaker and an architectural disaster.) Now we've got a whole new generation of programmers, all experts, and all convinced that their language is the new, bright, shiny thing (they all push Rust and what-have-you and have probably never heard of Ada -- why it exists, how it came about and so on; yet another round thing with an axle is invented....).

    I think I shall now go back to warming my toes over my valves and dozing off to the chattering of relays.....

    (PS -- If you think programming in 'C' is a bit elemental, try VHDL...)

    1. fg_swe Bronze badge

      ALGOL

      Once upon a time there were Algol-based mainframes with much better memory/type safety than C.

      https://en.m.wikipedia.org/wiki/ICL_2900_Series

      https://www.infoq.com/presentations/Null-References-The-Billion-Dollar-Mistake-Tony-Hoare/

      C and Unix are just one branch of applied computer science, and they probably won out due to cheapness, not because of robustness, elegance or security.

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: C is a systems programming language

      > I just describe it as a 'sort of macro assembler'.

      "C Is Not a Low-level Language

      Your computer is not a fast PDP-11."

      https://queue.acm.org/detail.cfm?id=3212479

      1. Roland6 Silver badge

        Re: C is a systems programming language

        "A programming language is low level when its programs require attention to the irrelevant."

        Not sure what there is about assembler that is 'irrelevant', so it would seem Assembler, by this definition, isn't a low level language.

        The paper doesn't make a case for C to be classed as a high-level language, just, as someone has coined, a Goldilocks language: sufficiently removed from any specific machine's assembler to be portable, but not so far removed that it can't be used to achieve some assembler-style optimisations.

        What the paper does do is illustrate that modern high-level programming languages don't easily map to the pipeline and cache models implemented in modern CPUs. Interestingly, I don't remember seeing an assembler that nicely maps to these CPU attributes either; basically a programmer has to know about these attributes and design their application and write the code accordingly to extract the best performance from them.

        What is clear is that taking full advantage of these CPU attributes requires greater complexity in the code generator.

      2. trindflo Bronze badge
        Thumb Down

        Re: C is a systems programming language

        The paper you cite does indeed say C is not a low-level language. My definition of low level is that you have assembler-level access. The author claims it doesn't count unless your language supports the intended final product system and nothing else.

        My considered opinion of that article is bollocks.

        It implies every target system (not just CPU) would need its own low level language for the language to count as low level.

  24. captain veg Silver badge

    size matters

    C's great because it's so small. An ordinary human being can realistically master every aspect of it in a sensible amount of time.

    I like JavaScript for the same reason. You have to fight past the bonkers syntax (which is, ironically, C-like), but the language core is small enough to fit into a human brain.

    I'm not sure that this is true of Rust or Swift. I don't like that these languages heavily promote non-language toolchains to get stuff done. When I compiled my first C program it was certainly a surprise that it produced an executable named a.out, but the documentation told me quickly how to actually run it.

    -A.

    1. fg_swe Bronze badge

      Re: size matters

      Here is a rather simple language which is memory safe: http://sappeur.ddnss.de/manual.pdf

      Sappeur programs run on almost any POSIX-based computer which has a proper C++ compiler.

      Rust is taking the memory safety aspect to the extreme.

  25. mevets

    All right.

    So "C" is a grotesque language and CORBA IDL is the epitome. I think I can see where swift and rust got its good looks from.

    1. ssokolow

      Re: All right.

      Rust's syntax is a blend of C++ for familiarity and the ML-family languages it originally took heavy inspiration from.

      (The Rust compiler was originally written in Ocaml, and the weird 'a syntax for lifetime parameters is Ocaml's syntax for generics, equivalent to the <T> it also borrowed from C++ for normal generic parameters.)

      That's also why one of the old jokes was that Rust is an attempt to trick C++ programmers into using Haskell.

  26. fg_swe Bronze badge

    C: A Regression Of Applied Computer Science

    About 70% of exploitable software bugs could be avoided* if a memory-safe language were used instead. This also applies to C code written by seasoned software engineers. Even though we humans are the most intelligent species on earth, we do make mistakes now and then. Unlike code generators, we have bad days due to family matters, project deadlines or just a mild case of flu.

    When C was conceived, there were already much more robust languages such as ALGOL around.

    In the year 2022 we have several efficient memory safe language options and they should be used as much as possible.

    * http://sappeur.ddnss.de/Sappeur_Cyber_Security.pdf

    1. JoeCool Bronze badge

      Re: C: A Regression Of Applied Computer Science

      Yes! So many cases where the C Spec told the programmer or project lead to put a hole in their design, and not think of the test cases. Or employ security testers. Or just grab that open source code and run it without audit.

      Because C isn't a programming language or tool, it's an App Generation AI.

      1. Someone Else Silver badge

        Re: C: A Regression Of Applied Computer Science

        You know, I don't see anywhere in my rusty, trusty K&R book that C requires one not to do proper design, pay attention to the details or stop using one's brain.

        Maybe that chapter ended up on the editing room floor?

        You can write FORTRAN in any language. Including Rust.

        1. Roland6 Silver badge

          Re: C: A Regression Of Applied Computer Science

          I don't remember K&R saying very much about design, in part because it was intended only for professionals, who by definition should know what they were doing and hence know you need to design before you code...

          It took others, such as Alan Feuer with the C Puzzle Book, to spell out just how easy it was to write powerful opaque code. (*)

          My favourite interview question back in the 80's to prospective employees professing to be experts in C was to ask them to walk me through:

          i+++j;

          There are similar pointer expressions that can catch the unwary.

          (*)Back in the 80's I had a collection of these books and papers, we coded them up and included them in our tool's test suites, to confirm correct implementation of K&R.
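
          For reference, what a correct walk-through of that question looks like (a minimal sketch; the starting values are invented):

          #include <stdio.h>

          int main(void)
          {
              int i = 1, j = 2;

              /* Maximal munch: the lexer grabs the longest token it can,
                 so i+++j parses as (i++) + j, never i + (++j). */
              int r = i+++j;

              printf("r=%d i=%d j=%d\n", r, i, j);   /* r=3 i=2 j=2 */
              return 0;
          }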

  27. fajensen

    Well, IMO, the Rust and Swift people could clone something "simple" like FreeRTOS using only pure Rust or Swift. Then they would achieve a Purified Operating System, a good idea of how well (or not) their favorite language is suited to it and, I bet, a whole new suite of security flaws for academics to write papers about.

    Whining about C is a somewhat stale tradition and brings nothing new, IMO.

  28. danmcb

    what have K&R ever done for us, eh?

  29. JoeCool Bronze badge

    Not a convincing argument

    Seems more like people waving their arms shouting 'look at me I have evolutionary opinions'

    "C isn't close to the metal " ??? let's ask the GCC manual what it thinks

    6.47 How to Use Inline Assembly Language in C Code

    Someone in comments provided a link to the article "your computer is not a PDP-11". It's more reasoned, BUT its attribution of the problems to "C programmers" is flat out wrong. Speculative execution is a hack on the part of the industry to wedge in parallelism without disrupting the skills required to keep on working.

    For years as a C programmer I looked for something that made parallelism better than threads and processes and semaphores. I found some decent academic proposals, but nobody wanted to create a break in the language/libs/build env.

    1. dafe

      Re: Not a convincing argument

      The fact that you have to inline assembler in your C code is proof that C is not close to the metal.

      I"ll give a simpler example: The overflow register. All CPUs have one. The PDP-11 had one. In C you have to write an if-condition that explicitly checks for overflow, and do so in such a way that the compiler recognises it as something it can replace with a check of the overflow bit.

      Speculative execution is taking advantage of the superpipelining in every RISC machine (or in the case of AMD and Intel, the RISC core in the CISC machine), parallelism that is already there in the hardware, but not reflected in the software (except in MIPS assembly).

      Parallelism is inherently safe in functional languages, and they don't even need the crutches that were invented for C. But it has been thirty years since bare-metal Lisp machines. Erlang is still in widespread use in the telecommunications industry, a functional language that is designed for network-transparent parallelism. But it uses a VM.

      1. martinusher Silver badge

        Re: Not a convincing argument

        >The fact that you have to inline assembler in your C code is proof that C is not close to the metal.

        It's not that a language is 'close to the metal' or not, but rather that there are quite a lot of things down there that don't map to languages, especially if tweaking a bit or altering a value has side effects. (Cache control and processor speed registers are two that come to mind.) When all the abstraction is stripped away you're really just messing with hardware, and practical hardware needs mechanisms for setup, diagnostics and control.

        (Come to think of it, I wonder if any of these enthusiasts who seek The One Pure Language have ever done anything with a processor's JTAG interface? Also, many processors include registers that exist just to be able to detect breakpoints and watchpoints.)

    2. Dan 55 Silver badge

      Re: Not a convincing argument

      Who's going to be the first to invent a language which enforces explicit parallelism in ifs in source code? I guarantee that language is going to gain about as much traction as a Russian tank.

  30. Anonymous Coward
    Anonymous Coward

    First "C is stupid" article... 1978?

    Trying to remember the first time I saw a "The C language is stupid and my favorite niche language is the dog's b*llocks" article. I think it was in Dr Dobbs Journal around 1978. The language in question was Modula-2 if I remember correctly.

    Think of C as a generic platform-independent macro assembler and you won't go too far wrong. Don't need the speed and size of assembler, or don't understand CPU architecture? Then C might not be the language for you. Lots of other alternatives out there. Most I have used over the decades; most you will probably never have heard of. Some I have even written compilers / interpreters / VMs for.

    Writing high-level / low-computational-complexity stuff? Then Java, Swift, Dart, etc. are good enough. But guess why they all have C bindings: for when you need the speed and size of generic assembler. Which is C. Want a language- / platform-independent code library for your Java and Swift applications? Guess what it's written in. C/C++.

    Saying that, Swift is nice because the alternative, Objective-C, is so $#@$ terrible. It was bloody stupid in 1988 and is now just pathetically stupid. So I like Swift.

    Rust. Why? Quite simply, why? It looks like the classic language that was created because some guys wanted to create their own language. No other reason. The fact that Firefox turned from an OK browser into an utterly unusable pile of garbage with catastrophic memory footprints says it all. 300M/400M single-page footprints? This is amateur-hour stuff. Pure technical incompetence. I know this because I have seen exactly the same scenario play out many times before over the last six decades. Codebases destroyed by being ported / rewritten for some "better" hot new language. Always a fiasco. Guaranteed.

    There is nothing new under the sun. If you have a problem with a language the problem is you, not the language.

    Now back to debugging that x64 codegen for IR opcodes. Written in K&R C.

    1. ssokolow

      Re: First C is stupid artice..1978?..

      > It looks like the classic language that was created because some guys want to create their own language.

      Think of it as "We took a step back, derived a new successor to C from the experience with C++'s flaws (eg. all structs are POD... always. Use (struct, vtable) fat pointers when requested for virtual dispatch) and then baked in compile-time checking of what is now considered Modern C++ best practice."

      ...though, in all honesty, anything that looks weird to you is probably taken straight from Ocaml or one of its close relatives. (The original Rust compiler was written in Ocaml and they take pride in not being experimental. Examples of things borrowed from Ocaml include the "let" and "match" keywords, the 'a syntax for lifetime parameters, postfix -> as the syntax for a function's return type, colon-based postfix type declarations for arguments, etc.)

      > let twice (f : 'a -> 'a) = fun (x : 'a) -> f (f x);;

      > The fact that Firefox turned from an OK browser to utterly unusable pile of garbage with catastrophic memory footprints say its all. 300M/400M single page footprints?

      Blame things like Spectre and Meltdown for that. Both Firefox and Chrome had to walk back a lot of memory-efficiency optimizations to ensure the same-origin sandboxing couldn't be circumvented. I believe Mozilla called theirs Project Fission.

      Rust actually improved memory efficiency and CPU parallelism when they replaced their old CSS engine with the one written for Servo.

      1. Anonymous Coward
        Anonymous Coward

        Re: First C is stupid artice..1978?..

        Sorry, don't buy it. Had a look at Rust when it first appeared and it was the classic case of YASFL (Yet Another Stupid F*cking Language).

        But then again, when I look at code I am also seeing the probable executable asm. That comes of writing my first compiler codegen and low-level prims (in asm) more than 35 years ago, and of looking at a huge amount of compiler output in the decades since. And that was a Common Lisp compiler...

        So that code snippet you quoted just gets a big yawn from me. Not only could we do that in Common Lisp more than 35 years ago (in slightly different form), but I also know how to write exactly the same functionality in C. Or C++. Or assembly language, for that matter. Because I once had to. Long, long ago.

        As for Firefox: used it since the 0.x days. Had the source code from the very beginning. And read it. Sure it was spaghetti, but what non-trivial codebase isn't? I had no problem finding stuff. Unlike, say, gcc 1.x/2.x. But once Rust got involved, that was the beginning of the end. It's always the sign of a failed team when their solution to the inevitable problems with a mature codebase is: I know, let's invent yet another language.

        The day I heard about the Rust rewrite was the day I knew Firefox was heading for a death-spiral. Which is what happened. Because Rust does not solve one single problem for a mass-market, large-user-base, very-large-codebase application that cannot be solved by taking an existing language like C or C++ and enforcing very strict code and architecture discipline. Which is: keep it simple, consistent, and no fancy stuff.

        But that's not as much fun as writing your own language, where you can do stupid coding tricks just like the ones you thought were so cool back in school. No time for that self-indulgence, I have a product to ship. A small, stable, low-maintenance product. Which sadly Firefox is now none of.

        1. ssokolow

          Re: First C is stupid artice..1978?..

          1. The "code snippet [I] quoted" is one of the Ocaml examples off Wikipedia... to show how much of Rust's superficial syntactic detail was borrowed from it.

          (And I agree that it's one of those classic functional programming "look how clever I am" examples... it was just the most concise option for showing off all the syntactic elements I wanted to demonstrate, such as the 'a generics syntax.)

          2. Rust's adoption wasn't "I know! Let's invent another language". It was "Hey, the CSS engine from that joint research project with Samsung (Servo) based on that language Graydon Hoare has been puttering about with for just under a decade is looking pretty good. Let's use it instead."

          Not fundamentally different from "Skia has outpaced Cairo as a 2D drawing engine. Let's switch to that." and I doubt you'd have paid it so much attention if it were replacing C++ with more C++.

          3. As Drew DeVault's The reckless, infinite scope of web browsers points out, it's become impossible for anything which achieves compatibility with modern websites to be a "small, stable, low maintenance product". The specs that need to be implemented are just too much of an overly verbose, over-complicated morass.

          > The major projects are open source, and usually when an open-source project misbehaves, we’re able to fork it to offer an alternative. But even this is an impossible task where web browsers are concerned. The number of W3C specifications grows at an average rate of 200 new specs per year, or about 4 million words, or about one POSIX every 4 to 6 months. How can a new team possibly keep up with this on top of implementing the outrageous scope web browsers already have now?

          1. Anonymous Coward
            Anonymous Coward

            Re: First C is stupid artice..1978?..

            Still don't buy your arguments. I have on several occasions been brought in to rearchitect / fix / rewrite major application codebases (7-figure-plus paying-customer userbase). Sometimes I said yes. Usually I said no. Because too many people involved wanted to do some gee-whiz re-implementation in whatever new fad language / technology was doing the rounds. Going with the latest fads always caused project failures. Always.

            Simple, conservative and boring ships.

            I've been involved with low-level HTML rendering code for commercial products since, well, 1995. With previous lead experience on high-end WP/DTP applications. So I know exactly what's involved. Even did locked-down browsers based on custom WebKit forks. Could have rolled my own, but it was good-enough code when seriously nailed down. So why reinvent the wheel?

            Just because some committee / RFC group produces some new spec does not mean you should pay any attention to it. The stuff that will not work and can be ignored is easy to spot. Which is most of them. Does it solve a real problem, for real people, and is it clean and simple? Tick those three boxes and it might be worth looking at.

            I remember when I first saw the CSS 2.0 spec. Not going to work. Took a decade to make it kinda work. Same went for XHTML. And WSDL. And XML 1.1/XSLT. And so on and so on. No one wants to remember the SGML fiasco. Or X.400/X.500 for that matter. Marshall Rose wrote a great book on that mess.

            Standards committees are like published papers. The vast majority of them should be ignored. Almost all useful standards evolved from real-world solutions to real problems. De facto being recognized as de jure, so to speak. So all those new W3C specs? Ignore them. 99.999% of web browser users will be happy as long as a well-formed HTML page is rendered fast and clean, the embedded JS does not crash or lock their device, and core CSS is rendered correctly so that the few content creators who use it don't screw up the display. All that other junk like WebAssembly etc.? Who cares.

            Those W3C specs are purely for the amusement of the committee members, who in my experience mostly have little practical experience of shipping working software and a strong tendency to be cranks. No wonder most of these committees produce final results that just disappear.

            Writing an HTML rendering engine / JS execution environment that correctly displays the pages making up 99.999%-plus of the user-base's page views is not a big deal. A low-to-moderate complexity codebase. Trying to implement something that will support the technical verbiage produced by all the W3C committees...

            Did you ever hear the story of the very first attempt to implement a PL/I compiler, from the original language committee spec?

            Some things never change.

            1. ssokolow

              Re: First C is stupid artice..1978?..

              > Because too many people involved wanted to do some gee-whiz re-implementation in whatever new fad language / technology was doing the rounds. Going with the latest fads always caused project failures. Always.

              Again, that's not what happened. Servo was developed as a research project and Firefox only switched over to its CSS engine after it had already proven itself fit for purpose.

              > Just because some committee / RFC group produces some new spec does not mean you should pay any attention to it. The stuff that will not work and can be ignored is easy to spot. Which is most of them. Does it solve a real problem for real people? Is it clean? Is it simple? Tick those three boxes and it might be worth looking at.

              The problem is that Google tends to be the company pushing all this complexity, and Google has the budget and market share to implement it all in Chrome and get developers using it, leaving every other browser to follow along or fall into a spiral of "doesn't work, so I switched to Chrome".

              That's a big part of why Microsoft Edge and Opera are now Blink-based.

              > Writing an HTML rendering engine / JS execution environment that correctly displays the pages making up 99.999%-plus of the user-base's page views is not a big deal.

              I'm sorry, but I'm going to have to ask for empirical evidence on that one because, from where I'm standing, that "99.999% plus" number sounds like an "It's easy... until you actually try" situation, similar to "Most people only use a fraction of Microsoft Word's features, so let's write a competitor".

    2. StrangerHereMyself Silver badge

      Re: First C is stupid artice..1978?..

      You're one of the obnoxious people I'm talking about in my post below this.

      The memory usage of Firefox has no relation to the parts of it written in Rust; 95% of it was written in C/C++.

  31. Steve Todd

    Sorry, someone who found CORBA in any way praiseworthy?

    It was a godawful mess that shouldn’t be touched with a 10 foot pole.

    I’m not convinced that any inter-language low-level communication protocol (to, for example, let an application talk to the operating system) is going to be much cleaner without introducing huge performance penalties (which is what CORBA did, in spades).

    1. dafe

      Re: Sorry, someone who found CORBA in any way praiseworthy?

      Using C voids to pass data between processes isn't exactly clean. CORBA is a Lovecraftian horror, but that doesn't mean that something like ASN.1 can't be both structurally sound and efficient.

      1. fg_swe Bronze badge

        Re: Sorry, someone who found CORBA in any way praiseworthy?

        Thanks for mentioning ASN.1. In the spirit of "one tool for one purpose", it should have been used by CORBA. Instead they created volumes of convoluted stuff which has already been mostly forgotten.

  32. KSM-AZ
    FAIL

    Confusing a language with an architecture

    Sorry, the author is confusing a tool (C) with a systems architecture. I love C, though I don't use it much any more; I prefer to write most things in shell or PHP... YMMV.

    As for a fully structured platform architecture, IBM solved this problem in the '80s with the AS/400 (now iSeries). They created a fully functional MI layer, layered on top of the HMC/VMC layers, creating a 64-bit canvas with fully memory-mapped IO and tightly defined data types. It was closed, but WAY ahead of its time. It tended to be rather sluggish as well. They published a number of papers, but you basically compile everything to MI (Machine Instruction), follow the guidelines, and the tools all keep you honest. Everything works and talks in exactly the same way. The security models were interesting as well: everything was an object, and it didn't support a treed file structure. You could restrict any object many different ways, with ACLs, membership, location, ... At the time I thought it was arcane. Looking back, I get it now.

    Frankly it's sad nobody ever adopted these ideas into an open source OS. As a ground-up approach, it completely hides everything underneath and makes all your programming 100% agnostic to the underlying hardware. Unfortunately the overhead on '80s hardware had everyone bypassing the BIOS and scribbling directly on the hardware for performance, dooming us to the current plight.

    1. dafe

      Re: Confusing a language with an architecture

      There are academic open source operating systems that use capability inheritance to restrict child processes.

      But no, the architecture is not the point here. The point is that IPC, library calls, and system calls all inherit the insecurities of C, even where no actual C code is involved on either side, because the C conventions are the de facto standard that all programming languages adhere to for interoperability. This is how an Ada process reads an error text from another Ada process as an integer and Ariane 5 explodes, because C inherited type unsafety from BCPL.

      1. KSM-AZ

        Re: Confusing a language with an architecture

        "But no, the architecture is not the point here. The point is that IPC, library calls, and system calls all inherit the insecurities of C, even where no actual C code is involved in either side, because the C conventions are the de facto standard that all programming languages adhere to for interoperability. This is how an Ada process reads an error text from another Ada process as an interger and Ariane 5 explodes, because C inhereted type unsafety from BCPL."

        You just made my point... -Architecture- IS IPC, library and system calls and all that nasty talking-back-and-forth stuff, and device driver interfaces, and shared memory, and ... Like DBUS on a desktop is architecture, not language.

        The article implies C is somehow awful because someone based a poorly defined architecture on it, to the point that when the underlying hardware became more robust the architectural definitions became perverted. The fact that nobody has redesigned the underlying architecture to allow for a more robust and well-defined "MI" layer to handle all this is not a problem with the C language; it's a problem with never designing a robust, extensible architecture to compile your programs against.

        Again, IBM solved this "interface definition" problem in the '80s so that their systems could become 100% CPU/hardware agnostic. And Apple has defined its systems architecture well enough to change CPU architectures four times over the platform's lifetime - moving from big- to little-endian is decidedly non-trivial. The Wintel world was a free-for-all based on the original IBM PC and Intel chips. The fact that nobody deigned to design a proper "MI" layer for commodity PC hardware has nothing to do with C as a language.

        "Rust and Swift cannot simply speak their native and comfortable tongues – they must instead wrap themselves in a grotesque simulacra of C's skin and make their flesh undulate in the same ways it does." ®

        Paints an interesting picture. Create an MI layer, write the horizontal and vertical microcode, likely lots of assembler. Make it extensible, define everything you can think of, ... memory-map all your IO and IPC. Then write your OS against the MI without 'undulating' in a 'grotesque simulacra' of any kind. It's been done before; I'm mildly surprised some university never made a project out of it.

        Whenever I read stupid sh*t like the above statement it makes me want to puke. You want to do IPC differently from 'C' with the latest language of the hour? Shut the f*ck up and write it, and nail the API down. The bottom line on all this boils down to assembly language on your CPU of choice.

        1. anonymous boring coward Silver badge

          Re: Confusing a language with an architecture

          "Whenever I read stupid sh*t like the above statement it makes me want to puke."

          Don't beat around the bush! ;-D

  33. MacroRodent
    FAIL

    Interfacing to OS has to work at low level for maximum flexibility

    So they are complaining that Rust or Swift has to adapt to C calling conventions to interface to the OS? That is actually far preferable to the alternatives, simply because C is sufficiently low-level!

    If the OS interface were defined in a more high-level language, it would be even worse for high-level languages other than the one true language preferred by the OS writers, because of the added complexity of the preferred language's calling sequence and memory management. In fact, implementers would likely find such an OS interface a strait-jacket even for further development of the preferred language itself!

    There is precedent. Have you seen any "Lisp machines" lately? They had an OS and CPU tuned for Lisp, which made using something else pretty difficult.

    And Lisp isn't that fashionable these days. By contrast, "C machines" are going strong.

    1. dafe

      Re: Interfacing to OS has to work at low level for maximum flexibility

      You couldn't be more wrong if you tried.

      C is not low-level, and it is not even the lowest common denominator; it is merely the de facto standard. And the great thing about standards is that there are so many to choose from: there are many C standards, and they don't even agree about what should be undefined.

      So passing data erases the type, because that is the convention. ASN.1 unambiguously defines structures; C does not.
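
      To make that concrete, here is a minimal sketch (the names are invented) of what that convention looks like in C: the void* boundary compiles cleanly even when sender and receiver disagree about the type - the Ada-reads-text-as-an-integer failure in miniature.

        #include <stdio.h>

        /* A hypothetical C-style interface: the payload crosses the
           boundary as void*, so all type information is erased. */
        typedef void (*handler_t)(void *payload);

        static void print_as_int(void *payload)
        {
            /* The compiler cannot check this cast; if the sender
               actually passed text, this silently misreads the bytes. */
            printf("%d\n", *(int *)payload);
        }

        int main(void)
        {
            handler_t h = print_as_int;
            const char *msg = "error";  /* sender thinks it sends text   */
            h((void *)msg);             /* compiles cleanly, prints junk */
            return 0;
        }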

      And Lisp machines became extinct when an HP Tru64 system ran a Lisp VM faster than the dedicated hardware. Of course the VM had less control over the metal. But with the Raspberry Pi Zero 2 running Lisp with full control of the hardware, Lisp machines might just make a stealth comeback in the embedded space. Either way, Lisp is Turing-complete, which means it can emulate any other machine.

      The most widely used language today is a Lisp dialect called JavaScript.

      1. MacroRodent

        Re: Interfacing to OS has to work at low level for maximum flexibility

        C is low-level in the sense that you can twiddle bits without resorting to libraries or non-standard features, and with good efficiency. And it does not require a complex run-time system to work. None of which is true of languages like Lisp, Java or JavaScript.
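
        A trivial sketch of what I mean - nothing beyond the standard language and headers:

          #include <stdint.h>
          #include <stdio.h>

          int main(void)
          {
              uint8_t flags = 0;

              flags |= 1u << 3;             /* set bit 3    */
              flags ^= 1u << 0;             /* toggle bit 0 */
              flags &= (uint8_t)~(1u << 3); /* clear bit 3  */

              printf("flags=0x%02x bit0=%d\n", flags, flags & 1);
              return 0;
          }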

        There aren't many C standards - only successive versions of the one standard (C89, C11, ...), as with any other standardised language. C leaves some things implementation-defined, which is in line with its low-level nature.

        It is "lowest common denominator", because it is implemented for almost any CPU architecture anyone cares for. Not true of any other langauge (Assembler does not count, because it is by its nature totally different for each architecture).

        ASN.1 is not a good comparison because it is explicitly designed for defining data for interchange between systems, and for nothing else. You cannot program in ASN.1. You also cannot use it to define an arbitrary data structure at bit level, because data defined in ASN.1 is BER- or DER-encoded in the implementation, which uses particular rules and metadata to ensure the receiving end can reconstruct the high-level data. By contrast, in standard C you can use the fixed-size data types from the stdint.h header to lay out your struct very precisely and portably. Really the only things you cannot define are endianness and padding, and the latter can be avoided by ordering fields of different sizes suitably.
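
        For example (a sketch only - the field names are invented, and the "no interior padding" claim holds for typical implementations, not by language guarantee):

          #include <stdint.h>

          /* Largest fields first, so 4 + 2 + 1 + 1 = 8 bytes with no
             interior padding on typical implementations. Endianness of
             the 32-bit field remains implementation-defined. */
          struct wire_header {
              uint32_t length;
              uint16_t msg_type;
              uint8_t  version;
              uint8_t  flags;
          };

          _Static_assert(sizeof(struct wire_header) == 8, "unexpected padding");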

        JavaScript is not Lisp by any stretch. It lacks the key innovation in Lisp: using the same simple and elegant representation for both data and code.

        (Hmm, looks like I disagree with almost all your statements).

  34. Paddy
    Stop

    Moaning Minnie

    Get over yourself. Deal with the world you find rather than the world you wish for.

    There are many reasons that are well known that lead to C's current position.

    Show you can do better and people will beat a path to your language.

    1. dafe

      Re: Moaning Minnie

      Have you heard of Rust?

      1. Someone Else Silver badge

        Re: Moaning Minnie

        Yes... heard of it. Haven't seen that "beaten path" yet.

        ...and neither has Aria, which was the reason for their whine.

  35. Citizen_Jimserac

    I programmed in "C" for nearly 10 years, from 1984 til the mid 1990's.

    As is the case with human languages, of course there are ambivalences, differing assumptions and all the rest that goes with such an endeavor.

    I retired from software development in 2004. I've been waiting, expecting better languages to be invented to dramatically improve the accuracy and productivity of software engineers.

    It hasn't come. Rust and Swift suffer from similar problems. They inherit too much from C while focusing on remedying deficiencies instead of developing new algorithms and paradigms to make programming more like English, with all the mechanisms necessary to interpret and question ambiguities.

    We do not think in for-loops and procedures and if-then-elses.

    Sooner or later someone will come along and take a radical new approach and then software development will leap forward.

    Until then, people will remain stuck in the quagmire of enforced straitjacketed logic, structural protocols and byte minutiae.

    Old Dijkstra was part of the problem, attempting to enforce a totally logical stepwise refinement in a world where things interact chaotically and grow by unexpected mutations. His "A Discipline of Programming" is a thing of beauty. He wouldn't have lasted a month as a programmer.

    1. fg_swe Bronze badge

      Functional ?

      First, the computer is a machine. To work correctly, it must be precisely controlled. You *could* instead "train an AI", only to find out the AI has some very funny behaviour in edge cases.

      For example, you can train an AI to drive a car and it will work on 364 days of the year. On day 365 it will encounter an untrained scene and crash spectacularly.

      Regarding loops, branches, instructions and function calls - they are the essence of imperative programming. Like a certain sweetness and acidity (and more) we know as apples.

      Then there are other fruits such as functional programming with much more mathematical expressiveness. Think of the sweetness and acidity (and more) of an orange.

      Some programming problems call for imperative, while others are best solved using functional languages. One day you want to eat an apple, the next day you prefer an orange. That does not mean oranges are categorically better than apples.

      Also, there are many more fruits such as logic programming. They have their niches, too.

  36. anonymous boring coward Silver badge

    Well, that was a load of irrelevant nonsense. About 20 years late as well. Sounds like some academics who don't know how to program.

  37. throwItAwayNow

    CLT

    Assume C is a real programming language.

    Since my language is the best language, anyone offering evidence to the contrary is thus a bigot (structurally) whose objections arise from self-referential hegemonic discrimination.

    Therefore, by contradiction, C must not be a real programming language.

    QED

  38. itsborken

    Rust can go its own way, develop its own interface to the hardware, and even develop its own OS to its own specifications if its people want to be purist. Time will tell if the approach is worth the effort. Stop bitchin' and start doing.

  39. Rich 2 Silver badge

    What a load of…..

    “for almost any program to do anything useful or interesting, it has to run on an operating system…”

    Well that’s bollocks!! Ask anyone who develops "bare metal" (I hate that phrase, but still) systems. You almost certainly own a few of them - all doing "useful" stuff.

    1. StrangerHereMyself Silver badge

      Re: What a load of…..

      I write lots of stuff for bare-metal as well, but what they're saying is basically true.

      People expect more and more functionality these days, and after a while it becomes too unwieldy to write all of it yourself. That's when people started toying around with Linux and other OSS operating systems.

  40. trindflo Bronze badge
    Meh

    What C has that no other language does

    C is basically a macro assembler. Someone who is proficient at writing assembler macros can throw something together that very closely resembles C.

    This is its enduring feature. It can potentially do anything, and at maximum efficiency.

    Of course it isn't safe. The command to shoot yourself in the foot is very straightforward. It can also shoot your foot as quickly as the machine is capable of.

    For better or worse, computer customers buy features and speed, not security or safety. C is how you get that performance, and access to features that are not preconceived in your higher-level language of choice.

    1. fg_swe Bronze badge

      FALSE

      The macro processor of C is just one aspect of the language. There is a modicum of type safety from a very basic type system in the C language; it just is not as comprehensive as it should be. Too many "undefined behaviour" cases.

      Note that in C++ you can use a powerful macro processor to replace the convoluted STL system - and its crazy error messages.

      E.g. use m4 to generate/instantiate container classes on the hard drive. If you have a bug, you will get concise error messages inside the generated code. Much easier for a human to understand. If you are a masochist, you can even use the cfront macro processor to perform this.

      One could even perform generic programming in C using this approach (e.g. a typesafe generic_sort() instead of the void* abomination), as sketched below.
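
      A rough sketch of that idea, here using the C preprocessor rather than m4 (the m4 variant simply writes the same kind of instantiation out to disk); the names are made up:

        #include <stdio.h>

        /* Instantiate a type-safe sort per element type by token
           pasting, instead of funnelling everything through void*
           and untyped qsort()-style comparators. */
        #define DEFINE_SORT(T)                                         \
            static void sort_##T(T *a, int n)                          \
            {                                                          \
                for (int i = 0; i < n; i++)                            \
                    for (int j = i + 1; j < n; j++)                    \
                        if (a[j] < a[i]) {                             \
                            T tmp = a[i]; a[i] = a[j]; a[j] = tmp;     \
                        }                                              \
            }

        DEFINE_SORT(int)     /* generates sort_int()    */
        DEFINE_SORT(double)  /* generates sort_double() */

        int main(void)
        {
            int v[] = { 3, 1, 2 };
            sort_int(v, 3);  /* fully type-checked: no casts anywhere */
            printf("%d %d %d\n", v[0], v[1], v[2]);
            return 0;
        }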

      1. fg_swe Bronze badge

        Generic Programming Using the m4 Processor

        http://gauss.ddnss.de/GenericProgrammingWithM4.html

    2. StrangerHereMyself Silver badge

      Re: What C has that no other language does

      This is exactly the reason for the mess we're in: lots of developers believing that maximum speed is a must... when it isn't. Most users aren't going to notice a couple of percentage points of slower program execution.

  41. drankinatty

    To write "C isn't a programming language" is like Putin saying "I won't invade Ukraine..."

    You see all types of hype pushing the latest Rust or Swift, where each claim needs to be more spectacular than the last to draw attention to the hoped-for new programming paradigms. Like Donald Trump saying "Fake News" or something equally moronic.

    The reality is that C remains an incredibly elegant low-level language, but it doesn't come with training wheels. That's why it is so blisteringly fast, but it requires you to know how to program and to validate every byte of your memory usage. Good, experienced programmers can. If you can't, you have no business programming in C, and you may as well go find a language that comes with training wheels to save you from your own shortcomings.

    1. fg_swe Bronze badge

      NOT

      According to your assertion NOBODY should write C code, as we are all fallible. The best software engineers write a bug now and then. The most advanced static checker tools, unit testing, module testing, HIL testing and valgrind runs cannot find all of the bugs. We have seen them in the Linux kernel, in VxWorks, in Windows kernel and user mode, and in loads of application-level programs developed by seasoned experts.

      The first time the Unix userland tools were run under valgrind was a revelation: hundreds of bugs which had existed for decades in millions of actively used systems.
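
      As a toy illustration of the class of bug meant here (file name hypothetical): the program below runs to completion and appears perfectly fine, yet writes one byte past its allocation.

        /* overrun.c - compile with: cc -g overrun.c -o overrun */
        #include <stdlib.h>
        #include <string.h>

        int main(void)
        {
            char *buf = malloc(8);
            if (!buf)
                return 1;
            strcpy(buf, "overrun!"); /* 8 chars + NUL = 9 bytes into 8 */
            free(buf);
            return 0;
        }

      Running it as "valgrind ./overrun" flags an invalid write of size 1 just past the end of the 8-byte block - exactly the kind of long-latent bug those first valgrind runs surfaced.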

      Here is a list of the memory safety issues of C: http://sappeur.ddnss.de/SappeurCompared.html

      I would argue that you are mostly correct: the amount of C code humans write by hand should be very limited. There are NO perfect software engineers. There are only code generators (such as the Sappeur compiler) which come very close to memory-safety perfection.

  42. Anthropornis

    How did I miss this ?

    It seems to me that Vulture Central's front page has got a lot worse recently - I usually look 4 or 5 times a week, but I completely missed this until it got mentioned on Slashdot.

    Hey, anyone else remember when Slashdot used to be good?

    1. jake Silver badge

      Re: How did I miss this ?

      Slashdot still exists!

      Who knew?

  43. This post has been deleted by its author

  44. JoeCool Bronze badge

    Religious. Screed.

    This doesn't even read like reporting; it reads like a rehashed PR or HR statement.

  45. AndrueC Silver badge
    Facepalm

    What a load of twaddle.

  46. Binraider Silver badge

    The need for C interfaces to talk to the OS is one of the best arguments for a complete ground-up OS rewrite in something independent of the legacy issues around data types and structures.

    It's no different to migrating a legacy database to a new system - keep the old structure, keep the old problems.

    And to be fair, consider the origins of Linux: something post-MINIX, written for the i386 to take advantage of the hardware.

    A Rusty OS is one way to achieve that. And there are some, in various states of development.

  47. StrangerHereMyself Silver badge

    Systems Programming Language vs. Application Programming Language

    I've stated this many times, but C is a Systems Programming Language and not an Application Programming Language. The problem isn't C; it's that we're using it as an Application Programming Language.

    Some obnoxious people simply can't live with the fact that their programs run even a trifle slower than C. They keep forgetting that the price for this speed is a lack of safety and security, undefined behavior, and obscure bugs which have cost millions of man-hours to fix.

  48. Abominator

    C is the white hero male of the programming world

    And everyone has a downer on the guy.

    It's just that it was used to build the modern world, but haters gonna hate.

  49. Anonymous Coward
    Anonymous Coward

    Another thing that isn't language

    Another thing that isn't language is referring to a known individual as 'they' in the context that's used in the article. People are either he or she. This particular person is male and therefore 'he'.

  50. DJ
    Coat

    I thought everyone knew this...

    https://www.cs.cmu.edu/~jbruce/humor/unix_hoax.html

    (Surely not the original posting, but you get the idea)

    Mine's the one with a first edition of K & R in the pocket.
