Dennis Ritchie: The C man who booted Unix

It was 1968. Students and workers were on the march, protesting against the Vietnam War, and the western world seemed to teeter on the brink of revolution. In Murray Hill, a sleepy, leafy New Jersey suburb, a young maths and physics graduate was laying the groundwork for an entirely different revolution. For …


This topic is closed for new posts.
  1. Anonymous Coward
    Anonymous Coward

    A true innovator

    ... and a real "genius". One that Michael Bloomberg probably won't have heard of.

    Goodbye Dennis.

    1. Stephen Channell


With much of the Bloomberg platform written in C and Fortran from the early days, it is likely that Michael Bloomberg knew he was using K&R C (before ISO standardised it).

      1. Anonymous Coward
        Anonymous Coward

        In my experience ...

        ATC staff don't know or care what language Air Traffic Control systems are written in. Just so long as they work.

        Most bank staff don't know or care what technology underpins their daily working lives (and I can guarantee that the bank board doesn't either).

        So why Michael Bloomberg should, I completely fail to understand.

Is/was he a coder before he was a politician? No. It's like saying Reuters managers know anything about the internals of the Reuters system.

        Fail to see your point.

    2. Anonymous Coward
      Thumb Up

      Agreed, a quiet hero in my mind.

  2. Admiral Grace Hopper


    For all the fun that you made possible with the tools that you gave us, thank you Dennis.

  3. Peter Simpson 1

    Torvalds and Ritchie?

    Does anyone know if they ever met? I'd like to hear what they had to say to each other...

    RIP, dmr, and thanks for all the C!

  4. Steven Roper

    Thus passes one of the true gods of computing

    Along with the likes of Alan Turing and William Shockley, Dennis Ritchie was one of the founding fathers of modern computing, and I was saddened to learn of his passing. Without his genius and insight, we would not have home and office computing today. He will be sorely missed by the IT community. Condolences to his family, may the great man rest in peace.

    1. Charles Manning

      The Hero Inventor myth

      These blokes were indeed heroes and great contributors to the industry and their input should be recognised.

      It is a complete myth that the industry would not have happened without them. The truth is that there were many parallel threads of development and if Ritchie had never been born his work would still have been performed by others. C is an Algol-family language and there was a rich primordial soup of languages and OSs.

      The same myth is encountered all over the place. If the Wright Bros had not invented powered flight we'd still be using ships to get from USA to Europe. Well no, there were many parallel threads working on powered flight. Wright Bros just scooped the glory. The same applies to much of Edison's work, Intel and the 4004 and so on.

What gave us Unix and C was not so much Ritchie as the legal mechanism that forced Bell to release the code to universities. If that had not happened we'd probably have ended up on a very different trajectory than Unix/C. Perhaps PASCAL/Modula2 based, perhaps something else, and quite likely something without many of the pitfalls of C.

      So if we really want to give thanks for C and Unix then perhaps - as much as it galls us - we should also thank the lawyers.

      1. Anonymous Coward
        Anonymous Coward

        We all stand on the shoulders of giants.

Everything anyone ever did depended on what those who went before built for them, but I was disgusted when Steven Spielberg described Steve Jobs as "the greatest inventor since Thomas Edison". He didn't wield a soldering iron or write code; he was a CEO who hired teams of highly skilled engineers to invent things for him. Dennis Ritchie at least actually built stuff with his own hands, and therefore deserves that much more acknowledgement for his achievements.

        1. andy 103

          His own achievements

          "at least [Ritchie] actually built stuff with his own hands and therefore deserves that much more acknowledgement for his achievements."

          Well if you're using that analogy then Mark Zuckerberg also deserves a lot of credit.

          The difference between Ritchie and people like Jobs (or Zuckerberg) is how much they care about making the sound of their own voices heard. Some people like to just create great stuff and don't do it for glory or credit.

          It's quite sad that many people don't even know who this guy was but also it's a compliment to how little he gave a shit about "marketing" himself!

        2. RightPaddock

          I think it was Spielberg who, a few weeks after 9/11, said the US should use its soft power and beam Seinfeld into the yurts of all the Taliban in Kathmandu.

          Shortly before 9/11, most of the Nepalese royal family was assassinated by one of their own number. The day before Spielberg's asinine suggestion, the last remaining Nepalese princess died in a helicopter crash. At the time, Mongolia had suffered several years of drought and very harsh winters; as a consequence most of the grazing stock died of starvation, forcing the herdsmen to move into the towns and cities.

          The Taliban don't live in Kathmandu, the capital of Nepal. The Taliban live in Afghanistan, the capital of which is Kabul; nor do they live in yurts. In fact no one in Kathmandu lives in a yurt, nor in Afghanistan. But 1,800 miles away, Mongolian herdsmen do live in yurts when they're out on the steppes, except when all of their animals are dead.

          So what the f**k would Spielberg know about anything.

          God bless you Dennis Ritchie - I wanted to write you an obit, but seeing Spielberg's name in this context made me mad. We met once, in Boston I think - long time ago; Ken was there too, not sure about Brian.

          Rest in Peace brother - I'll always treasure what you and your colleagues Ken T & Brian gave to me and so many others

      2. Destroy All Monsters Silver badge

        >Wright Bros just scooped the glory.

        Then they patented it and got left in the dust.

      3. Jim Morrow

        no myth

        you couldn't be more wrong charles. dennis ritchie was a hero inventor twice over and you disgrace his memory.

        yes, if it wasn't for the lawyers, unix and c might not have made it into academia just as cs departments were buying computers for themselves. however dennis's inventions -- as well as ken thompson's and the others at bell labs -- just would not have caught on if they weren't any good. unix and c prevailed because they were (and still are) light-years ahead of the alternatives.

        those inventions are still going strong today ~40 years later. which is more than can be said for pascal and modula-2. c might not be as pure as those languages but when there's real work to do, c gets the job done time and time again. which is why it is so heavily used and the likes of modula-2 isn't.

        most decent operating systems and programming languages in use today are a direct result of dennis ritchie's genius. many of those that aren't are heavily influenced by his work.

        now maybe someone else might have invented unix. or c. but they didn't. dennis did. well, ken thompson shares the credit for unix. you would do well to remember that and be thankful for the wonderful platform they gave to the world.

        1. Field Marshal Von Krakenfart

          "those inventions are still going strong today ~40 years later. which is more than can be said for pascal and modula-2."

          I just wonder though, is the popularity of C as much to do with the popularity of products like Borland Turbo C and MickySoft C and their access to the windoze API?

          1. This post has been deleted by its author

          2. rciafardone


            Most stuff made with C doesn't even run on Windows, or even have an interface for that matter. People seem to forget that almost everything with a chip also has some programming burned into a ROM somewhere, and C is the most common language for writing that stuff. C is #2 on TIOBE, but I wonder, if a more specific count were done on these small gadgets, whether it wouldn't jump to #1 by a whole order of magnitude.

      4. Gaius


        And Al Gore

      5. Tom 13

        First, never thank lawyers for anything

        it only encourages them to multiply, and we've got enough of the parasites already.

        Second, whether or not someone else would have done something at the time is not what makes a leader a hero. The question is whether or not someone else could have done as well or better than they did it. When that leader consistently turns in exceptional results, they are indeed a hero, regardless of what else was percolating at the time.

  5. Reticulate


    I am no expert, but surely saying "Linus Torvalds announced his project of writing an open-source clone of Unix from scratch in 1991" is a bit misleading? A kernel, yes. But a great deal of the rest was GNU - notably the C compiler. I can see that Stallman has not made any friends in the past week, but that doesn't justify airbrushing GNU out of Unix history like Trotsky.

    1. Anonymous Coward
      Anonymous Coward

      Richard? Is that you?

    2. jonathanb Silver badge

      "Hello everybody out there using minix -

      "I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I’d like any feedback on things people like/dislike in minix, as my OS resembles it somewhat (same physical layout of the file-system (due to practical reasons) among other things).

      "I’ve currently ported bash(1.08) and gcc(1.40), and things seem to work. This implies that I’ll get something practical within a few months, and I’d like to know what features most people would want. Any suggestions are welcome, but I won’t promise I’ll implement them :-)

      "Linus (

      "PS. Yes – it’s free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that’s all I have :-(."

      He did announce just that, and started with the kernel. After he got the kernel working, other people ported the GNU system to work on it - GNU had pretty much everything working except its Hurd kernel, which even today isn't considered ready for production use.

    3. Daniel B.


      Every time you run a GNU tool, it'll happily show the GNU credits. It isn't Linus' fault that the GNU project stalled short of releasing the full GNU OS.

    4. Don Mitchell

      Ritchie > Stallman

      One of the reasons Ritchie is not more famous is Stallman's hatred for the Bell Labs group and the practical suppression of pre-Torvalds UNIX history. Remember, GNU stood for "GNU's Not Unix". They hated UNIX/C because it came out of a corporate research lab, and most of all because it eclipsed the TOPS-10/LISP culture that the MIT AI Lab dominated in those days.

      Yes, part of UNIX's success was that it was free and open source. But it was also elegantly simple. I worked at Bell Labs Research in the 1980s and knew Dennis and Ken. I heard horror stories about Multics, how it was so complex and inefficient it could only "time share" two users at once. Ken came off that project determined to do something completely different. Simplicity is hard to achieve, and a lot of thought went into UNIX and its (now long-lost) philosophy of composing simple tools to perform complex tasks.

      There was a long rivalry between MIT and Bell Labs about this question of design simplicity. The hacker culture had a macho attitude of "my code is bigger than your code", while Ken and Dennis spent hours trying to boil things down to the fewest lines of code and the fewest necessary features.

      I remember one attempt to reconcile. The UNIX team invited Stallman to visit Bell Labs. I don't recall much about Stallman's talk, but everyone remembers that he picked his nose and ate a booger in front of everyone. In return, Rob Pike was sent to give a talk at MIT, which he was not able to deliver, because Stallman and his friends heckled him.

      1. rciafardone

        Stallman is crazy...

        But for some reason I like him. I think we need a crazy bastard like him to counterbalance the crazy bastards on the other side. DMR was in the middle, not because he compromised, but because he wasn't crazy.

        RIP DMR

  6. Anonymous Coward
    Anonymous Coward

    strcat smash DATA

    or at least cause a segfault for writing to RODATA. And a number of other nitpicks. But as it's an obituary, I'll leave them be.

    One thing I will say: as so often, history needs its actors to act, or something else entirely will happen. Anyone smart enough could've made similar improvements, but it needed someone to do it there and then. Besides, you never know in advance where you'll end up, and in hindsight it often looks far more obvious than it did at the start. In fact, if it doesn't look obvious (to you) afterwards, you may be doing it wrong. Both because elegance tends to be inherent (to quote another great man, "everything should be as simple as possible, but no simpler") and because you usually want to stay down to earth and get off to the next adventure. If everyone else falls over themselves exclaiming they could or could not have done it, well, that's just people.

    None of that changes that dmr left us a few very useful things, and he will be missed all the more for it.
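    To spell out that nitpick (a minimal sketch of my own, not from the article): string literals commonly live in read-only storage, so strcat-ing onto one really would smash DATA or segfault. The safe pattern is to write into an array copy instead:

    ```c
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* Writing through a pointer to a string literal is undefined
           behaviour - on most systems the literal sits in RODATA and
           the write segfaults. */
        const char *literal = "Dennis Ritchie";   /* do NOT modify this */

        /* An array initialised from a literal is a writable copy. */
        char copy[] = "Dennis Ritchie";
        copy[0] = 'd';
        printf("%s\n", copy);    /* prints "dennis Ritchie" */
        (void)literal;
        return 0;
    }
    ```
    
    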

  7. Doug Reed


    This man made a bigger contribution to the world of computing than Steve Jobs

    Fact: Apple's OS and toolset all came from Ritchie. Tablets, mobiles, mainframes: all use Unix-like operating systems and toolkits.

  8. TRT Silver badge

    Another loss of another pioneer.

  9. Sean Baggaley 1

    "AGOL"? What is this? The Grauniad?

    It's sad to see another of that pioneering generation heading off to the great operating system in the sky. However, I take issue with one part of this obituary: C is most emphatically *not* a "high level" language! I don't think Dennis Ritchie himself would thank you for referring to his "portable assembly language" as such.

    Also, the history of UNIX through the ages seems more than a little revisionist: BSD Unix (which really *is* UNIX) and the BSD-derived NeXTStep were already available by 1991, long before Torvalds' kernel made it out into the wild as an integral part of a viable operating system.

    I'd also disagree violently that Linux did anything to spread UNIX: Linux is no more "UNIX" than a Compaq PC *clone* was an actual IBM PC.

    Linux is a *UNIX-like* operating system, but it is not itself a bona fide version of UNIX, and it has therefore done more to *reduce* the use of Thompson & Ritchie's UNIX operating system and its later releases than even Microsoft have managed. Apple's BSD-derived OS X has probably done far more for UNIX's popularity in the consumer space, as BSD really is a direct descendant of T&R's UNIX and not a clone.

    This is analogous to Compaq's reverse engineering of the original IBM PC's BIOS, opening up the market for compatible IBM PC *clones*. Linux is a clone of UNIX. It is a "UNIX-compatible" OS.

    1. Destroy All Monsters Silver badge

      Yes, yes ---

      It quacks like UNIX®, walks like UNIX®, has roughly a Posix interface, and slimy CEOs controlling a UNIX® use it for shakedowns for being UNIX®.

      It doesn't have the "historicity", but good enough: it's a Unix.

      (UNIX® is a registered trademark of The Open Group.)

      1. Anonymous Coward
        Anonymous Coward

        @Destroy All Monsters

        Following your logic - my Windows 2003 box running Services for UNIX 3.5 is UNIX.

        No, it's not and Linux isn't UNIX either.

    2. Anonymous Coward
      Anonymous Coward

      > "AGOL"? What is this? The Grauniad?

      They've "fixed" it. Now it says "ALGO".


    3. Peter Gathercole Silver badge

      @Sean Baggaley 1

      "BSD Unix (which really *is* UNIX)" - not in the most pedantic sense.

      A lot depends on your definition. BSD (which was originally a series of add-ons and modifications published in source) became a full OS distribution, split from Bell Labs UNIX around version/edition 6/7, and was never re-integrated (although SVR2/3/4 all added BSD features back into the AT&T-sourced versions).

      As a UNIX pedant, I would say that BSD is *NOT* UNIX. Remember the lawsuit that forced AT&T code to be removed from BSD, leading to BSD/Lite, FreeBSD and BSD/386; given that, it is difficult to justify the claim that BSD is UNIX.

      By comparison, HP/UX, AIX, Xenix, UNIXWare/SCO UNIX, Altix, SINIX and many more were derivatives of AT&T code, and passed UNIX branding tests, so could legally be called UNIX.

      I accept that a lot of people from outside the Bell Labs/AT&T world may well have seen BSD before any commercial version of UNIX, and may well have referred to BSD as UNIX before AT&T got commercially sensitive about the UNIX brand, but that does not alter the fact that it was a very early fork of UNIX which never gained UNIX branding. I am not arguing that BSD is no good, because clearly it is good, but that its claim to be UNIX is subject to interpretation.

      As far as I am aware, the only BSD variant that passes any of the UNIX compliance test suites is OSX!

  10. Anonymous Coward
    Anonymous Coward

    A man who...

    ... deserves complete admiration and respect. Ritchie's contribution to computing is far more significant than that of Steve Jobs (yes, I realise I am comparing apples to oranges here (no pun intended)). It is a pity that Ritchie's passing will be overshadowed by Jobs'.

    1. Anonymous Coward

      Must be one of the very few occasions Jobs got his 'product' to market first.

  11. Zippy the Pinhead

    Truly a man who revolutionized the computer industry

    And one who had a much greater impact than anything Apple/MS related. Yet I don't see obits on the news, and no fanbois crying and lamenting his passing.

  12. Anonymous Coward
    Anonymous Coward

    Saw him speak at the largest UNIX meeting in the world

    Or was it Thompson? Anyway, I think it was Ritchie - Massey University, New Zealand, about 1985. He worked out that, per head of population, it was the largest UNIX gathering he had attended. I asked him what, in his opinion, UNIX's biggest defect was; he answered, "No concurrency", while crouching to feed the ducks. Asked during the main Q&A session why UNIX commands had such cryptic names, he said it was to make sure they did not get confused with anything else. Very approachable, and he fitted the image of beard, long hair and sandals exactly.

    By the way, B may come from Bell Labs, but it actually derives from BCPL, from the University of Cambridge (England). The old question was: what would the successor to "C" be called, "D" or "P"? As for Algol, so casually dismissed: it was a decent language that influenced Pascal, Ada and Modula, and was not without influence on C++ and Java. I believe Steve Bourne used Algol ideas, if not Algol itself, for his first Bourne shell implementation.

    1. Anonymous Coward
      Anonymous Coward

      Steve Bourne

      Before he went to Bell Labs, Steve Bourne got his PhD at Cambridge, where he worked on an Algol68C compiler. The similarity of some constructs in the shell to those of Algol68 is not accidental! (Both case...esac and if...fi come from Algol68. Loops would have had do...od if "od" weren't already an octal dump.)
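      That Algol68 inheritance is still visible in any /bin/sh script: compound statements close with their opening keyword reversed. A small illustration of my own (not from the comment):

      ```shell
      #!/bin/sh
      # Algol68-style reversed closing keywords in the Bourne shell.
      lang="C"
      if [ "$lang" = "C" ]; then
          echo "ritchie"
      fi                          # 'if' closed by 'fi'
      case "$lang" in
          C) echo "bell labs" ;;
          *) echo "elsewhere" ;;
      esac                        # 'case' closed by 'esac'
      ```
      
      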

  13. Destroy All Monsters Silver badge

    "C paved the way for object-oriented programming ... Today, C and its descendent C++ are a popular choice for building operating systems"

    That should be

    "C was refitted with object-oriented programming features, yielding something not unlike a gaily decorated truck out of Lahore ... Today, C and its descendent C++ are a popular choice for shooting oneself in both feet trying to build operating systems"

  14. blade-runner

    ASCII art candle ?

    Will there be an ASCII art candle vigil? Somehow I doubt it.

    Funny how the media are not shouting to the world about this compared to the recent Apple announcement.

    1. Owen Carter
      Thumb Up

      Done on a line printer I would hope ;-)

      An ASCII art candle vigil

      - that put a smile on my face, thanks.

      Not attempting to post one here due to a) copyright and b) no monospace.. but a good collection at:

  15. 4.1.3_U1


    This is a great loss. In many ways this touches me more than the death of Steve Jobs last week.

    Must find K&R books on shelf. Must not give up. Repeat.

  16. chr0m4t1c

    Six flavours of unix?

    You're joking, right?

    Six flavours doesn't even come close; we had that many flavours just on one site - which was particularly odd, as we only had machines from two vendors.

    In fact the old Siemens-Nixdorf T35 actually had two flavours on one machine both available at the same time; you could just switch between the two with a simple command once logged in. (No, there was no virtualisation involved.)

    Fun times.

    1. Anonymous Coward
      Anonymous Coward

      Only 2?

      The Pyramid RISC had "universes" of different UNIXes - as many as you like.

      1. RightPaddock

        God, I'd clean forgotten about Pyramid - what happened to them? I know I worked on one; a money market system, I think.

        Memories long past are fading,

        But in flesh or in spirit Dennis,

        Ken & Brian always burn brightly

        Everlasting into the long dark night

        Thanks to Dennis R, respects to Ken T & Brian K.

  17. Anonymous Coward

    Oh boy....

    Ritchie was the man. He was not a showman like Jobs. He was the real deal in terms of important impact on the IT industry. A genius. RIP

  18. BoredWitless

    A real innovator and one who genuinely changed computing, but whom hardly anyone has heard of - or the megalomaniac control freak Jobs. I know which one I feel the field is the poorer for losing.

  19. Graham Bartlett

    Painful architecture

    "and oh by the way those aren't pointers to bytes, they're pointers to words"

    I met this on a microcontroller, not too many moons ago. And lo, there was much wailing and gnashing of teeth, and darkness was on the face of the softies who had to deal with this f**king stupid idea.

    1. Gideon 1


      Welcome to the wonderful world of embedded processors. We use a new and different architecture in every second or third project. If you don't like it, stick to desktops.

  20. Anonymous Coward
    Anonymous Coward

    If one were to compare the recent losses in technology to certain fictional figures, I would not consider it unfair to say that:

    Dennis Ritchie == Jesus

    Steve Jobs == Daniel Plainview

  21. Chris Gray 1


    dmr--; /* :-( */

    1. nyelvmark

      Chris Gray 1

      Surely that needs to be --dmr; to avoid temporal paradoxes?

      1. Anonymous Coward
        Anonymous Coward

        No, dmr-- is correct.

        We remember him as he was, not as he is now.
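        For anyone rusty on the distinction these jokes turn on, a minimal sketch of my own: dmr-- yields the value before the decrement, --dmr the value after.

        ```c
        #include <stdio.h>

        int main(void)
        {
            int dmr = 70;            /* dmr was 70 when he died */
            printf("%d\n", dmr--);   /* post-decrement: prints the old value, 70 */
            printf("%d\n", --dmr);   /* pre-decrement: prints the new value, 68 */
            return 0;
        }
        ```
        
        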

      2. byrresheim

        Sorry to pick nits but:


        1. Trixr

          Are you trying to say the plural of paradox in English is "paradoxa"?

          It isn't.

  22. Mark McGuire

    Unix V6

    Right now I'm taking a course focused on Unix V6 for the PDP-11. When you look at its source you can feel the innovation radiating off it. Tracing the start-up procedure is a one-of-a-kind experience.

    Sad to see him go.

    1. jphb

      Oh happy memories. My first escape from mainframes, Cobol and Fortran (and ICL's George 3) was Unix V6, PDP/11s and C: happy hours reading the source code, and eventually fixing two bugs in the kernel source. Gosh, I could read the source code of an operating system - and understand it!


  23. Anonymous Coward
    Anonymous Coward

    I was introduced to him through a family friend; he was a quiet, deeply insightful and very religious man. A great mind and a great man has been lost.

  24. Wile E. Veteran

    The passing of a true genius

    As a genuine old-timer, I remember when C was first released; the first thing I did was run out and buy the K&R C book, and later the K&T "Programming in Unix" book. I used K&R C for various projects and learned a lot from it. Far more elegant than assembler, although some wags referred to C as "Portable PDP-11 Assembly Language". No matter - it was powerful, elegant and highly portable. C on an 8080, anyone? I used it.

    The "cryptic" commands in Unix were strongly influenced by the fact that the Teletype Model 33 was far and away the most common terminal connected to Unix systems. For the relative newbies: the 33 was effective but a genuine PITA to type on, and very slow, limited by its mechanical operation to 300 baud, max. Anything that could be done to lessen the amount of typing was a Really Good Thing; the highly shortened commands made using Unix a whole lot more pleasant.

    dmr truly shaped the world of computing, and impacted that world far more than the greedy, megalomaniac control freak who started that company named after a fruit.

    Requiescat in pace, dmr, and condolences to the family and to the millions of users of your inventions. May your legacy and humility be remembered forever.

    1. fighne

      Teletype 33

      They used to make my fingers ache after about an hour typing on them!

  25. Number6

    And the Third...?

    Given that these things always seem to come in threes, who's next? Anyone who's contributed in a big way to computing had better be really careful crossing the road and avoid aircraft.

    1. Tom Maddox Silver badge

      Stallman. Killed in the study by an irate fanboi with a hurled iPad.

    2. Anonymous Coward

      Can someone please go round to Donald Knuth's house and check he's ok?

      This has been a bad week for geeks.

  26. Mage Silver badge

    Not a true High Level Language

    Really, it was exactly what it was designed to be:

    A Machine-Independent Macro Assembler.

    Hence the HUGE dependence on Makefiles and #define.

    C was a great achievement in the climate of the early 1970s.

    What did he do for the next 40 years though?

    <<If one was to compare the recent losses in technology to certain fictional figures then I would not consider it to be unfair to say that.

    Dennis Ritchie == Jesus>>

    Actually you can argue whether Jesus was a deluded man, or God, or whether his followers made stuff up. But there is no shortage of historical evidence that he was a real person.

    1. Jim Morrow

      what did dennis ritchie do

      > What did he do for the next 40 years though?

      mage, you are showing your ignorance. after inventing unix and c, dennis ritchie co-wrote one of the major computing textbooks. he developed unix v8, which sadly wasn't allowed to leave bell labs because it was a threat to at&t's commercial but shit unix offering. he invented streams, which at&t turned into a train wreck for system v. he spent years getting the c language standardised. i think he was involved in the posix standards work too. he then went on to develop plan9 and inferno. if that wasn't enough, dennis ritchie led the computer science research team at bell labs, one of the world's most prestigious research institutions.

      btw, that michelangelo bloke did fuck all after painting the sistine chapel. what more did he need to contribute?

  27. Anonymous Coward
    Anonymous Coward

    Scratch "New York"

    Murray Hill is in New Jersey.

    My high school was a few towns away, and I had a teacher who worked at Bell Labs during the summer.

    She got us a tour of the Labs (1971-72), and I had a chance to talk with a gentleman who showed us the little system he was working on (IIRC he said it supported 4 users). I still remember the PDP computer and Teletype 33 at the end of the table. Wish I had taken a few sheets of output!

    1. Chris Miller

      Murray Hill is in New Jersey, but it's a suburb of the City of New York. No-one commutes from there into Trenton.

  28. Ilgaz

    Some will say Plan 9 failed etc.

    The issue with Plan 9 is that it is way ahead of even today's common technology. Nothing (yet) exploits its amazing features and design. It is more like trying to use UNIX on an Altair home computer.

    Once the technology catches up with these guys' vision, we will hear something like "Apple considering a move from Mach/UNIX to Plan 9 at the OS level", and nobody will be that surprised; some will even say "about time".

    I speak about stuff like seamless roaming between devices (including car computer) etc. and seamless distributed computing.

    For example, IBM happily uses Plan 9 on their Blue Gene monster.

  29. DV Henkel-Wallace

    Don't go overboard; hardly the first general-purpose language

    He was a titan, but don't go overboard: "General-purpose programming languages had not existed before C" is rubbish. Algol and Fortran preceded it by more than a decade, and Multics (which dmr worked on) was written in PL/I, as were its applications.

    C was / is a good implementation of the idea, based on learning from those experiences, as dmr himself would say.

    In other words: there's no need to denigrate the giants upon whose shoulders he stood.

    1. Anonymous Coward
      Anonymous Coward

      So did BASIC

      ... by a good 4-5 years

    2. This is my handle
      Thumb Up

      Yes, but ....

      It's true enough that FORTRAN, COBOL, BASIC and others pre-dated C, but C was different precisely because of the other exception (no pun intended) taken to this very nice article: C is not quite a high-level language (HLL), but straddles the ASM/HLL border. That's the good news - and the bad. In the right hands it is a thing of beauty (which could be said of Unix as well). As the guy who taught the C class I once took (not at uni; courtesy of an employer) put it: "It compiles itself." What he meant, of course, is that it's the sort of language you could easily write a compiler in. I wouldn't want to try that in any of the HLLs which preceded it, or those which came later. In fact, with a little help from the lex & yacc descendants, you almost wouldn't want to write a compiler in anything else.

    3. Michael Wojcik Silver badge

      Only one of a number of glaring errors in the article

      Yes, Clarke's "General-purpose programming languages had not existed" is a particular howler. Assembler, after all, is a "general-purpose" language (it's not a domain-specific one) when running on general-purpose hardware. Among higher-level languages that were going strong when C was invented and are still in widespread use you have FORTRAN, COBOL, LISP, BASIC, and PL/I (in historical order); plus others that have fallen into disuse since, such as ALGOL and C's own ancestors BCPL and B. And contrary to another commentator, C's supposed "high-level assembler" nature doesn't excuse this wildly erroneous claim.

      But that's far from the only false step in this piece.

      It's not true that the major general-purpose languages were meant to "lock virgin customers [or any other kind] in"; in fact, C exposed more implementation details to the application, and so C programs often required more porting effort. C did not "go visual" in any meaningful sense with Microsoft's Visual Studio, any more than it "went turbo" with Borland's Turbo C. Others have already noted how ridiculous the "at least six flavors of Unix" bit is.

      I don't even want to start on "It is a high-level procedural language that used a compiler to access a machine's low-level memory and to execute, so it can span different platforms". There's enough wrong there for half a dozen posts. Nor does C contain some magic inherent power to make programs fast, as Clarke claims; and I have no idea what he means by "programming with C is also accessible".

      I've written software in C (and for Unix) since the mid-80s, and I like them. Certainly dmr made some terrific contributions to the industry - even if you agree with the criticism a number of people have levelled against the use of in-band signaling in C (nul-terminated strings, printf formats, etc). I've been in Usenet conversations with him, and he always came across as knowledgeable, intelligent, reasonable, and friendly. It's a shame he's gone, and I wish the Reg could have done a better job with his obituary.

  30. Petrea Mitchell

    And yet, a less-revered part of the legacy...

    Allow me to be the first pedant to point out the sub-headline should be using strncat, to guard against buffer overflows.

    1. Chris Gray 1


      Raise your hands if you are aware that the length value "n" given to "strncat" says how many characters to take from the source string, and says nothing about the size of the destination buffer. So, it does not directly protect against buffer overflows. At least that's the way it is described on GNU/Linux, and that surprised me.

      1. Paul M 1

        ... unless you use sizeof(destination) or equivalent for your value of n.

        1. Anonymous Coward
          Anonymous Coward

          sizeof(destination) isn't enough

          It would need to be sizeof(destination) - strlen (destination) if the buffer isn't empty to start with.

    2. Anonymous Coward
      Anonymous Coward

      Without the definition of obit, that's hard to say for sure.

      char *ritchie (const char *text)

      {

      char *obit = malloc (strlen (text) + sizeof ("Quiet revolutionary"));

      strcpy (obit, text);

      [ subhead is entirely correct and valid here ]

      return obit;

      }


    3. diodesign Silver badge

      Of course, I should have used strncat()

      But strncat(obit->, "Quiet revolutionary", obit->text.max_length); is starting to get a bit unwieldy for a headline :-)

  31. mccp

    Visual C

    I don't get the Visual C reference. When I switched from RiscOS to Windows 3.1 back in the day, I had to leave Acorn's nicely designed visual tools behind and use the pathetic MS C Version 7. It was at least a couple of years before VC1.0 arrived, with tools that were less advanced than Acorn's.

    And don't talk to me about IDEs on Unices, I tried writing for Motif on DEC's Ultrix at the same time and I believe that's when I started to lose my hair.

    (Still writing portable C code that runs on any version of Windows as well as a bare metal embedded system (~400 SLOC today :)).

    1. Eddie Edwards
      Thumb Down

      What tools?

      I think you're seeing the past through rose-tinted glasses here. The only visual tool of note from Acorn was !FormEd, which came after VC++1.0, and when I left the scene in 1996, six years later than you must have done, there was still nothing approaching a decent IDE for C development.

      The C compiler sucked balls too.

  32. Will Godfrey Silver badge

    ... he was also a very nice person who cared about helping others, with great patience for newbies.

    There aren't many people like that, and now there is one less.

  33. Tanuki

    I never met you but you touched my soul....

    There's one less record in /etc/passwd tonight.

    [And Steve Jobs no longer has a giant's shoulders to stand on]

  34. Anonymous Coward
    Anonymous Coward

    Standing on the shoulders of giants

    We all are, gratefully. In the short life of computing, Dennis Ritchie was one of the very few who would qualify.

    RIP and thank you.

  35. Andus McCoatover

    Quote of the year?

    "(C is a poster child) for why it's essential to keep those people who know a thing can't be done from bothering the people who are doing it."

  36. Tanuki


    1. Peter Simpson 1

      I have the reprint of that

      Learned C and UNIX when I was at Data General. I was impressed by the apparent simplicity of the design (which is quite often a sign of careful crafting) and its obvious power. I was lucky enough to have a co-worker who had been at MIT and returned waxing eloquent about how UNIX was the future.

      All I needed to learn C was the K&R book. Later, I took a course in which I received the BSTJ reprint. Then I got a Sun workstation, discovered USENET and, shortly thereafter, Linux. It was all "downhill" from there.

      RIP, dmr.

      //there seem to be an awful lot of coats here with a copy of K&R in the pocket...

  37. Anonymous Coward
    Anonymous Coward

    "a humble and gracious man"

    yep, that's the one

    makes his intellect and achievements all the more admirable

    thanks for giving me the tools I didn't know I needed

    RIP dmr

  38. Mikel

    C is a beautiful thing

    Brilliant man. We could use more like him.

  39. John D Salt

    RIP a great man

    I studied at the University of Newcastle upon Tyne with Dennis' sister, Lynne. The family connection meant that we got Dennis to come and give us a talk on mechanisms of inter-process communication in Plan 9. So the department booked a medium-sized room, and it was too small: there were too few chairs, and people were leaning in from the corridors in order to hear Dennis put forward his ideas, which he did brilliantly -- full of humour, and clear enough for anyone to understand, the mark of a man who had totally mastered his subject.

    I treasure a brief e-mail exchange I had with him, when I asked if the word "grep" owed anything to the Manx Gaelic word for a fish-hook, which it seemed to me would be appropriate. Sadly, it turns out not. But the great man was not too grand to answer a random question from an unknown postgrad.

  40. KrisMac

    " all who have been touched in some way by Dennis..."

    ...this pretty much includes the vast bulk of the present generation of living human beings (certainly in the developed world)...

    The spread of computers into business, and their development into effectively personal household appliances, might have happened without the release of C and Unix, but I believe it would almost certainly have been a much slower and much more painful progression. The explosion of development that we saw in the late eighties and early nineties might still be waiting in the wings to happen...

    Rest well Dennis Ritchie... we all owe you a great deal

  41. Majid

    Rip my friend..


    1. Anonymous Coward
      Anonymous Coward

      No no no!

      Only exit code zero indicates success, any other value would mean he'd failed!

  42. Dave Barnhart


    I still have the original "The C Programming Language" by Kernighan and Ritchie on my shelf. And you'll have to be a total geek to remember and appreciate "Why Pascal Is Not My Favorite Programming Language", written by Brian Kernighan.

    1. RightPaddock

      Here 'tis - now you can be a total geek too

  43. Richard Porter


    I'm somewhat surprised that the obit made no mention of Brian W. Kernighan, Ritchie's co-author of "The C Programming Language" in 1978, which was the bible for C programmers in those days.

    Now correct me if I'm wrong - I'm sure you will - but as far as I recall Solaris was Sun's version of BSD. SVR4 was an attempt to bring together the two main threads of Unix: BSD and System V. I don't think it really took off, because Sun and HP carried on as before with Solaris and HP-UX.

    Now just in case you think cloud computing is something new, back around 1990 I was using Sun workstations. The software was installed on each workstation but all the data were held on fileservers so you could log onto any workstation on the intranet and access your own environment. And of course we had email, ftp and usenet. Mosaic arrived later.

    1. Chavdar Ivanov

      Sun's version of BSD was called SunOS, lived to about version 4.1.4 and was replaced by a BSD/SVR4 graft, which became known as Solaris. I remember hating this - for me at the time "Solaris" meant only a brilliant novel by a well-known Polish SF writer.

  44. Martin Usher

    Hardly a whisper outside the trade

    The news was -- and is -- still about Steve Jobs, but I would venture to suggest that without the work of Ritchie and his ilk, people like Mr. Jobs would be peddling used mainframes or cars or something.

    I'm going to be an anti-pedant, BTW. I don't see much point in using strncat when you're copying or concatenating a string that's a constant. User input, that's a whole different game....

  45. Asgard

    Thank you Dennis Ritchie

    I'm very sorry and saddened to hear this news. If it wasn't for Dennis Ritchie, I wouldn't be a C/C++ programmer, like so many other programmers. I think it's fair to say that almost all programmers owe him a huge debt of gratitude. He greatly helped change the way most of us work, because his work influenced so many of the languages that followed.

    It's also amazing to think how much work and research has been done and derived from his pioneering efforts. It's mind-blowing to think that just about everyone on the planet has at some point been indirectly touched by them, because so much technology builds on what he did.

    This is very sad news. :(



    printf("Thank you Dennis Ritchie and RIP... :(\n");


  46. Anonymous Coward

    A Mixed Legacy

    I know that every second programmer and their dogs and cats use C. But it is also true that every second security-related bug can be attributed to various regressions built into C, namely buffer overruns, bad pointers, extremely primitive heap memory management, and strings without attached length information.

    Compared to Assembler, C is progress, but compared to ALGOL68 I would call it a regression. One of these quick hacks which dominate the world because a major American corporation was behind it.

    Those who have spent serious time with a Pascal or Ada compiler know what I mean - many errors can be caught by a proper type system and runtime checks which only cost a few percent of runtime overhead.

    Unix, on the other hand, is a great invention and if anybody can create a useful version in Ada, I will definitely use it. The Unix command shell is orders of magnitude more powerful than the GUI-clickety-click way of controlling a system. To the expert, Unix commands are by no means cryptic, but elegant, concise and powerful.

    RIP Dennis Ritchie.

  47. Anonymous Coward
    Anonymous Coward

    Sad news.

    PS Isn't Bell Labs in Murray Hill, New Jersey? It's not the same as the Murray Hill in New York.

    1. RightPaddock

      Correct!

      The one in NY is on Manhattan Island, between 42nd & 30th, running down from 5th Ave to the East River - if you know where the Empire State Bldg is, then you're across the street from Murray Hill. AFAIK Bell were never in the NYC Murray Hill.

      However, Bell Labs were in West St in Greenwich Village - the building is now the Westbeth Arts centre. Don't think the software group was ever at that site - some exchange kit was there.

      I did some research into Bell when it morphed into Lucent and then got hoovered up by Alcatel.

      The loss of Bell Labs was a disaster for mankind - it presaged the coming of the anti-science crowd that's now infecting the planet. How could a country let something like Bell disappear in a puff of smoke - criminal. I think it was Carter who dealt the coup de grace to AT&T, Nixon was a mere toreador.

      I know there were good commercial reasons to break up AT&T, but Bell Labs should have been given a status like Livermore or Argonne :cry:

      Bye Dennis.

  48. Anonymous Coward
    Anonymous Coward

    Excellent article. More of this top stuff please.

    1. Red Bren

      More of this top stuff?

      I hope you're not suggesting we start culling pioneers of IT to feed your hunger for obituaries...

  49. Sandtreader

    Most popular language? By what metric?

    I keep reading that C is the second most popular language, and some suggestion that the most popular is Java. Based on LOC, I guess. That seems a little unfair since C is one of the world's least verbose languages and used in situations which demand a small amount of code running extremely quickly and reliably.

    Shouldn't the count be of number of instances of the software running? In which case, count a handful of instances of Tomcat for each corporate Java project and hundreds of millions for every embedded device, phone, TV, car dashboard, router, GPS (...) running a C-based RTOS or Linux.

    And what are Java VMs written in, anyway?

    1. jphb


      Not too sure about the methodology.

      And note that it's C excluding derivatives such as C++ (surely it ought to have been ++C!) and C#.

      1. Sandtreader

        Thanks for the link, that's really interesting. It appears they are counting people-popularity (number of engineers, courses etc.) rather than LOC, projects or whatever. Interesting to see how C# has stolen quite a bit of Java's fire, leaving C almost back at pole position.

        But I wonder if this might favour old and university-course languages. For example, (who (uses-p `LISP)) any more??

      2. Sandtreader
        Thumb Down

        Metric = Google

        OK, scrap that - here is the definition of the Tiobe metric:

        Basically counting search results on +"<language> programming"! Worthless, surely? What's the betting C will spike next month?

        Well, here's my contribution to the index, in roughly chronological order:

        BBC Basic programming, ARM Assembler programming, C programming, C++ programming, Javascript programming

    2. Michael Wojcik Silver badge

      Verbosity in programming languages

      Unqualified generalizations like "C is one of the world's least verbose languages" are worthless, particularly in a context like measuring SLOC, since that generally - by definition - doesn't include runtimes. Many programming languages have huge, feature-rich runtimes available to them as part of the standard language spec, which isn't the case with C. In domains where those runtimes cover a large portion of the application's requirements, such languages can be used to implement the application with far fewer lines of source code than you'd need with standard C.

      There are domains for which C is very well suited, and there are a lot more where it does a decent job. But it's not "least verbose" in general. Take a typical commercial APL application (yes, there still are some) and rewrite it in C, then see how many characters of source each one has. Or a typical non-trivial Ruby app, and so on.

  50. SheppeyMan

    Sadly missed. You can do virtually anything with C, and that is surely the point. Ritchie devised a language with a relatively simple instruction set which was also very powerful, allowing programmers all over the world to utilise target hardware better.

  51. David Hager

    A summer in 84

    Ritchie remains engraved in my memory as one of the authors of "The C Programming Language," a first edition book through which I struggled as I learned to code in that "new" language. I was attempting to get a video camera and an 8088 based computer to "see" objects, and grew to appreciate the difficulty inherent to emulating even one narrow aspect of human perception.

    Still have the book.

  52. Dan 55 Silver badge

    Did Google post a link about Dennis Ritchie on their front page?

    I didn't see one. Odd considering that Google stood on his shoulders.

    C was a great language back in the day and still stands the test of time today. It's powerful and also allows you to shoot yourself in the foot, but it does so elegantly. Such side effects can't really be avoided in a language which assumes that the programmer knows what they're doing, that's why it's so powerful.

  53. Baudwalk


    Nothing against that other famous tech fellow who departed not long ago. But for this geek, Ritchie's departure is much sadder news.


    1. Sandtreader
      Thumb Up

      OK, you win the prize for best C fragment - encompassing both C and UNIX concepts in a wistful wrapper - nice one!

  54. Anonymous Coward
    Anonymous Coward

    printf("Goodbye, World!\n");

    Thank you for the workhorse...

  55. SEO in eGrove Systems


    This adventures express its deep impact of value.

  56. eulampios

    Thank you, dmr!

    Condolences to the family.

    Can't even imagine what kind of IT it would be without those bearded altruistic guys. (S. Jobs is not included due to the lack of altruism).

    What a beautiful life! Requiescat in pace!

  57. Anonymous Coward
    Anonymous Coward

    Another Obit

    My friend has a nice tribute here:

  58. Vendicar Decarian1

    "You can do virtually anything with C"

    Except write efficient, readable, and relatively bug-free code.

    It's one of the worst languages ever produced.

    1. Anonymous Coward

      @Vendicar Decarian1

  59. Sheira Gorris

    As a company that has trained over 3,000 graduates in UNIX and began its business by offering UNIX training, FDM Group has commemorated the life of Dennis Ritchie.

    Check out what our CEO has to say about the pioneering efforts of this inspirational man:

  60. Anonymous Coward
    Anonymous Coward

    Too soon?

    Has anybody double-checked that he hasn't become a zombie(1)?

This topic is closed for new posts.

Other stories you might like