
A true innovator
... and a real "genius". One that Michael Bloomberg probably won't have heard of.
Goodbye Dennis.
It was 1968 and students and workers were on the march, protesting against the Vietnam War, with the western world seemingly teetering on the brink of revolution. In the sleepy, leafy New Jersey suburb of Murray Hill, a young maths and physics graduate was laying the groundwork for an entirely different revolution. For …
ATC staff don't know or care what language Air Traffic Control systems are written in. Just so long as they work.
Most bank staff don't know or care what technology underpins their daily working lives (and I can guarantee that the bank board doesn't either).
So why Michael Bloomberg should, I completely fail to understand.
Is/was he a coder before he was a politician? No. It's like saying Reuters managers know anything about the internals of the Reuters System.
Fail to see your point.
Along with the likes of Alan Turing and William Shockley, Dennis Ritchie was one of the founding fathers of modern computing, and I was saddened to learn of his passing. Without his genius and insight, we would not have home and office computing today. He will be sorely missed by the IT community. Condolences to his family, may the great man rest in peace.
These blokes were indeed heroes and great contributors to the industry and their input should be recognised.
It is a complete myth that the industry would not have happened without them. The truth is that there were many parallel threads of development and if Ritchie had never been born his work would still have been performed by others. C is an Algol-family language and there was a rich primordial soup of languages and OSs.
The same myth is encountered all over the place. If the Wright Bros had not invented powered flight we'd still be using ships to get from USA to Europe. Well no, there were many parallel threads working on powered flight. Wright Bros just scooped the glory. The same applies to much of Edison's work, Intel and the 4004 and so on.
What gave us Unix and C was not so much Ritchie as the legal mechanism that forced Bell to release the code to universities. If that had not happened we'd probably have ended up on a very different trajectory than Unix/C. Perhaps PASCAL/Modula2 based, perhaps something else, and quite likely something without many of the pitfalls of C.
So if we really want to give thanks for C and Unix then perhaps - as much as it galls us - we should also thank the lawyers.
Everything anyone ever did depended on what those who went before built for them, but I was disgusted when Steven Spielberg described Steve Jobs as "the greatest inventor since Thomas Edison". He didn't wield a soldering iron or write code, he was a CEO who hired teams of highly skilled engineers to invent things for him. Dennis Ritchie at least actually built stuff with his own hands and therefore deserves that much more acknowledgement for his achievements.
"at least [Ritchie] actually built stuff with his own hands and therefore deserves that much more acknowledgement for his achievements."
Well if you're using that analogy then Mark Zuckerberg also deserves a lot of credit.
The difference between Ritchie and people like Jobs (or Zuckerberg) is how much they care about making the sound of their own voices heard. Some people like to just create great stuff and don't do it for glory or credit.
It's quite sad that many people don't even know who this guy was but also it's a compliment to how little he gave a shit about "marketing" himself!
I think it was Spielberg who, a few weeks after 9/11, said the US should use its soft power and beam Seinfeld into the Yurts of all the Taliban in Kathmandu.
Shortly before 9/11, most of the Nepalese royal family was assassinated by one of their own number. The day before Spielberg's asinine suggestion, the last remaining Nepalese princess died in a helicopter crash. At the time Mongolia had suffered a number of years of drought and very harsh winters; as a consequence most of the grazing stock died of starvation, forcing the herdsmen to move into the towns & cities.
The Taliban don't live in Kathmandu, the capital of Nepal. The Taliban live in Afghanistan, the capital of which is Kabul, and they don't live in Yurts either. In fact no one in Kathmandu lives in a Yurt, nor does anyone in Afghanistan. But 1800 miles away, Mongolian herdsmen do live in Yurts when they're out on the steppes, except when all of their animals are dead.
So what the f**k would Spielberg know about anything.
God bless you Dennis Ritchie - I wanted to write you an obit, but seeing Spielberg's name in this context made me mad. We met once, in Boston I think - long time ago. Ken was there too, not sure about Brian.
Rest in Peace brother - I'll always treasure what you and your colleagues Ken T & Brian gave to me and so many others
you couldn't be more wrong charles. dennis ritchie was a hero inventor twice over and you disgrace his memory.
yes, if it wasn't for the lawyers, unix and c might not have made it into academia just as cs departments were buying computers for themselves. however dennis's inventions -- as well as ken thompson's and the others at bell labs -- just would not have caught on if they weren't any good. unix and c prevailed because they were (and still are) light-years ahead of the alternatives.
those inventions are still going strong today ~40 years later. which is more than can be said for pascal and modula-2. c might not be as pure as those languages but when there's real work to do, c gets the job done time and time again. which is why it is so heavily used and the likes of modula-2 isn't.
most decent operating systems and programming languages in use today are a direct result of dennis ritchie's genius. many of those that aren't are heavily influenced by his work.
now maybe someone else might have invented unix. or c. but they didn't. dennis did. well, ken thompson shares the credit for unix. you would do well to remember that and be thankful for the wonderful platform they gave to the world.
Most stuff made with C doesn't even run on Windows, or even have an interface for that matter. People seem to forget that almost everything with a chip also has some programming done and burned into a ROM somewhere, and C is the most common language to write that stuff. C is #2 on TIOBE, but I wonder whether, if a more specific count were done of these small gadgets, it would not jump to #1 by a whole order of magnitude.
it only encourages them to multiply, and we've got enough of the parasites already.
Second, whether or not someone else would have done something at the time is not what makes a leader a hero. The question is whether or not someone else could have done it as well as or better than they did. When that leader consistently turns in exceptional results, they are indeed a hero, regardless of what else was percolating at the time.
I am no expert, but surely to say "Linus Torvalds announced his project of writing an open-source clone of Unix from scratch in 1991" is a bit misleading? A kernel, yes. But a great deal of the rest was GNU -- notably the C-compiler. I can see that Stallman has not made any friends in the past week, but that doesn't justify airbrushing GNU out of Unix history like Trotsky.
"Hello everybody out there using minix -
"I’m doing a (free) operating system (just a hobby, won’t be big and
professional like gnu) for 386(486) AT clones. This has been brewing
since april, and is starting to get ready. I’d like any feedback on
things people like/dislike in minix, as my OS resembles it somewhat
(same physical layout of the file-system (due to practical reasons)
among other things).
"I’ve currently ported bash(1.08) and gcc(1.40), and things seem to work.
This implies that I’ll get something practical within a few months, and
I’d like to know what features most people would want. Any suggestions
are welcome, but I won’t promise I’ll implement them :-)
"Linus (torvalds@kruuna.helsinki.fi)
"PS. Yes – it’s free of any minix code, and it has a multi-threaded fs.
It is NOT protable (uses 386 task switching etc), and it probably never
will support anything other than AT-harddisks, as that’s all I have :-(."
He did announce just that, and started with the kernel. After getting the kernel working, other people ported the GNU system to work on it - it had pretty much everything working except the Hurd kernel which still even today isn't considered ready for production use.
One of the reasons Ritchie is not more famous is because of Stallman's hatred for the Bell Labs group and the practical suppression of pre-Torvalds UNIX history. Remember, GNU stood for "GNU's Not Unix". They hated UNIX/C, because it came out of a corporate research lab, and most of all because it eclipsed the TOPS-10/LISP culture that the MIT AI Lab dominated in those days.
Yes, part of UNIX's success was that it was free and open source. But it was also elegantly simple. I worked at Bell Labs Research in the 1980s and knew Dennis and Ken. I heard horror stories about Multics, how it was so complex and inefficient it could only "time share" two users at once. Ken came off that project determined to do something completely different. Simplicity is hard to achieve, and a lot of thought went into UNIX and its (now long-lost) philosophy of composing simple tools to perform complex tasks (there's a tiny example at the end of this post).
There was a long rivalry between MIT and Bell Labs about this question of design simplicity. The hacker culture had a macho attitude of "my code is bigger than your code", while Ken and Dennis spent hours trying to boil things down to the fewest lines of code and the fewest necessary features.
I remember one attempt to reconcile. The UNIX team invited Stallman to visit Bell Labs. I don't recall much about Stallman's talk, but everyone remembers that he picked his nose and ate a booger in front of everyone. In return, Rob Pike was sent to give a talk at MIT, which he was not able to deliver, because Stallman and his friends heckled him.
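Going back to the "composing simple tools" point: here's a minimal, purely illustrative sketch of the kind of filter that philosophy produced (a toy example of mine, not actual Bell Labs code). It does exactly one job -- folding its input to lower case -- and leaves the combining to the shell's pipes:

#include <stdio.h>
#include <ctype.h>

/* lower: copy stdin to stdout, folding everything to lower case. */
int main(void)
{
    int c;
    while ((c = getchar()) != EOF)
        putchar(tolower(c));
    return 0;
}

Chain it with the other small tools and the "complex task" comes for free, e.g. ./lower < notes.txt | sort | uniq -c | sort -rn counts duplicate lines regardless of case, with no new program written at all.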
or at least cause a segfault for writing to RODATA. And a number of other nitpicks. But as it's an obituary, I'll leave them be.
One thing I will say, and that's that as so often, history needs its actors to act or something else entirely will happen. Anyone smart enough could've made similar improvements, but it needed someone to do it there and then. Besides, you never know in advance where you'll end up, and in hindsight it often looks far more obvious than it did at the start. In fact, if it doesn't look obvious (to you) afterwards, you may be doing it wrong. Both because elegance tends to be inherent (to quote another great man, "everything should be as simple as possible, but no simpler") and because you usually want to stay down to earth and off to the next adventure. If everyone else falls over themselves exclaiming they could or could not have done it, well, that's just people.
None of that changes the fact that dmr left us a few very useful things, and he will be missed all the more for it.
It's sad to see another of that pioneering generation heading off to the great operating system in the sky. However, I take issue with one part of this obituary: C is most emphatically *not* a "high level" language! I don't think Dennis Ritchie himself would thank you for referring to his "portable assembly language" as such.
Also, the history of UNIX through the ages seems more than a little revisionist: BSD Unix (which really *is* UNIX) and the BSD-derived NeXTStep were already available by 1991, long before Torvalds' kernel made it out into the wild as an integral part of a viable operating system.
I'd also disagree violently that Linux did anything to spread UNIX: Linux is no more "UNIX" than a Compaq PC *clone* was an actual IBM PC.
Linux is a *UNIX-like* operating system, but it is not itself a bona fide version of UNIX and has, therefore, done more to *reduce* the prevalence and use of Thompson & Ritchie's UNIX operating system and its later releases than even Microsoft have managed. Apple's BSD-derived OS X has probably done far more for UNIX's popularity in the consumer space, as BSD really is a direct descendant of T&R's UNIX and not a clone.
This is analogous to Compaq's reverse engineering of the original IBM PC's BIOS, opening up the market for compatible IBM PC *clones*. Linux is a clone of UNIX. It is a "UNIX-compatible" OS.
"BSD Unix (which really *is* UNIX)" - not in the most pedantic sense.
A lot depends on your definition. BSD (which was originally a series of add-ons and modifications published in source) became a full OS distribution and split from Bell Labs UNIX around version/edition 6/7, and was never re-integrated (although SVR2/3/4 all added BSD features back into to the AT&T sourced versions).
As a UNIX pedant, I would say that BSD is *NOT* UNIX. Remember the lawsuit that forced AT&T code to be removed from BSD, leading to BSD/Lite, FreeBSD and BSD/386, so it is difficult to justify the claim that BSD is UNIX.
By comparison, HP/UX, AIX, Xenix, UNIXWare/SCO UNIX, Altix, SINIX and many more were derivatives of AT&T code, and passed UNIX branding tests, so could legally be called UNIX.
I accept that a lot of people who were from outside the Bell Labs/AT&T world may well have seen BSD before any commercial version of UNIX, and may well have referred to BSD as UNIX before AT&T got commercially sensitive about the UNIX brand, but that does not alter the fact that it was a very early fork of UNIX which never gained UNIX branding. I am not arguing that BSD is no good, because clearly it is, but that its claim to be UNIX is subject to interpretation.
As far as I am aware, the only BSD variant that passes any of the UNIX compliance test suites is OSX!
Or was it Thompson? Anyway, I think it was Ritchie - Massey University, New Zealand, about 1985: he worked out that, per head of population, it was the largest UNIX gathering he had attended. I asked him what, in his opinion, UNIX's biggest defect was; he answered, "No concurrency", while crouching to feed the ducks. Asked, during the main Q and A session, why UNIX commands had such cryptic names, he said it was to make sure they did not get confused with anything else. Very approachable, and he fitted the image of beard, long hair and sandals exactly.
By the way, B may come from Bell Labs, but it derives, actually, from BCPL, from the University of Cambridge (England). The old question was, what would the successor to "C" be called, "D" or "P"? As for Algol, so casually dismissed: it was a decent language, it influenced Pascal, Ada and Modula, and it was not without influence on C++ and Java. I believe Steve Bourne used Algol ideas, if not Algol itself, for his first Bourne shell implementation.
Before he went to Bell Labs, Steve Bourne got his PhD at Cambridge, where he worked on an Algol68C compiler. The similarity of some constructs in the shell to those of Algol68 is not accidental! (Both case...esac and if...fi come from Algol68. Loops would have had do...od if "od" wasn't already an octal dump.)
"C paved the way for object-oriented programming ... Today, C and its descendent C++ are a popular choice for building operating systems"
That should be
"C was refitted with object-oriented programming features, yielding something not unlike a gaily decorated truck out of Lahore ... Today, C and its descendent C++ are a popular choice for shooting oneself in both feet trying to build operating systems"
You're joking, right?
Six flavours doesn't even come close, we had that many flavours just on one site - which was particularly odd as we only had machines from two vendors.
In fact the old Siemens-Nixdorf T35 actually had two flavours on one machine both available at the same time; you could just switch between the two with a simple command once logged in. (No, there was no virtualisation involved.)
Fun times.
God I'd clean forgot about Pyramid - what happened to them - I know I worked on one, money market system I think
Memories long past are fading,
But in flesh or in spirit Dennis,
Ken & Brian always burn brightly
Everlasting into the long dark night
Thanks to Dennis R, respects to Ken T & Brian K.
"and oh by the way those aren't pointers to bytes, they're pointers to words"
I met this on a microcontroller, not too many moons ago. And lo, there was much wailing and gnashing of teeth, and darkness was on the face of the softies who had to deal with this f**king stupid idea.
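For anyone who hasn't met one of those parts: a small illustrative sketch (mine, not the poster's actual microcontroller) of why it hurts. On some word-addressed DSPs the smallest addressable unit is a 16-bit word, so CHAR_BIT is 16, sizeof(int) can be 1, and any code that assumes "pointer arithmetic moves in octets" quietly does the wrong thing:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* On a normal octet machine this prints CHAR_BIT = 8; on a
       word-addressed part it can legitimately print 16 or 32.     */
    printf("CHAR_BIT = %d, sizeof(int) = %u\n",
           CHAR_BIT, (unsigned)sizeof(int));

    unsigned short buf[4] = {0};
    unsigned char *p = (unsigned char *)buf;

    /* Octet machine: p[1] is one byte inside buf[0].
       16-bit-char machine: p[1] is the whole of buf[1].           */
    p[1] = 0xFF;
    printf("buf[0]=%04x buf[1]=%04x\n",
           (unsigned)buf[0], (unsigned)buf[1]);
    return 0;
}

Same source, two different answers, and the compiler is within its rights both times.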
As a genuine old-timer, I remember when C was first released, and the first thing I did was run out and buy the K&R C books, and later the K&T "Programming in Unix" book. I used K&R C for programming various projects and learned a lot from it. Far more elegant than Assembler, although some wags referred to C as "Portable PDP-11 Assembly Language." No matter, it was powerful, elegant and highly portable. C on an 8080, anyone? I used it.
The "cryptic" commands in Unix were strongly influenced by the fact the Teletype Model 33 was far and away the most common terminal connected to Unix systems. For those relative newbie, the 33 was an effective but a genuine PITA to type on. Also very slow,limited by its mechanical operational mechanism to 300 baud, max. Anything that could be done to lessen the amount of typing was a Really Good Thing hence the highly shortened commands made using Unix a whole lot more pleasant.
dmr truly impacted the world of computing and impacted that world far more than the greedy, megalomaniac control freak that started that company named after a fruit.
Requiescat in pace, dmr, and condolences to the family and the millions of users of your inventions. May your legacy and humility be remembered forever.
Really, it did exactly what it was designed to be.
A machine-independent macro assembler.
Hence the HUGE dependence on Makefiles and #define.
C was a great achievement in the climate of the early 1970s.
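To make the "Makefiles and #define" point concrete, here's a minimal sketch of the pattern (the TARGET_PDP11 flag is made up for the example -- the real ones were whatever your site used):

#include <stdio.h>

/* A Makefile would normally pass the target in from outside,
   e.g.  cc -DTARGET_PDP11 -c io.c                              */
#ifdef TARGET_PDP11
#define WORD_BITS 16
#else
#define WORD_BITS 32
#endif

int main(void)
{
    printf("Built for a %d-bit word.\n", WORD_BITS);
    return 0;
}

One source file, with the preprocessor plus the build system papering over the machine differences -- which is pretty much what a machine-independent macro assembler ought to do.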
What did he do for the next 40 years though?
<<If one was to compare the recent losses in technology to certain fictional figures then I would not consider it to be unfair to say that.
Dennis Ritchie == Jesus>>
Actually you can argue if Jesus was a Deluded Man, or God, or his followers made up stuff. But no shortage of historical evidence that he was a real person.
> What did he do for the next 40 years though?
mage, you are showing your ignorance. after inventing unix and c, dennis ritchie co-wrote one of the major computing textbooks. he developed unix v8 which sadly wasn't allowed to leave bell labs because it was a threat to at&t's commercial but shit unix offering. he invented streams. which at&t turned into a train wreck for system v. he spent years getting the c language standardised. i think he was involved in the posix standards work too. he then went on to develop plan9 and inferno. if that wasn't enough, dennis ritchie led the computer science research team at bell labs, one of the world's most prestigious research institutions.
btw, that michaelangelo bloke did fuck all after painting the sistine chapel. what more did he need to contribute?
Murray Hill is in New Jersey.
My high school was a few towns away and I had a teacher who worked at Bell Labs during the summer.
She got us a tour (1971 - 72) of the labs and I had a chance to talk with a gentleman who showed us the little system he was working on (IIRC he said it supported 4 users). I still remember the PDP computer and Teletype 33 at the end of the table. Wish I had taken a few sheets of output!
The issue with Plan 9 is that it is way ahead of even today's common/general technology. You don't have anything (yet) to make use of its amazing features and design. It is more like trying to use UNIX on an Altair home computer.
Once the technology allows the vision of these guys, we will hear something like "Apple considering move from Mach/UNIX to Plan 9 at OS level" and nobody will be that surprised; some will even say "about time".
I speak about stuff like seamless roaming between devices (including car computer) etc. and seamless distributed computing.
For example, IBM happily uses Plan 9 on their Blue Gene monster.
He was a titan, but don't go overboard: "General-purpose programming languages had not existed before C" is rubbish: Algol and Fortran preceded it by more than a decade, and Multics (which dmr worked on) was written in PL/I, as were its applications.
C was / is a good implementation of the idea, based on learning from those experiences, as dmr himself would say.
In other words: there's no need to denigrate the giants upon whose shoulders he stood.
It's true enough that FORTRAN, COBOL, BASIC, and others pre-dated C, but C was different precisely because of the other exception (NO PUN:) taken to this very nice article: C is not quite a high-level language (HLL), but sort of straddles the ASM / HLL border. That's the good news -- and the bad. In the right hands it is a thing of beauty (which could be said of Unix as well). As the guy who taught the C class I once took (not at Uni; courtesy of an employer) put it: "It compiles itself." What he meant, of course, is that it's the sort of language you could easily write a compiler in. I wouldn't want to try that in any of the HLLs which preceded it, or those which came later. In fact, with a little help from the lex & yacc descendants, you almost wouldn't want to write a compiler in anything else.
Yes, Clarke's "General-purpose programming languages had not existed" is a particular howler. Assembler, after all, is a "general-purpose" language (it's not a domain-specific one) when running on general-purpose hardware. Among higher-level languages that were going strong when C was invented and are still in widespread use you have FORTRAN, COBOL, LISP, BASIC, and PL/I (in historical order); plus others that have fallen into disuse since, such as ALGOL and C's own ancestors BCPL and B. And contrary to another commentator, C's supposed "high-level assembler" nature doesn't excuse this wildly erroneous claim.
But that's far from the only false step in this piece.
It's not true that the major general-purpose languages were meant to "lock virgin customers [or any other kind] in"; in fact, C exposed more implementation details to the application, and so C programs often required more porting effort. C did not "go visual" in any meaningful sense with Microsoft's Visual Studio, anymore than it "went turbo" with Borland's Turbo C. Others have already noted how ridiculous the "at least six flavors of Unix" bit is.
I don't even want to start on "It is a high-level procedural language that used a compiler to access a machine's low-level memory and to execute, so it can span different platforms". There's enough wrong there for half a dozen posts. Nor does C contain some magic inherent power to make programs fast, as Clarke claims; and I have no idea what he means by "programming with C is also accessible".
I've written software in C (and for Unix) since the mid-80s, and I like them. Certainly dmr made some terrific contributions to the industry - even if you agree with the criticism a number of people have levied against the use of in-band signaling in C (nul-terminated strings, printf formats, etc). I've been in Usenet conversations with him, and he always came across as knowledgeable, intelligent, reasonable, and friendly. It's a shame he's gone, and I wish the Reg could have done a better job with his obituary.
Raise your hands if you are aware that the length value "n" given to "strncat" says how many characters to take from the source string, and says nothing about the size of the destination buffer. So, it does not directly protect against buffer overflows. At least that's the way it is described on Gnu/Linux, and that surprised me.
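Hand raised. For anyone who hasn't been bitten yet, a minimal sketch of the trap (my own example, nothing clever):

#include <stdio.h>
#include <string.h>

int main(void)
{
    char dst[8] = "abcdef";    /* 6 chars used, 8-byte buffer            */
    const char *src = "ghijkl";

    /* WRONG: the n in strncat is how many chars to take from src, not
       how much room dst has. strncat(dst, src, 6) would try to append
       six chars plus a terminator into two spare bytes -- an overflow. */

    /* The n you actually want is "space left in dst, minus the '\0'":  */
    strncat(dst, src, sizeof dst - strlen(dst) - 1);
    printf("%s\n", dst);       /* prints "abcdefg"                      */
    return 0;
}

strncpy has its own, different gotcha (it doesn't always terminate the result), which is part of why these functions turn up in so many of the overflow bugs mentioned elsewhere in this thread.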
I don't get the Visual C reference. When I switched from RiscOS to Windows 3.1 back in the day, I had to leave Acorn's nicely designed visual tools behind and use the pathetic MS C Version 7. It was at least a couple of years before VC1.0 arrived, with tools that were less advanced than Acorn's.
And don't talk to me about IDEs on Unices, I tried writing for Motif on DEC's Ultrix at the same time and I believe that's when I started to lose my hair.
(Still writing portable C code that runs on any version of Windows as well as a bare metal embedded system (~400 SLOC today :)).
I think you're seeing the past through rose-tinted glasses here. The only visual tool of note from Acorn was !FormEd, which came after VC++1.0, and when I left the scene in 1996, six years later than you must have done, there was still nothing approaching a decent IDE for C development.
The C compiler sucked balls too.
Learned C and UNIX when I was at Data General. I was impressed by the apparent simplicity of the design (which is quite often a sign of carefully crafting) and its obvious power. I was lucky enough to have a co-worker who had been at MIT and returned waxing eloquently about how UNIX was the future.
All I needed to learn C was the K&R book. Later, I took a course in which I received the BSTJ reprint. Then I got a Sun workstation, discovered USENET and shortly thereafter, Linux. It was all "downhill" from there.
RIP, dmr.
//there seem to be an awful lot of coats here with a copy of K&R in the pocket...
I studied at the University of Newcastle upon Tyne with Dennis' sister, Lynne. The family connection meant that we got Dennis to come and give us a talk on mechanisms of inter-process communication in Plan 9. So the department booked a medium-sized room, and it was too small: there were too few chairs, and people were leaning in from the corridors in order to hear Dennis put forward his ideas, which he did brilliantly -- full of humour, and clear enough for anyone to understand, the mark of a man who had totally mastered his subject.
I treasure a brief e-mail exchange I had with him, when I asked if the word "grep" owed anything to the Manx Gaelic word for a fish-hook, which it seemed to me would be appropriate. Sadly, it turns out not. But the great man was not too grand to answer a random question from an unknown postgrad.
..this pretty much includes the vast bulk of the present generation of living human beings, (certainly in the developed world)...
The spread of computers into business and their development into effectively personal household appliances may have happened without the release of C and Unix, but I believe that it would almost certainly have been a much slower and much more painful progression. The explosion of development that we saw in the late eighties and early nineties might still be waiting in the wings to happen....
Rest well Dennis Ritchie... we all owe you a great deal
I'm somewhat surprised that the obit made no mention of Brian W. Kernighan, Ritchie's co-author of "The C Programming Language" in 1978, which was the bible for C programmers in those days.
Now correct me if I'm wrong - I'm sure you will - but as far as I recall Solaris was Sun's version of BSD. SVR4 was an attempt to bring together the two main threads of Unix: BSD and System V. I don't think it really took off because Sun and HP carried on as before with Solaris and HP-UX.
Now just in case you think cloud computing is something new, back around 1990 I was using Sun workstations. The software was installed on each workstation but all the data were held on fileservers so you could log onto any workstation on the intranet and access your own environment. And of course we had email, ftp and usenet. Mosaic arrived later.
The news was -- and is -- still about Steve Jobs, but I would venture to suggest that without the work of Ritchie and his ilk, people like Mr. Jobs would be peddling used mainframes or cars or something.
I'm going to be an anti-pedant, BTW. I don't see much point in using strncat when you're copying or concatenating a string that's a constant. User input, that's a whole different game....
I'm very sorry and saddened to hear this news. If it wasn't for Dennis Ritchie, I wouldn't be a C/C++ programmer, like so many other programmers. I think it's fair to say that almost all programmers owe him a huge debt of gratitude. He greatly helped change the way most of us work, because his work influenced so many of the languages that followed.
It's also amazing to think how much work and research has been done and derived from his pioneering work. It's mind-blowing to think that just about everyone on the planet has at some point been indirectly influenced by it, because so much technology has been built on it.
This is very sad news. :(
#include <stdio.h>

int main(void)
{
    while (1)
        printf("Thank you Dennis Ritchie and RIP... :(\n");
}
I know that every second programmer and their dogs and cats use C. But it is also true that every second security-related bug can be attributed to various regressions built into C, namely buffer overruns, bad pointers, the most primitive heap memory management, and strings without length information attached (see the sketch at the end of this post).
Compared to Assembler, C is progress, but compared to ALGOL68 I would call it a regression. One of these quick hacks which dominate the world because a major American corporation was behind it.
Those who have spent serious time with a Pascal or Ada compiler know what I mean - many errors can be caught by a proper type system and runtime checks which only cost a few percent of runtime overhead.
Unix, on the other hand, is a great invention and if anybody can create a useful version in Ada, I will definitely use it. The Unix command shell is orders of magnitude more powerful than the GUI-clickety-click way of controlling a system. To the expert, Unix commands are by no means cryptic, but elegant, concise and powerful.
RIP Dennis Ritchie.
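To put one concrete (and deliberately simplified) sketch behind the buffer-overrun claim above:

#include <stdio.h>
#include <string.h>

int main(void)
{
    char buf[8];
    const char *name = "Dennis Ritchie";   /* 14 chars; buf holds 7 + '\0'  */

    strcpy(buf, "Hi ");                    /* 3 chars + '\0': fine           */

    /* strcat(buf, name); */               /* a C string carries no length,
                                              so the compiler accepts this
                                              and it silently writes past
                                              the end of buf at run time     */

    printf("%s%s\n", buf, name);           /* the cautious way out here      */
    return 0;
}

An Ada or Pascal compiler with proper array and string types would refuse the equivalent of the commented-out line, or at worst trap it at run time with a small, predictable overhead -- which is exactly the trade-off I mean.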
The one in NY is on Manhattan Island, between 42nd & 30th; it runs down from 5th Ave to the East River - if you know where the Empire State Bldg is then you're just over the street from Murray Hill. AFAIK Bell were never in the NYC Murray Hill.
However, Bell Labs were in West St in Greenwich Village - the building is now the Westbeth Arts centre. Don't think the software group was ever at that site - some exchange kit was there.
I did some research into Bell when it morphed into Lucent and then got hoovered up by Alcatel.
The loss of Bell Labs was a disaster for mankind - it presaged the coming of the anti-science crowd that's now infecting the planet. How could a country let something like Bell disappear in a puff of smoke - criminal. I think it was Carter who dealt the coup de grace to AT&T, Nixon was a mere toreador.
I know there were good commercial reasons to break up AT&T, but Bell Labs should have been given a status like Livermore or Argonne :cry:
Bye Dennis.
I keep reading that C is the second most popular language, and some suggestion that the most popular is Java. Based on LOC, I guess. That seems a little unfair since C is one of the world's least verbose languages and used in situations which demand a small amount of code running extremely quickly and reliably.
Shouldn't the count be of number of instances of the software running? In which case, count a handful of instances of Tomcat for each corporate Java project and hundreds of millions for every embedded device, phone, TV, car dashboard, router, GPS (...) running a C-based RTOS or Linux.
And what are Java VMs written in, anyway?
Thanks for the link, that's really interesting. It appears they are counting people-popularity (number of engineers, courses etc.) rather than LOC, projects or whatever. Interesting to see how C# has stolen quite a bit of Java's fire, leaving C almost back at pole position.
But I wonder if this might favour old and university-course languages. For example, (who (uses-p `LISP)) any more??
OK, scrap that - here is the definition of the Tiobe metric:
http://www.tiobe.com/index.php/content/paperinfo/tpci/tpci_definition.htm
Basically counting search results on +"<language> programming"! Worthless, surely? What's the betting C will spike next month?
Well, here's my contribution to the index in roughly chronological order:
BBC Basic programming, ARM Assembler programming, C programming, C++ programming, Javascript programming
Unqualified generalizations like "C is one of the world's least verbose languages" are worthless, particularly in a context like measuring SLOC, since that generally - by definition - doesn't include runtimes. Many programming languages have huge, feature-rich runtimes available to them as part of the standard language spec, which isn't the case with C. In domains where those runtimes cover a large portion of the application's requirements, such languages can be used to implement the application with far fewer lines of source code than you'd need with standard C.
There are domains for which C is very well suited, and there are a lot more where it does a decent job. But it's not "least verbose" in general. Take a typical commercial APL application (yes, there still are some) and rewrite it in C, then see how many characters of source each one has. Or a typical non-trivial Ruby app, and so on.
Ritchie remains engraved in my memory as one of the authors of "The C Programming Language," a first edition book through which I struggled as I learned to code in that "new" language. I was attempting to get a video camera and an 8088 based computer to "see" objects, and grew to appreciate the difficulty inherent to emulating even one narrow aspect of human perception.
Still have the book.
I didn't see one. Odd considering that Google stood on his shoulders.
C was a great language back in the day and still stands the test of time today. It's powerful and also allows you to shoot yourself in the foot, but it does so elegantly. Such side effects can't really be avoided in a language which assumes that the programmer knows what they're doing, that's why it's so powerful.
As a company that has trained over 3,000 graduates in UNIX and began its business by offering UNIX training, FDM Group has commemorated the life of Dennis Ritchie.
Check out what our CEO has to say about the pioneering efforts of this inspirational man: http://www.fdmacademy.com/fdm-group-commemorates-unix-inventor-dennis-ritchie/