The New C++: Lay down your guns, knives, and clubs

"The world is built on C++," Herb Sutter tells The Reg. Considering he's one of the language's chief stewards, we question his impartiality. But he does have a point. After more than 30 years – depending on whose numbers you swallow (these or these) – C++ remains one of computing's most popular programming languages, favored …


  1. Nick Gisburne

    I'd think again about that working name

    C++0x = C0x = Cocks. The future world will be built on Cocks, eh? If you include the 'P' sounds in ++ you've got some kind of stutterer's pox (cowpox or seapox). You're not even going to get away with its fully pronounced name - see plus plus 0x = see plus plus socks.

    As it's one step up from C++, I'd go for C3P, and then you might as well add the 0 on the end just for Star Wars lulz.

    1. LaeMing

      The name is ....

      "C++0x" - looks more like the output of Perl programmers playing 'golf' to me!

      1. LinkOfHyrule

        C++0x is illegal here!

        "C++0x" looks to me like the ASCII art representation of a depraved sex act involving barbed wire and as such is considered "extreme porn" under English law!

      2. Sureo

        C++0x ?

        I would have thought something simpler and more obvious: C+++

        1. Ian Yates



          1. annodomini2



          2. Frumious Bandersnatch


            This doesn't work syntactically. There aren't any rules for deciding on the relative precedence of ++C and C++. The fact that the language doesn't include such a rule is due to both (++C) and (C++) not being lvalues so they can't be pre/post-incremented and the question of relative precedence doesn't arise. The related C++++ doesn't work either, for the same reason. You could call it (C+=2,C-2) but it's hardly a very catchy name.

        2. John 62

          re: C+++?

          Not sure about a triple-plus operator.

          But I thought C++++ had already been done: C#

    2. Anonymous Coward


      OK, everybody wants to keep the association with the C name, but what about naming it D, E, F, whatever? C was created as a significant evolution of B...

      Yes, me and my poor coding skills are leaving...

      (whoever failed 3 times to write a "Hello World" inside a text box after reading C for dummies shouldn't be allowed near a compiler, like me.)

  2. Mikel

    We do need new tools for multicore

    I don't know if this new C++ is what we need, but there has been a need these past five years for a toolset that approaches the bare metal like C, but leverages the many-core direction our processors are moving in.

    Whatever the end product is, it should be able to handle heterogeneous cores, because that's the direction platforms are moving in: multiple different CPU cores, GPUs, hardware media decoders, varying forms of ALU, RNGs and PhysX units included in the silicon. The core system is getting revved these days far faster than before, and with a lot of oddball things that didn't used to be considered part of the platform. The software tools are lagging behind, as dealing with this riotous mix always involves custom coding.

    It's incredibly complex. I think we'll be groping about for an easy answer for a long time, and in the interim kernel coders are going to earn their money. More power to 'em. Whatever the answer is, it ain't Java, nor C#, nor COBOL.

    1. Anonymous Coward


      because this post is:

      objective, critical, balanced and rational

      nicely done

    2. Anton Ivanov

      There used to be a tool

      I know I will get shot down for saying this, but the tools are there. They are called Ada, Erlang and Smalltalk.

      The problem is that the first is universally hated by everyone and the latter two have failed to gain enough ground outside Europe.

      1. Someone Else Silver badge
        Thumb Down

        Say What?

        Ada? The PL/1 of the 80's?!?

        You mean to say that someone threw in **even more** kitchen sink-ism to support such modern nuances as multicores into Ada, and we weren't informed?

        Shirley, you jest...

        1. Anonymous Coward

          kitchen sink-ism

          Nice! And entirely fitting where Ada is concerned (was? it can only be hoped...). Designed by committee (commissariat) and it showed. It did have some basic concurrency, as you say, but could you massage the rendezvous into a realistic multi-core implementation once you've hit the runtime system hard enough? Discuss*.

          Then they shovelled OO into it, but I'd got out by then - C++ it was so sexy! That committee - Grady Booch did alright out of the DoD, no? - stole my youth, the bastards. But now I'm happy, stroking my Python with some Lua and giving the bits some deep down C love...

          Did I tell you about the years spent writing Lisp? These days, I just need to find the time to cut some serious Erlang.

          *I think that's a "no", probably.

    3. Nigel 11

      Google's GO language?

      Take a look at Golang. Looks like the right idea set for multicore, albeit immature.

  3. Peter 70


    El Reg seems to have been running lots of stories about Go recently. D goes largely unmentioned, which is a shame (mind you, it's understandable: the team behind Go have been much more savvy about getting the word out about what is a slightly more targeted, and thus easier to explain, language).

    True. D invites ridicule with its name, but version 2 does start to deliver: much like C++0x it's an attempt to address the shortcomings of C++ (the difficulty of programming it well) and, of course, concurrency, while retaining the performance and ultimate flexibility of a 'big' language as opposed to a Ruby etc.

    Andrei Alexandrescu's book on D is worth a read - if only for its propensity to discuss the motivation behind the design of a language (he's been a driving force behind the version 2 revision of the language).

    1. Tasogare


      I'm glad someone mentioned this. I've been experimenting with D recently and, while it seems fairly obscure, it's surprisingly good at what it claims to do. I keep waiting for it to bite me and it doesn't.

    2. h4rm0ny
      Thumb Up


      I'm glad to see that other D supporters are out there. The moment I saw a headline about an improved C++ I wanted to grab my monitor and shout "It's here! It's called D!"

      D has addressed almost everything I could want it to address about C++. It's a great language that deserves to become mainstream. And I'll echo what you say about Andrei Alexandrescu's book on D: it's the most entertaining and interesting book on a programming language I've ever read. I've got it on Kindle, and for a while it went everywhere with me, being just as interesting to read (more so, actually) as the novels I usually carry around for my leisure reading.

      If the world of programming really is a meritocracy, then D deserves to be out there and popular.

      1. Ru

        "If the world of programming really is a meritocracy..."


        From the article, "PHP, Ruby, and JavaScript? Sure, some might claim that they are the future..."

        Read it and weep. There is very, very little that is meritocratic about software development and tools. It is all accessibility and popularity, with the usual dose of not-invented-here, let's-reinvent-the-wheel from each generation of new devs. There is no thought of whether there are better tools and languages out there already.

        But don't take my word for it; I like Perl and Erlang.

        1. ian 22


          Very positive comment. For balance I'll add C++ is putrid, and I hate it.

  4. Michael 17

    About *&#! time!

    Every time I have to listen to the annoying fanboys praise C++, I have to grit my teeth to avoid strangling them! I write low-level lock-free concurrent algorithms for a living. Getting them right in C++ has always been much more of a pain in the ass than in Java, and the main reason is the lack of a clear memory model. I hope that when they claim they're defining a C++ memory model, they're also including a proper suite of memory barriers and the like. I'm still bitter that the last C++ standards committee discussed the weakness of the volatile keyword, but then decided to punt the issue into the future.

    Now we'll just have to wait 5+ years for the C++ compilers to actually do the right thing when you have code that really depends on the memory model.

    1. Bartosz

      Memory model

      @Michael 17: I think you'll be happy with the new C++ memory model. If all you need is Java's "volatile" and sequential consistency, the C++ equivalent is the strong form of "atomic". But (weak) atomics can give you much more control than Java over lock-free semantics. You can portably do things that could only be done using "unsafe" code in Java. You get tools to relax sequential consistency without running into undefined behavior. Obviously, you can shoot yourself in the foot doing that. But that's C++.

    2. Tchou

      memory model?

      I'd love to hear you speak of the Java memory model, and how you can control it.

      If you want a clear model, use C. How can a sky-high abstracted language help in grasping the memory model?

      1. Michael 17

        Specified behavior is better than unspecified behavior

        A "sky high abstracted language" helps by actually having a defined memory model. C++ does not. Regardless of the underlying hardware-provided memory model, the C++ compilers can do a variety of memory reorderings. The only control you have over that is with the volatile keyword, and the prior C++ standards committee deliberately bailed out of providing better memory semantics than volatile provides. Different C++ compilers end up using different mechanisms for saying you need a memory barrier, but there's no portable way to do that.

        So if you're content to write C++ code targeted at (modern) x86 CPUs compiled with gcc, you can do it. But if you'd like to write a portable library, you're screwed.

        A little searching will turn up plenty of web pages with long discussions of the issues here.

  5. Christian Berger

    One solution to get more speed

    Get rid of the copy constructor. Please! I know it sounds like a good idea on paper, but people just aren't going to use pointers when they have to put funny characters around them.

    The point is, it's the idiots who need help, not the experts. Nobody who knows what they are doing will ever complain about having to put some extra code into some exotic or dangerous program part. However, requiring the idiot to learn C and assembler, as well as all the implementation details of the platform, just to avoid the biggest pitfalls, isn't going to work.

    My point is: Unlike what many C++ programmers seem to believe, it is possible to have systems which are both easy to use/program and powerful and fast.

    1. Shakje

      I'm going to assume two things

      1) That you've been following 0x


      2) It hasn't changed significantly since I last looked (which was a bit of time ago)


      1) Pointers, bleh, there's no reason whatsoever (that I can think of) not to be using a combination of references and smart pointers (shared_ptr is in the MS TR1 implementation now, if you don't want that you can get the whole shebang with Boost)

      2) 0x brings move semantics, i.e. instead of copying the data, you transfer ownership of it to another object, have a look at it if you haven't already

      3) It's perfectly possible to write platform-independent code without writing all the gubbins that go with it by using Boost and a bit of common sense. If you want to optimise it then go ahead, but only if you're advanced.

      The really unfortunate thing about C++ is that backwards-compatibility thing. Unfortunately, it's too late now to start making things obsolete or unavailable, and there's not much that can be done about that. Too many teachers don't talk properly about pointers and references, or RAII, or the pitfalls of constructors and the way inheritance works, and so on, and not enough companies are willing to enforce decent standards to control their C++.

      I guess what I'm trying to say is that the language could do with a clean-out, but that will probably never happen. I'll have a look at D, Alexandrescu is one of my favourite C++ writers, Modern C++ Design is one of my most worn-out C++ books.

    2. Field Marshal Von Krakenfart


      "Unlike what many C++ programmers seem to believe, it is possible to have systems which are both easy to use/program and powerful and fast."

      Yeah, it's called cobol, and on a mainframe

  6. jake Silver badge

    Whatever. When doing real work, I'll stick with K&R C and inline assembler ...

    I don't want the compiler to tell me what I can & can't do with the hardware.

    But then, I'm a hacker[1], not a sheep.

    [1] Old jargon meaning, not modern journalistic meaning ...

    1. Anonymous Coward

      the usual trite narrow-minded drivel

      "When doing real work, I'll stick with K&R C and inline assembler"

      Define real work. It may be that you are only confident with the tools you have already used. It may be that the "real work" you do is all about solving small problems that do not require any degree of abstraction.

      "I don't want the compiler to tell me what I can & can't do with the hardware". And if you want to program close to hardware, you are on the right track. Can you conceive of a problem space where you need to think bigger to solve a more complex problem? If not then you should stick with the mucking out.

      "not a sheep." Arrogant, narrow-minded and, here's an irony, unintelligent.

      1. jake Silver badge


        "Define real work."

        Coding close to the hardware, which should be obvious from the comment you replied to ... but for some reason you seem to think I'm not allowed to comment on that ... I mean, "stick with the mucking out"? WTF?

        "Can you conceive of a problem space where you need to think bigger to solve a more complex problem?"

        Yes, I can. Are you equipped to understand my commentary?


        I prefer "capable".


        I prefer "knowledgeable".

        "and, here's an irony"

        God is an iron ...


        Says the AC, with no actual rebuttal to my post...

        1. foo_bar_baz

          Nerd testosterone alert

          The AC probably had an issue with your implication that:

          - other than close-to-hardware programming is not "real work"

          - using a HLL makes you a sheep

          Both of which rubbed the AC the wrong way, who's obviously proud of his own "real work" that involves using a compiler. I don't think he disagreed with your choice of tools, so he wasn't trying to offer a rebuttal.

          Just calm down and get back to your workstations. The managers want that code yesterday.

          1. jake Silver badge


            Most programming that isn't close to the hardware is frivolous. Think about it.

            It's not the HLL that makes you a sheep, it's the mind-set (see "app stores" and the like, and what they mean in the great scheme of things).

            I lost interest in managers over thirty years ago ... I'm my own boss :-)

  7. Paul Fremantle

    C++ Garbage Collection

    There is a good reason they still haven't added Garbage Collection to C++......... there would be nothing left. Boom Boom.

    1. sT0rNG b4R3 duRiD

      None left?

      Ah, no... You'll be left with C :P

  8. Anonymous Coward

    Mac OS X written in C++?

    More like C and Objective C. A very nice subset/superset pair of languages I might add, and more or less the antithesis of C++. (I like to think of C++ as the perfect language for writing an OS kernel using the OO paradigm. Except you don't need OO for kernels. You need it for applications. And applications are much better served by the dynamism of Objective C.)

    1. Steen Hive


      XNU is a hybrid microkernel. The entire Mac OS X device-driver subsystem and accompanying user clients are written in C++, and very bloody good it is too.

  9. Mage Silver badge


    In 1987 I argued that not supporting concurrency was a flaw, and that the language had learnt nothing from Modula-2 about simple interfaces for building large systems. Modula-2 and Occam are older, and both have concurrency support as part of the language.

    Since the 1970s we knew a new language would need Concurrency, Multi-cpu support etc.

    Also, large C++ projects still depend too much on tools like Make. You should not need a separate file in a scripting language to build a large project.

    Strings and buffer over-runs account for most security vulnerabilities and bugs.

    Again, by the late 1970s it was known that the default should be array bounds checking. At the outset of C++ in the late 1980s the shortcomings of strings in C were known, and half-baked string classes were produced. A string should have been a bit like VBSTR: not just null-terminated, but with a size stored as well if dynamic (if static, the compiler knows the size). Then "count" to return the max number of array elements (string or otherwise) and "length" for the size of the array (or the characters in the string before the null terminator).

    C++ was born deformed due to having to be as backward compatible as possible with C, which after all wasn't really developed as a HLL at all but as a portable macro assembler to port UNIX and UNIX applications. Any form of Template, macro or define has no place in a safe language with human readable programs.

    Obviously any native compiled language that's got a decent compiler is going to beat "p-code on Virtual Machines" (VB6, .Net, Java) or scripting languages. So no surprise C++ wins that competition.

    1. Anonymous Coward

      What do C++ templates have to do with C backward compatibility?

      It is a star feature if you compare it with Java generics...

    2. Paul Shirley

      the build process

      "You should not need a separate file in a scripting language to build a large project."

      *Every* large C++ project I've ever worked on contained a large number of external resources that needed format conversion, precompiling, packing or other processing and the final output needed some sort of packaging or resource binding. Why on earth would tightly binding that to the C++ component make any sense at all?

      A tool like make is a pretty good solution to an open-ended problem like the build process, where there's no way of even knowing what tools a dev might need to run during the build.

      Decoupling the build *process* from any of the tools invoked while building is exactly the right solution. Scripted solutions like make are one of the more powerful options.

  10. Anteaus

    {[Time] (to) {ditch {C syntax}}} ? ;

    C, C++, C#, Java, Javascript, PERL, php etc. are all fundamentally similar in using a punctuation-heavy syntax as opposed to an English-like syntax. Worse, the punctuation symbols most often used are those which most typists find hard to touch-type owing to their placement on the keyboard. (and let's face it they were placed where they are because they are rarely used in standard typing)

    Apart from leading to slow, finger-twisting typing, this heavy reliance on punctuation makes it hard to spot errors. A missing } can lead to all kinds of weird error messages, most of which bear no relation to the line with the actual error.

    While English-like syntaxes involve more characters per statement, this is easily offset by the fact that they can be typed much faster, and debugged more reliably.

    The choice of C syntax came about in the early days of the PC, through the need for a language which would deliver satisfactory performance on the puny 8086 processors of the day, but which would offer easier coding than pure assembly language. The fact that we're still using this syntax in the days of phones with GHz processors and PCs with more power than an early mainframe is historical, and somewhat farcical. It's akin to insisting on using CGA graphics on a 24" LCD.

    If as the article says there's a need for a rethink of programming languages to properly handle concurrent processing, then this may also be the point to stop using the archaic C syntax.

    1. Anonymous Coward

      "choice of C syntax came about in the early days of the PC"

      What ??!!

    2. Destroy All Monsters Silver badge

      U complain about the syntax?

      This is 2011. There are IDEs and check-as-you-type syntax verifiers, you know.

      And speed-typing when coding is really a non-issue.

      "as opposed to an English-like syntax."

      Because SQL was a big success in clarity?

      1. Andrew Hodgkinson

        Well, not SQL perhaps...

        ...but Ruby is really rather nice.

    3. Christopher Key.


      I take issue with almost all of the above!

      On ease of typing, I fail to see how typing "}" can possibly be slower than typing e.g. "End If", especially when including wasted time relocating the cursor to correct typos.

      On spotting errors, I fail to see how a missing "}" can be easier to spot than a missing "End If". The eye automatically matches bracket pairs, whereas matching "If", "End If" is far harder.

      On the comparison with English, note that English can be pretty punctuation heavy when conveying complex concepts, and all the better for it. Compare the following:

      > "When you wake up in the morning, Pooh," said Piglet at last, "what's the first thing you say to yourself?"

      "What's for breakfast?" said Pooh. "What do you say, Piglet?"

      "I say, I wonder what's going to happen exciting today?" said Piglet.

      Pooh nodded thoughtfully. "It's the same thing," he said.

      > Open Quote When you wake up in the morning Comma Pooh Comma Close Quote said Piglet at last Comma Open Quote what's the first thing you say to yourself Question Close Quote

      Open Quote What's for breakfast Question Close Quote said Pooh Stop Open Quote What do you say Comma Piglet Question Close Quote

      Open Quote I say Comma I wonder what's going to happen exciting today Question Close Quote said Piglet Stop

      Pooh nodded thoughtfully Stop Open Quote It's the same thing Comma Close Quote he said Stop

      1. foo_bar_baz

        @Christopher key

        Perhaps the answer is not being verbose. I offer Python as a good example.

        As a Perl fan the indentation put me off, but I had to use it in my work. I've since ditched Perl. On the rare occasions when I have to write JavaScript the braces and parentheses look very cluttered.

        Tux because penguins and snakes go together well.

    4. Anonymous Coward

      Here come the S-expressions! All for it.

      And functional programming solved concurrency years ago. ;P

      1. Anonymous Coward

        Re: Here come the S-expressions! All for it.

        "And functional programming solved concurrency years ago. ;P"

        Shame they make everything else harder (thinks of Haskell and shudders).

    5. Anonymous Coward

      Re: {[Time] (to) {ditch {C syntax}}} ? ;

      (indeed (time-to (use (proper (and syntax language)))))

      'nuff said.

    6. Bartosz

      C++ syntax

      @Anteaus: The use of punctuation might be annoying for beginners, but you get used to it, and then it actually saves you typing and makes programs less verbose (and that's important, because you don't want the mechanics of the language to obscure the logic of your program). However, there is one big handicap that C++ inherited from C, and that is declaration syntax. Even Herb Sutter openly complains about it. But changing the syntax now would break so much existing code that it's not a realistic option.

    7. PotNoodle

      Re: syntax

      My only bugbear with C++ syntax is really more to do with fonts - if in my stupidity I've put something in normal brackets instead of curly ones, the two look quite similar in some default typefaces (such as the one Code::Blocks uses by default), especially when rendered in 10-point. This can lead to a lot of confused squinting before I realise what's up. Easily solvable with a little customisation, though.

      I suspect the syntax could be simplified a bit if the committee felt it necessary. There's already a degree of context-sensitivity towards what each punctuation mark means - e.g. when I type a semicolon in the conditional block of a for loop, the compiler knows it doesn't mean "this is the end of the function". Personally I like using the current syntax - having plenty of different punctuation marks in play makes the code more readable once you understand what they mean. Plenty of other syntaxes (syntaces?) could use a little more variety in this area.

    8. Blitterbug


      Rubbish attempt at trolling, as nearly everything in this post is total bollox. History fail, much?

    9. Greg J Preece

      Proper language?

      I thought most people hated COBOL.

      1. Field Marshal Von Krakenfart

        @Greg Preece

        No, it’s just that university lecturers don't get tenure lecturing about COBOL and databases of bank accounts; it's more trendy to use phrases like 'dangling pointers'. After all, if you can't dazzle them with brilliance, baffle them with bullshit. How can you impress anybody with:-


        when you can write shit like this:-

        for(;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("| "+(*u/4)%2);

        also see discussion about punctuation

        C code fragment deliberately chosen, do I have to explain it as well?

        Bring back the old icons, web 2.0, its made from badgers paws!

        1. Someone Else Silver badge


          No, you don't have to explain the code; you have to explain why there is code like this available to you to show here. Seems to me that an "NCIS greeting"* needs to be applied to whoever wrote that, **and** to whoever approved that WSH for production.

          And I can certainly write obfuscated COBOL, too. Besides the obvious:


          are you going to try to justify that goofball mask syntax nonsense, and the required but utterly useless SECTIONs that contain nothing but boilerplate text that was written 30 years ago just to pass a code review (assuming they even had code reviews back then)?

          *For those of you on the east side of the pond who have never seen the telly series NCIS, an "NCIS greeting" is a provoked or unprovoked slap on the backside of the head.

    10. Mark 65


      If you use an IDE and something like ReSharper (or equivalent for your language of choice) then you need not worry that much. I can only see it being a problem these days if you want to dick around in a text editor.

    11. Hi Wreck

      {[Time] (to) {ditch {C syntax}}} ? ;

      "The choice of C syntax came about in the early days of the PC, through the need for a language which would deliver satisfactory performance on the puny 8086 processors of the day, but which would offer easier coding than pure assembly language."

      Hardly. Try PDP-8 and -11, both of which pre-date the 8080, let alone the 8086, quite substantially. Verbosity was NOT a feature of the language since VT-100 terminals at 9600 baud were all the rage. Real programmers (TM) hate to type.

      1. Paul Shirley

        Real programmers (TM) hate to type.

        ...and we also love to see the whole function on screen readably formatted for 'eyeballing'. Not have some verbose syntax force it to overflow the screen on either axis.

        1. sabroni Silver badge


          apologies for lack of formatting in comments, when I see this:

          }

          }

          }

          }
          I don't know what it means.

          but this:

          end if

          next Counter

          end function

          end class

          I can read and understand. This is why I think verbose languages are better than c style syntax.

          1. Tyson Key
            Thumb Up

            Re: Accolades/Curly Braces

            Just use a text editor/IDE that colour codes each pair - that way you won't get lost in a sea of them. :)

            1. sabroni Silver badge

              Doesn't really address the issue..

              .. "next counter" or "end if" are still much clearer than "green }".

              I'm not saying that c style syntax is unusable (spent the last year working mainly in javascript), but from a human perspective I find words easier to read than punctuation or colours.

              1. Anonymous Coward

                depends on your code

                "next counter" or "end if" are still much clearer than "green }".

                If your code is concise and well laid out, your brackets will be indented to the same levels. As your code is good, they also aren't particularly far apart, letting you match them quickly and easily.

                You only need verbose explanations if your constructions are so big (like if you have to write everything out longhand) that you get huge walls of text. If you can't see the opening and closing of your brackets on the same screen:

                a) you've done something wrong, you probably should think about a better way of constructing it.

                b) accepting that, just add a short comment if you need it: } // end xxx counter

                c) as you've commented your code well anyway, you should be able to understand what it's doing from those.

                The C++ syntax is great at immediately showing you what each 'bit' does: you know immediately from sight what's a function, what's a variable, operators etc, without having to read through a load of text and think about what each bit actually means.

                1. sabroni Silver badge

                  at anonymous pussy, 20:34

                  Ever work with other engineers who aren't as perfect as you?

                  But regarding your points, a) if it works and is done to schedule then it's right. I could refactor my code endlessly making it prettier and prettier if I didn't have work to do.

                  b) And adding a comment saying "end xxx" to a "}" doesn't mean that's what that curly is doing, unlike a verbose language where typing "end if" where you mean "end function" will give you an error. That's clearly better.

                  c) well written verbose code needs few comments, the language is readable.

                  Your point seems to be "I know c++ really well and can read it quickly". Well great for you. No really, well done. I can already read English so find it quicker to read languages based on that. I'm not saying you're wrong to like C++, I'm sure well written C++ can be lovely to read, but you can't really argue that "}" carries as much meaning as the many end statements it replaces in other languages.

            2. Frumious Bandersnatch

              paren matching

              Simple in emacs:

              M-x blink-matching-open

              C-1 M-x show-paren-mode

              The show-paren mode is usually turned on by default for languages that emacs knows about--and it knows how to handle many languages.

              Most programmers' editors do something similar. Plus most of them have intelligent tab handling to bring you to an appropriate indent level and will also align closing brackets of all kinds with the corresponding open bracket or whatever block marker the language uses. Personally, I prefer brackets over verbose if / endif style languages since you can see an awful lot more code on screen at once and it makes it actually easier to follow what the code is doing without having to scroll up and down.

    12. Robinson

      Slightly incorrect history here...

      "The choice of C syntax came about in the early days of the PC, through the need for a language which would deliver satisfactory performance on the puny 8086 processors of the day,"

      Err no, it didn't. C syntax came from K&R, who got it from Algol-60. They were all described and in use before the 8086 was created and, for all I know, before Intel even existed.

    13. Charles Manning

      All specialised languages use special syntax

      Pretty much every specialist language has an arcane syntax. Look at mathematics, music, chemical formulae.

      Writing English does not make it easier. If anything it makes things way harder to read, just as it would be harder to play music written in English instead.

      Some people think that Python is cool because it has no braces etc. But in reality python still has a syntax - based on indenting. This is far worse than braces etc because you can't see them. Get spaces and tabs wrong and you can screw up a program.

  11. Gordon 10

    More importantly

    What has been done to make the language more secure? Bad C code is responsible for a huge amount of the vulnerabilities found in OSes and their apps.

    Whilst it's unfair to blame C++ directly, there should be a huge focus on blocking the paths programmers can take that introduce such issues.

    1. Ken Hagan Gold badge

      Re: Bad C code

      Umm, C is not C++.

      Half-decent C++ code hardly ever does explicit memory allocation; it uses RAII to correctly dispose of memory and every other kind of resource. So not only does it not need GC - GC'd languages end up needing RAII for all non-memory resource types anyway. Buffer overruns don't happen if you are using the standard library's collection classes, and the only common reason for using raw arrays is that you are interfacing with code that requires them, such as your OS APIs if you haven't elected to use a toolkit.

      The necessary features have been part of the old C++ for the last 15-20 years, so I'm afraid the new C++ has added almost nothing to help with your problem.

      Perhaps the real problem is that the new C++ still accepts C-style programming and therefore tolerates C-style programmers. (If you are programming in a "safe" language, you can't cut-n-paste some crap C code that you found on the web. If you are programming in C++, you probably can.) Or perhaps the problem is program managers who don't use *any* static analysis tools (not even a decent compiler warning level) on the code that their subordinates check in.

      Or perhaps it is just true that native-mode programs tend to fail fast and noisily (crash), whereas managed ones are quietly kept alive in an indeterminate state. Amusingly, some people call the latter "recovery". Perhaps this is because "recovery" is what they will have to do to their data files next week, once they see what the program (now in an indeterminate state) went on to do to them.
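      The RAII and standard-container points above can be sketched like this (a minimal illustration, not from the thread):

      ```cpp
      #include <cstddef>
      #include <numeric>
      #include <vector>

      // RAII: std::vector owns its buffer, so there is no explicit
      // new/delete and no leak on any exit path, including exceptions.
      int sum_first_n(int n) {
          std::vector<int> v(static_cast<std::size_t>(n), 1); // managed storage, no raw array
          return std::accumulate(v.begin(), v.end(), 0);
      }   // v's destructor releases the buffer here, on every return path
      ```

      Compare the C-style equivalent, which needs a matching `free()` on every path out of the function and silently corrupts memory if the index arithmetic is wrong.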

    2. Robinson

      Well, the tools are there already...

      "Whilst it's unfair to blame c++ directly there should be a huge focus on blocking the paths programmers can take that introduce such issues"

      This is down to choice. For example, I use dependency injection and smart pointers everywhere. Apart from when I have to send something into an OS function or get something out of one. I don't use raw strings or raw arrays (prefer std:: and boost for all my collection needs).

      I did have a bug recently that I just couldn't find - but it was due to how I was handling fetching and storing a resource (converting a resource pointer from Windows to a std::string). I was totally blind to it because it never threw an exception in the debugger, or when run from the IDE in release mode even though it was running over memory with reckless abandon. It only threw when run outside of VS (!). Luckily I have colleagues with larger brains, so it was found and corrected in the end.
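      The smart-pointer style described above can be sketched as follows; `Resource` is a made-up type for illustration only:

      ```cpp
      #include <memory>
      #include <string>

      // Hypothetical resource type, standing in for anything that must be
      // released exactly once.
      struct Resource {
          std::string name;
          explicit Resource(const std::string& n) : name(n) {}
      };

      // std::unique_ptr (C++0x/C++11) expresses sole ownership: the Resource
      // is deleted automatically when the pointer goes out of scope.
      std::string use_resource() {
          std::unique_ptr<Resource> r(new Resource("icon"));
          return r->name;
      }   // no explicit delete; no leak even on early return or throw
      ```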

  12. Anonymous Coward
    Anonymous Coward

    Up to a point, Lord Copper...

    It's always amusing to notice the unstated assumptions built in to presentations like Sutter's. He says that the world is built on C++, but that's only the world of PCs really. Computing as entertainment, one might call it (to be slightly cynical). Meanwhile - you guessed it - the real world of industry and commerce does most of its sums on COBOL and CICS, as it has for the past 40 years or so. Including (ironically enough) the money mountains of Sutter and his pals at Microsoft. Ask any bank (I was going to say "any reliable, reputable bank" but let's not get into that right now) whether its core systems are written in C++ and you'll get a resounding "No".

    Meanwhile safety-critical avionics and the like continue to be written in languages like Ada, which - while too boring for fashion-conscious developers - make it possible to write much more reliable and predictable software.

    Lastly, please note that it's not helpful to tell us that C++ is faster at runtime than Java. What would be helpful would be to know how much faster it is, and what the tradeoffs are.

    1. Hungry Sean

      speed of languages is multifaceted

      I've noticed a number of articles on el reg in the last few weeks talking about the relative speed of different languages. What seems to be lost is the concept of speed of correct implementation. For a huge number of applications, execution time doesn't matter as long as it is under about 33ms (end user can't tell the difference) and as many as a few seconds is tolerable. This is where a language like perl with its easy support of hashes, dynamic typing, etc. shines. The total time to being able to get the job done is reduced drastically. At the point when performance becomes important, it is still valuable to have an implementation for validation which can be trusted to be correct-- generally this means concise code built on top of a well smoked library (e.g. matlab).

      Execution speed can be hugely important, but often it is a later concern (see facebook starting with PHP and then moving to C++). As a fan of both C and perl (not C++), 90% of stuff is quicker to get done reasonably with perl, and time 'til done is a damn important metric.

      1. Mark 65

        speed of language

        I believe that's one of the reasons Python is getting so popular - the mix between capability and speed.

    2. fortyrunner

      Well put

      The retail banking world is built on COBOL, the investment banking world is built on Java and C#, the gaming world is built on C++, the web world is built on everything.

      C++ is needed but its more irrelevant every year to a lot of programmers who struggle with its complexity. If I were to suggest writing a new project in C++ I would be looked on as mad.

      1. AndyDent

        new projects in C++, yup!

        I just reviewed the options for coding a cross-platform GUI desktop app for OS X and Windows with nine different programmers, and the most common recommendation was C++ for core logic with native UI in Objective-C/Cocoa and C#/WPF.

      2. Burkhard Kloss

        Underneath all the shiny C# and Java projects... investment banking you'll probably still find a C++ quant library.

    3. Someone Else Silver badge

      It is to laugh, Tom

      "Meanwhile - you guessed it - the real world of industry and commerce does most of its sums on COBOL and CICS, as it has for the past 40 years or so."

      Really? You mean to say that the trading programs that are in charge of running (and some...including myself...would say mismanaging, stacking the deck, running into the ground, etc.) the entirety of world markets are written in COBOL?!? Jeez, then I guess all those advertisements soliciting experienced C++ programmers to write that code are lying.

      Who knew?

  13. Anonymous Coward
    Anonymous Coward

    Built on C++, but at what cost?

    The C language and its mutated spawn must rank as one of the worst viruses to hit the world of computing.

    Don't get me wrong, C is a highly efficient systems implementation language (SIL), but like all of the SILs it is highly dangerous precisely because of its efficiency. When safely confined in a lab (in this case, Bell Labs) it was a powerful tool for good, but it inevitably escaped and infected the world of commercial computing. C++ was an attempt to attenuate the virus and make something less dangerous for application use, but in order to achieve more or less full backward compatibility it did nothing to address the underlying highly efficient but totally unsafe memory architecture.

    Just as a "for instance" to demonstrate how dangerous (and ubiquitous) the C memory architecture is, ever heard the phrase "unchecked buffer vulnerability"? The vast majority of the 300,000+ Google hits are probably its handiwork. Well, of course they are really the handiwork of a programmer, but therein lies the problem. Safe C code is bigger, slower and can take longer to write - how many more disincentives do you want? This is why Intel, Microsoft and others have had to implement mitigation techniques such as data execution protection, address space layout randomization, structured exception handler overwrite protection, mandatory integrity control, etc. etc. etc.

    So will we ever see an end to this outbreak? Probably not in the foreseeable future. Abstracting the memory architecture into something inherently safer (à la .NET and others) is one way to go, but ultimately the runtime systems will have been written in C++ and so are vulnerable.
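    One language-level mitigation the comment alludes to is bounds checking: a checked accessor like `std::vector::at()` throws `std::out_of_range` instead of silently running over memory the way an unchecked C array write would. A minimal sketch:

    ```cpp
    #include <cstddef>
    #include <stdexcept>
    #include <vector>

    // Write through a bounds-checked accessor: an out-of-range index is
    // rejected with an exception rather than corrupting adjacent memory.
    bool write_checked(std::vector<int>& buf, std::size_t i, int value) {
        try {
            buf.at(i) = value;     // .at() performs a range check
            return true;
        } catch (const std::out_of_range&) {
            return false;          // bad index rejected, memory intact
        }
    }
    ```

    The equivalent raw-array write, `buf[i] = value` with no check, is exactly the "unchecked buffer" pattern behind the vulnerabilities described above.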

    1. Anonymous Coward
      Anonymous Coward

      Why Intel had to implement mitigation techniques

      @Thoguht: Built on C++, but at what cost? #

      > Just as a "for instance" to demonstrate how dangerous (and ubiquitous) the C memory architecture is, ever heard the phrase "unchecked buffer vulnerability"? .. This is why Intel, Microsoft and others have had to implement mitigation techniques ...

      These buffer vulnerabilities are caused by a defect in the memory management unit of the underlying Intel architecture, and such mitigation techniques belong in the hardware, not in some system library.

      1. Anonymous Coward
        Anonymous Coward


        It's nothing to do with Intel at all. Most processors and their accompanying operating systems since the year dot have adopted the stack+heap paradigm. The problem with C/C++ is that the language places this fragile paradigm at the direct mercy of the programmer.

        1. Anonymous Coward
          Anonymous Coward

          That seems to be a function...

          of whether the programmer knows how to use the language. It may be more difficult to do correctly with C/C++ but that doesn't mean it shouldn't be done with C/C++ if that will make a program more efficient. This, obviously, depends on the use of the program; server side C/C++ may be the way to go while HLL for desktops. There are definitely a lot of people who call themselves programmers because they are interested in programming. Unfortunately, interest and ability are not always the same thing. I studied C/C++ years ago, but am no programmer. Fun to poke around though.

          1. Anonymous Coward
            Anonymous Coward

            No programmer?

            That's where we differ - I've been a professional C/C++ programmer for over 30 years, so I do know a thing or two about it.

            1. Anonymous Coward
              Anonymous Coward

              Thanks for replying...

              and I will defer to your experience. Must say I'm sorry I never worked in C/C++ or any other programming shop. I enjoyed those studies, but jobs and timing can take us in unexpected directions. The timing of this article was fun as I've been thinking I'd tinker with them again, maybe on Fedora or OBSD.

  14. Anonymous Coward
    Anonymous Coward

    Not so

    ""Apple's Mac OS X, Adobe Illustrator, Facebook, Google's Chrome browser, the Apache MapReduce clustered data-processing architecture, Microsoft Windows 7 and Internet Explorer, Firefox, and MySQL – to name just a handful – are written in part or in their entirety with C++""

    Can't be sure about all of the above, but as has been mentioned, OS X is mainly C and assembler, like most OSes (e.g. the Windows and Linux kernels) -- sure, there are C++ items in there, but mainly all the heavy lifting is C and assembler. Same as Photoshop -- lots and lots of inline assembler. MySQL is almost all C with some C++.

    Windows: C++, kernel is in C

    Mac: Objective C, kernel is in C (IO PnP subsystem is Embedded C++)

    Linux: Most things are in C, many user apps are in Python (KDE is all C++ I think)

    C and OC ain't C++.

  15. Herby

    Language syntax??

    Look, if we didn't have the C-like syntax, we would be stuck with something like Fortran or (god forbid) Cobol (try that for a systems language!). Even Algol-60 is bad, and try to implement Algol-68 (has anybody really done this?). Sorry the C-like syntax is here to stay, or else you get "significant white space".

    As for C++, why bother. I defer to someone more expert on the subject: <>.

    When I look at C++ vs. C, another thing I look at is the volumes written by the "authors" of the language. K&R is about half an inch thick. Bjarne's book on C++ is about three times as thick, and doesn't really describe the language, as it has lots of "// ..." comments and incomplete "discussions" of the language. C you can understand from what you are looking at. Look at C++ source, and the code you need to understand the code could be many modules away -- even for the "+" operator!

    Lastly, a famous quote: "The thing about standards is that there are so many of them."

    1. Anonymous Coward
      Anonymous Coward

      Bjarne's book

      Indeed - Bjarne must rank as one of the worst writers in computing and his attempts to write instructional C++ tomes are disastrous. Which leaves me wondering if he's any better at communicating his ideas to the machine than he is at communicating to humans. His code examples suggest not, by and large.

      1. Anonymous Coward
        Anonymous Coward

        what is your reading age?

        "his attempts to write instructional C++ tomes are disastrous"

        the alternative explanation is that the problem is one of comprehension.

        Set aside some time, re-read and try harder.

    2. Burb

      C vs C++

      "When I look at C++ vs. C, another thing I look at are the volumes written by the "authors" of the language. K&R is about 1/2 inch thick. Bjarne's book on C++ is about three times as thick,"

      C is a much smaller language than C++ and has a much smaller standard library. The flip side is that any sizeable application written in C will almost certainly use a home made (at least not standard) container library, will no doubt incorporate some ad-hoc mechanism for dynamic function dispatch, will be littered with macros, and approximately 50% of the code will be there to deal with error conditions and ensure things are cleaned up correctly. If it is well written, that is...

      The question is whether you want features in the language and standard library that help you manage complexity, or whether you want a small language that places the burden of managing complexity on the developer.

  16. Anteaus

    C syntax here to stay?

    Years back I did a fair amount of coding in xBASE, an English-like language, and found that I could write substantial routines and often have them work first time, with ZERO bugs. I'd defy any coder to claim that to be possible in C-based languages.

    It's interesting to compare the entrenched nature of C syntax with user-interface development, where the opposite paradigm prevails. No qualms about willy-nilly change, change for change's sake, or even ill-advised change there. Yet, in coding, we're told that we MUST stick with a donkeys-years old way of entering code, and woe betide anyone who even suggests that an alternative might be better.

    It occurs to me that soon we will be communicating reliably with computers in spoken language. If we haven't modernized our coding syntax by that time, coders will effectively bar themselves from making use of such developments. Which would be one singularly ludicrous situation, would it not?

    1. Anonymous Coward
      Anonymous Coward

      "I'd defy any coder to claim that to be possible In C-based languages."

      It's possible.

    2. Ru

      "soon we will be communicating reliably with computers in spoken language"

      And which language, do you suppose, will be running the speech recording and natural language processing systems, that must do so much complex work with some soft real time performance restrictions?

      My bet is on C for the drivers and C++ for the complex work.

      "woe betide anyone who even suggests that an alternative might be better"

      If I'm paying for your development time, you will be using a language that is well supported, widely understood and appropriate for the problem domain. You may feel free to use whichever crazy moon language you like for anything you are prepared to bankroll, but I am not willing to pay extra for development, and potentially vastly more for maintenance, because a 'better' yet rarely used language was chosen.

      The word you were looking for was 'pragmatism', I believe.

    3. cyborg

      I can't communicate with people reliably in human language

      Perhaps you really haven't given as much thought to the issue of the need for precise communication as someone who has actually written a computer language? Ambiguity simply will not do and human language is full of it; English doubly so.

      I call shenanigans - making things look like human language does not solve any problem.

    4. SilentLennie


      "Years back I did a fair amount of coding in xBASE, an English-like language, and found that I could write substantial routines and often have them work first time, with ZERO bugs. I'd defy any coder to claim that to be possible In C-based languages."

      Well people do it all the time in JavaScript.

  17. W. Anderson

    New C++ and its Microsoft connections

    I do sincerely hope that, if Herb Sutter is among the many C++ ISO group members from Microsoft, the language standard is not corrupted - even subtly - as has been done to the severe detriment of other languages, like the Microsoft licence of Java or their "Iron-" iterations of Ruby, Python and PHP, to name just a few examples.

    The technology industry and its clients - meaning the whole world - can no longer afford, in the twenty-first century, to endure the draconian behaviour of Microsoft in attempting - and sometimes succeeding - to force technology into its own mould, ultimately breaking everything and setting back technological progress by years.

    Everyone is sick and tired of that.

    1. Ken Hagan Gold badge

      Re: Microsoft connections

      It's hard to prove anything, but MS dragged their heels a *lot* during the first standardisation effort (VC6->VC7) and then tried to push managed C++ on the world. Both efforts seem to have been dismal failures and I think they hurt themselves more than the language. (They have to use their own compiler to develop Windows, remember!)

      They've been good in recent years and I feel confident that the "extended subset" supported by the current MS compiler reflects their release schedules and compatibility requirements more than anything else.

      Of course, C++ is used by every other player in the industry, so it really shouldn't be surprising that the standard reflects more than Microsoft's interests. It's not like Microsoft are the only vendor and they can just switch off things if they want to, like they did with VB6 and might be doing with Silverlight.

  18. Will Godfrey Silver badge


    I rather like python

  19. Anonymous Coward

    The only language...

    ...with a proper syntax is Smalltalk. There is hardly any syntax at all, only message sends. The ultimate in language sophistication.

    I spend half my time in C++ and half in Smalltalk, and it annoys the hell out of me having to type in all that syntactical noise in C++. Unfortunately Java and C# are full of this noise too.

  20. Anonymous Coward
    Anonymous Coward

    The World runs On COBOL

    Games and Wanker Ware run on C++.

    Unless you want dancing monkeys. That's Jabba.

    1. Anonymous Coward
      Anonymous Coward


      take away powerpoint(c++) and watch as all the big companies are brought to their knees :)

      a lot of the old bulk processing is done in Cobol, on big iron, but they are old processes. The actual day to day operation, data input and output, is done via front ends running on c/c++ OSs overlaid with c/c++/c#/java/etc.

      A few very old systems may still have Cobol front ends ( I know of one in my company, but it's not used anywhere a customer could see it) but the bulk are something more modern.

      I'd be very curious to see what choices were made if any of the big banks had to recreate their core systems from scratch nowadays.

      1. jake Silver badge

        @AC 21:15 ... Power Point? Waste of (corporate) time ...

        "take away powerpoint(c++) and watch as all the big companies are brought to their knees :)"

        Uh, actually, the first thing I do when brought in as a consultant is ask "who are your Power Point experts, and who relies on their presentations within the organization?"

        And then I promptly fire the lot of 'em[1].

        Power Point has wasted more billable hours than anything else ever invented by the global corporate infrastructure ...

        [1] Yes, I have that power ... it's in the fine print in my standard contract ;-)

        1. TeeCee Gold badge


          You fire PowerPoint dweebs for a living?

          I sooooo want your job, you just can't buy the level of job satisfaction inherent in that.

          1. jake Silver badge


            "You fire PowerPoint dweebs for a living?"

            No. I clean up b0rken corporate computing culture for a living.

            "I sooooo want your job"

            You can have it. All you have to do is bid on, and win, the contract(s).

            "you just can't buy the level of job satisfaction inherent in that."

            Nah. Firing people in this economy leaves a bad taste in the mouth ...

  21. Pseu Donyme

    Hmm ...

    Still no standard library / keywords for multithreading / synchronization though ? (Something akin to Java's (Concurrent Pascal's) would have been welcome ... ).

    It seems that the basic idea (by Brinch Hansen, Hoare, ...) from the 1970s of "safe" languages - that is, trading some runtime speed for an easier implementation - still fails to catch on :). While waiting for that, libmudflap (BoundsChecker, ...) can be used to achieve similar results with C/C++ (for test/development), in practice catching a bunch of bugs for "free". These days many of the actual runtime environments can support these for weeding out bugs; for an embedded system without an MMU and/or the memory to spare, one should be able to run (most of) the same code on a workstation for testing purposes (extra work for setting up some kind of simulation though).

    Originally I owe the above insight to converting a 1990s real-mode x86 (DOS) C (C++) system to 286/16-bit protected mode (a DOS extender providing an environment like 16-bit protected-mode Win 3.x). As a side effect I ended up with a system where most logical pieces of memory had an individual HW bounds check at runtime. The number of bugs this caught was quite phenomenal, and, importantly, I was able to set up an exception dump facility which allowed me to trace a run-time exception, even one that would happen only once in a blue moon, back to the source line and machine instruction. With the CPU register values it was usually easy enough to figure out what had gone wrong (the typical fix being adding an explicit bounds check to array access: unchecked, these may crash the system down the line when it is involved with some other section of code entirely ...).

    1. Ken Hagan Gold badge

      Re: Hmm

      "Still no standard library / keywords for multithreading / synchronization though ?"

      I think you'll find there *is* library support and that new keywords aren't necessary.
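      Indeed, C++0x/C++11 puts concurrency in the standard library rather than in new keywords: `std::thread` starts a thread, and `std::mutex` with `std::lock_guard` serialises access to shared state. A minimal sketch:

      ```cpp
      #include <mutex>
      #include <thread>
      #include <vector>

      // Several worker threads increment a shared counter; the mutex makes
      // each increment atomic with respect to the others.
      int parallel_count(int per_thread, int workers) {
          int total = 0;
          std::mutex m;
          std::vector<std::thread> pool;
          for (int w = 0; w < workers; ++w) {
              pool.emplace_back([&total, &m, per_thread] {
                  for (int i = 0; i < per_thread; ++i) {
                      std::lock_guard<std::mutex> lock(m); // scoped, exception-safe lock
                      ++total;
                  }
              });
          }
          for (auto& t : pool) t.join(); // wait for every worker to finish
          return total;
      }
      ```

      Note that `std::lock_guard` is itself RAII: the mutex is released when the guard goes out of scope, on every exit path.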

  22. Anonymous Coward

    Why support concurrency?

    If the language is going to support threading then why not multi-process too? Let's ditch fork() and CreateProcess() and have some standardised feature set. No? Why not?

    I'll tell you why not - because these are OS specific features and a 1 size fits all approach simply doesn't cut it when you need to get really down and dirty with the OS. I'll be interested to see how they merge the Windows threading model with pthreads - but I suspect a dog will be getting its dinner.

  23. Tom Reg

    C++ - way too complicated

    It's possible to write code in C++ that is completely unreadable by anyone but an expert in the language - some STL code comes to mind. It's also possible to make it readable, but I am not sure that is the norm. I wrote a large program in it 10 years ago and when that was over I swore never again. My experience is that with C++ you spend a lot of time on the language and not working on the problem.

  24. Anonymous Coward
    Anonymous Coward


    Will C++0x have consistent bit-width data types across CPUs? I bet not, given you need a language completely detached from C for that!

    Will C* code and data structures continue to break as new *-endian CPUs come out, like between 8-bit, 16-bit, 32-bit, 64-bit, X-bit CPUs and all that horrid data-type and data structure macro hacking (e.g. for Microsoft and other OS's) because there is no VM to abstract away this native hacking? I bet so!

    Will we continue to see the absurdity of buffer overflows and null pointer bugs because of the inherently unsafe memory model of the C/C++* family of 'bare metal' languages, especially in common C/C++ libraries and OS's? Of course!

    Most of the Microsoft security issues relate to this inherently unsafe memory model; the rest are higher-level issues like SQL injection exploits.

    How long before there are fully usable and stable compilers and IDEs for common OS's for C++0x? 5 to 10 years, or never.

    In that time, advances in Java or other higher level languages will have rendered it even more irrelevant, maybe even for OS and driver coding.

    Objective-C is kludge which is only relevant to Apple platforms, thus irrelevant to most application developers. Apple is part of an expensive lifestyle fad and has a limited lifespan, so beware about investing much of your life with them.

    1. Anonymous Coward

      @AC 22:16 GMT

      Of course not.

      Different standard bit widths on different platforms are key for performance - which is what C and C++ are about fundamentally. The standard says that int should be the standard bit width for that platform, and that makes sense because it will be the size all the registers are designed to operate with. This is why C (and to a lesser extent C++) dominate in embedded systems.

      If you want to use types that are fixed across platforms, use int32_t, etc. (in C++0x) or the boost equivalents (before C++0x).

      Many compilers are almost ready to go with C++0x, even before it has finally been standardised -- especially gcc, where most of the features have been available as prototypes for years.
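      The fixed-width types mentioned above come from `<cstdint>`: plain `int`'s width is implementation-defined, but `int32_t` is exactly 32 bits on every platform that provides it. A small sketch:

      ```cpp
      #include <climits>
      #include <cstdint>

      // Guaranteed at compile time: int32_t is exactly 32 bits wide
      // wherever it exists, unlike plain int.
      static_assert(sizeof(std::int32_t) * CHAR_BIT == 32,
                    "int32_t is exactly 32 bits");

      // Saturate a wider value into the fixed-width type instead of
      // letting the conversion overflow.
      std::int32_t clamp_to_i32(long long v) {
          if (v > INT32_MAX) return INT32_MAX;
          if (v < INT32_MIN) return INT32_MIN;
          return static_cast<std::int32_t>(v);
      }
      ```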

    2. Someone Else Silver badge
      Thumb Down


      "Will C++0x have a consistent bit-width data types across CPUs? I bet not, given you need a language completely detached from C for that!"

      Say what?

      C99 supports (and has for the last dozen years!) consistent bit width types (int8_t, uint32_t, etc.). I would expect C++ to do the same (are you listening, Herb?)

      You really need to get out more...

    3. Conner_36

      "Objective-C is kludge"... really?

      "Objective-C is kludge which is only relevant to Apple platforms, thus irrelevant to most application developers. Apple is part of an expensive lifestyle fad and has a limited lifespan, so beware about investing much of your life with them."

      Them's mighty strong fightin words.

      Just looking at the numbers, iOS is more popular than every current games console. But to be nice I added all of the original numbers to make it a 'fair' fight.

      We have the Nintendo brand at 602.42 million since 1974; the Sony PlayStation brand at 369.8 million since 1994; the Xbox brand at 79 million consoles sold since 2001; and then we have Apple's iOS brand at 193.62 million since 2007!

      Or Nintendo at 16 million per year; Sony at 21 million per year; Microsoft at 7.9 million per year; Apple at 48 million per year.

      So by your logic microsoft shouldn't even be trying with the kinect! They should just shut down their games division because no one likes… um wait exactly how do you judge relevant?

      -----(how I came by my numbers)-----

      xbox at 24 million

      xbox 360 at 55 million

      ps3 at 50 million

      ps2 at 150 million (includes sell in)

      playstation and ps one at 102 million

      psp at 67.8 million

      wii at 86.01 million

      3ds at 3.61 million

      ds at 146.42 million

      game cube at 21.7 million

      N64 at 32.9 million

      super nintendo at 49.1 million

      nes at 61.91 million

      entire gameboy line at 200 million

      virtual boy at 0.77 million

      all iphone versions 108.62 million

      all ipod touch 60 million

      all ipad 25 million

  25. Anonymous Coward
    Anonymous Coward

    C++ vs other languages

    I love C++, because I can write large, complex applications for some projects but if I want to bring a program up on bare metal I can do it in just a few lines of assembler. You just set your stack, call all your static constructors and that's about it.

    It's a pity, because C++ v1.0 only really needed operator overloading, out-of-scope destructor calling, and the class notation. It could lose inheritance, exceptions, RTTI and the STL and still be incredibly useful, even as a SIL. I'm guessing that while the C++ standards committee were dotting every I and crossing every T, huge amounts of insecure C code got written. Perhaps if they had just produced a decent string class with everything you need - like CString's Get/ReleaseBuffer(), Python's startswith() and endswith(), the ability to construct from either Unicode or ASCII strings like CComBSTR, perhaps a default 1k string buffer and a copy-on-write system for when it overflows - then maybe it would have been just too useful not to use, and you wouldn't get all these dyed-in-the-wool C hackers coming up with rather specious arguments against C++. Linus might have used it for his kernel, and a bunch of other developers wouldn't be put off by the 'space-age' features that people seem to like abusing, damaging readability and sometimes even run-time efficiency.

    I know you can do all kinds of clever things with the Boost libraries, but that's not really the point. Python, Java and other languages, even if slower, seem to give you what you need instead of what some academic types think you want, and that's what I see as the main problem with C++.

    I'm not looking forward to a new C++ compiler. It looks like it'll give me virtually nothing I'll use so I may as well stick with the threading abstractions I'm already using, and compilers that I know and understand for all the major platforms.

    1. Anonymous Coward
      Anonymous Coward

      Quite right

      I think Bjarne might well agree too - I remember a talk of his (sponsored by Microtec, remember them?) entitled "C++ as a better C"; in it he described the usefulness of the C++ features that match v1.0 as you describe it, pretty much.

      template metaprogramming? I get DSL, I *really* do, but ... sigh ... what's wrong with lex and yacc? *really*?

  26. vic 4

    Copyright violation?

    Eh, is the ISO standard not subject to copyright protection? Has whoever is hosting that PDF got permission to distribute it for free? I had to fork out about $30 for my copy many years ago.

  27. Martin

    Time to ask Verity what she thinks....

    It would be nice to see an updated version of this.

  28. Vanir

    @Tom Reg

    "My experience is that with C++ you spend a lot time on the language and not working on the problem."

    My experience is that programmers work on the problem and the code at the same time - that's the problem.

    1. Anonymous Coward
      Anonymous Coward


      "My experience is that programmers work on the problem and the code at the same time - that's the problem."

      Sometimes you don't know what the problems will be until you start to write and test the code.

      Sure, you can nail down the low-level design and all the potential issues for some noddy 1,000-line project, but it's a different story when scaled up to 100,000 lines with a dozen coders.

  29. Field Marshal Von Krakenfart
    Paris Hilton


    The Tiobe survey is complete ++ungood

    Cobol only marginally more popular than Fortran, cobblers!

    Paris, the only compile-her that can deal suckessfully with dangling pointers

  30. Anonymous Coward

    Remember the reason for C++

    There are a lot of people on this forum bitching about issues with C++, where really they mean issues with C. Things like buffer overruns, NULL pointer problems, etc.

    I invite all those people to stay with their managed languages. The reason C++ is still so hugely popular is simple. It is the only language where you can get raw speed combined with object orientation. It also has the best generics solution of any language. Since the standardisation of templates in 1998, other languages have been playing catch-up. Templates and operator overloading were what persuaded me C++ was the language to use back in 1992, since then the raw speed has kept me there.

    In return for the raw speed there are lots of things you have to think about which you can ignore in other languages. Memory management and raw pointers are the main ones. But you know what, these are the reasons you can get such blazing speed out of C++. To quote Spider-Man, "With great power comes great responsibility."

    There are a lot of things I'm looking forward to in C++0x. The article concentrated on the memory model, which I think is good because this is one of the things we have all been waiting for. Rvalue references and move constructors are very interesting because they essentially allow you to write even faster code, especially when using a class with overloaded operators. Type inference (the auto keyword) will help to make a lot of template code more readable. It also makes some templatised code possible where it wasn't before. Range-based for will make working with the STL more comfortable. There are a bunch of template improvements I'm looking forward to: variadic templates and template aliases are the most obvious.

    My biggest concern is not about security vulnerabilities. It is actually that C++ is getting difficult to learn. When I was a student in the 90s everyone learned C or C++. That was good because difficult concepts like pointers were covered. These days lots of universities teach Java and specifically ignore everything to do with pointers. I don't mind that too much, since it means there is an army of programmers capable of doing routine, run-of-the-mill coding. It does mean we are seeing fewer C++ specialists, and as such high-standard C++ programmers are becoming harder to find. What I have started to do is pick out the very best Java or C# programmers and re-tread them as C++ programmers!

    1. Anonymous Coward


      To quote Spider-Man, "With great power comes great responsibility."

      A lot of people seem to forget a couple of points.

      There are a lot of C++ 'coders' out there who still rely on the C elements of the language, who'll use raw arrays instead of an STL collection, etc. This is language abuse. Of course it causes problems when you are actually writing code in one language (C) but calling it another (C++).

      You don't blame the car when a drunk driver has an accident do you?

      For instance, there was a famous class in an application developed by an ancient team at a place I worked. This one 'class' was 10 printed pages long.

      Also, you need to use the right tool for the job. When modelling vast, complex real-world problems, such as mobile phone networks, you don't want to be trying to relate a bunch of raw arrays to each other. A well-developed OO data model provides an easy interface for the higher levels of the application to deal with.

      We redeveloped one such application which, although supposedly written in C++, did not make any use of a coherent OO data model, possibly because it had to pass its data on to a discrete optimiser. This meant that adding a brand new technology to the network required a massive redevelopment. So, to avoid having to do the same thing the next time a new mobile protocol was invented, we re-designed it.

      By making use of templates, inheritance and polymorphism, our data model could be viewed by the higher levels as a mobile phone network, while the optimiser saw it only as a collection of discrete numbers with degrees of freedom to be manipulated. Because of this multiple-perspective view of the data we were able to avoid translating the higher-level data into a view the optimiser could understand. This resulted in a 30% speed-up, which really makes a difference on an optimisation that can take several weeks to run. Not something we could have achieved with Java or .NET, I believe.

      It turned out there were quite a few 'C++' developers in the company who had never used templates and almost never utilised polymorphism or overloading, and so found the solution very difficult to understand. We also supplied highly detailed UML diagrams and documentation of the model, and there were a lot of developers who didn't really understand those either.

      The language, though, did its job and provided us with the tools to produce an efficient, extendable solution to a complex problem. In general it is the level of knowledge of the development community that lets the language down.

      No doubt there will be C++0x programmers coding in C++ (and still probably C) for many years to come.

      1. Anonymous Coward
        Anonymous Coward

        Re: Absolutely

        "There are a lot of C++ 'coders' out there who still rely on the C elements of the language, who'll use raw arrays instead of an STL collection etc.... This is language abuse. Of course it causes problems when you are actually writing code in one language (C) but calling it another C++"

        I actually agree with quite a lot of what you say, but I don't think using C features in C++ is "language abuse". Raw arrays are actually very useful to get access to the blistering speed of the OS. The nice thing about raw arrays is that you can use them just like an STL collection. In fact, in C++0x you can even throw a raw array at the range-based for and it just works.

        Obviously, most people wanting to use an array could get equally good results from an std::vector, but if your data absolutely has to live on the stack for performance reasons then a raw array is fine. Take a look at the implementation of boost.variant for example. Under the hood the storage is a raw char array (with some alignment foo).

    2. Mason Wheeler

      Fact check

      >It is the only language where you can get raw speed combined with object orientation.

      You've obviously never used Delphi. (Which also offers a syntax that mere mortals can understand.)

      >It also has the best generics solution of any language.

      Excuse me? C++ templates have got to be the worst generics solution ever invented. Take a code-generation scheme that, by definition, has to be evaluated to completion by the compiler, and make it Turing-complete? Yes, what a wonderful idea!

  31. ForthIsNotDead
    Thumb Up


    Forth is still around. Does everything you need, and close to the hardware as you'll get without resorting to assembly.

    1. Anonymous Coward
      Anonymous Coward


      wouldn't bring up the first spin of the hw any other way

  32. Yet Another Anonymous coward Silver badge

    There is a world outside Cobol



    Is probably easier to put in as a formula.

    1. Anonymous Coward
      Anonymous Coward

      I get it, so what you're saying is ...

      Different tools for different jobs! Each taken on their own merits?


      jake, are you there? are you listening?

      1. jake Silver badge

        @AC21:13 ... Where did I say I didn't use different tools for different jobs?

        I was commenting on real programming, not the likes of "Farmville" ... If the underlying hardware+OS isn't stable, none of your toys, bells & whistles will mean squat.

  33. Anonymous Coward

    "buffer vulnerabilities ... underlying Intel architecture "

    Rubbish. Nothing to do with x86 architecture.

    Give me a processor and an OS that will (1) allow me to write stuff to the heap (or maybe elsewhere) as data (2) allow me to execute that stuff as instructions. I'm all set for a vulnerability.

    There are lots of processor/OS combinations that permit that kind of thing. Common OSes on x86 are some of them but by no means all of them.

    Go back to CompSci 101 and start again.

    Tom Welsh: nice to see you here again. You should pop by more often.

    Mentions of Occam: just goes to show that there's very little in the world of programming that hasn't already been done, and sometimes done well. Communicating Sequential Processes. Pools and channels. What else could possibly be necessary?

  34. Robinson

    Yea, ok....

    And you people who say the world isn't built on C++ but instead on Java, Python, Cobol or whatever: which language do you think was used to create all of those? C/C++ is the foundation stone of pretty much everything in IT, even your mobile device.

    1. Oninoshiko

      Hello Mr. Wells,

      Cobol first appeared in 1959, C didn't appear until 1973.

      So either you have access to a time machine, or have no idea about the history of computing.

    2. Anonymous Coward

      also ...

      Robinson: you also appear not to have heard of self-booting compilers. It's possible to build a compiler for any language in the language it's intended to compile. Indeed, a complete language should be able to build itself. OSes are like that too.

      But I'm guessing compiler theory and machine code probably aren't your strong suit.

      Here's a quick backgrounder:

      1. Robinson

        Chicken and Egg

        "you also appear not to have heard of self-booting compilers."

        You need to read the section entitled, "the chicken and egg problem".

        1. h4rm0ny

          Re; Chicken and Egg

          Chicken and egg may sound like an insoluble problem in the abstract, and yet the world contains both chickens and eggs somehow. It is possible to write a compiler for a language in the language itself. Well, maybe not in Python, but you can in C.

          1. Anonymous Coward
            Anonymous Coward

            "Chicken and Egg"

            The reality is that you have to cross-compile, or have an equivalent means of generating the 'first' object code, such as running the compiler source on an interpreter to compile itself. How else?

            1. Anonymous Coward
              Anonymous Coward

              The equivalent means being ...

              To have an assembler, coupled with a suitable parser and rule table for your chosen syntax, which then emits many machine instructions per statement. Thus, surely, any programming language can be concocted? Even DOS debug could do that (God help us).

              Probably coma or stroke-inducing but strangely quite satisfying in the end.

              We used to do a lot of this when I was a nipper. That and making bitmaps from a bus ticket and a darning needle borrowed from my gran.

  35. Anonymous Coward
    Anonymous Coward

    C++: an octopus made by nailing extra legs to a dog

    "PL/I and Ada started out with all the bloat, were very daunting languages, and got bad reputations (deservedly). C++ has shown that if you slowly bloat up a language over a period of years, people don't seem to mind as much." -- James Hague

    And for even more C++ quotes, see:

  36. Anonymous Coward
    Anonymous Coward

    Portability is being forgotten here

    Sutter's claim that the standard gave you portability is revisionist history. To conform to a standard, you need a good set of tests; to benefit from a standard, you need compilers that pass them. Plum-Hall may have had the former but most compilers fell woefully short on the latter. Substantial rework was required going from one compiler to another -- even on the same architecture. It was not pleasant.

    I have not programmed in C++ for a while, so I do not know if conformance results are even available nowadays. Back in the day, Sun's compiler was closest, followed by Borland.

  37. Liam Shepherd

    "Quicker, cleaner, Java-ier"

    Oh great, so now every application is going to start taking 10 times as much memory....
