Classy move: C++ 20 wins final approval in ISO technical ballot, formal publication expected by end of year

C++ 20, the latest version of the venerable object-oriented programming language, has been unanimously endorsed in ISO's final technical approval ballot. It awaits a "final round of editorial work," following which it will be formally published, expected to be "towards the end of 2020." A major new version of C++ comes every …

  1. Anonymous Coward
    Boffin

    Is C++ becoming too large and complex?

    Yes, 15 years ago.

    1. jake Silver badge

      Re: Is C++ becoming too large and complex?

      I'd say about 22 years ago ...

      1. EVP
        Pint

        Re: Is C++ becoming too large and complex?

        May I do a couple of bitwise left shifts on the number of upvotes you receive? Assuming MSB left.

    2. Dan 55 Silver badge

      Re: Is C++ becoming too large and complex?

      You don't have to remember all the language, you can forget the bits which are superseded by something newer, better, and/or simpler. E.g. C++11 really simplified a load of stuff.

      1. Blank Reg

        Re: Is C++ becoming too large and complex?

        Unless you're working on long-lived legacy code that makes use of all the latest and greatest C++ features from every release since the dawn of time.

        1. Anonymous Coward
          Anonymous Coward

          Re: Is C++ becoming too large and complex?

          Don’t get me started on the number of features in Ccrapcrap. When I first encountered Ccrapcrap, I thought I was a bit thick when I couldn’t remember all the gazillion ways to do a simple task. Over the years, I realized that it’s not me but that can of worms. Yeah, it must be much better now after 30+ years of development and all, but Ccrapcrap is still the Windows of programming languages. The whole thing is not only a PITA, but a Big-Unlubricated-Throbbing PITA.

          Bring it on.

          1. Someone Else Silver badge

            Re: Is C++ becoming too large and complex?

            When I first encountered Ccrapcrap, I thought I was a bit thick when I couldn’t remember all the gazillion ways to do a simple task

            You were probably right. Someone a bit less thick would only need to remember the easiest and most straightforward way to do a simple task, and then just lather, rinse, repeat....

      2. Ian Joyner Bronze badge

        Re: Is C++ becoming too large and complex?

        The problem with not knowing the whole language is that thing you forgot will come and hit you in the head. This is because of non-orthogonality.

        See my comment on 'Evolution' to see this is not a good thing.

        1. Dan 55 Silver badge

          Re: Is C++ becoming too large and complex?

          So you would rather C++ be unchanging, forever, born in a state of perfection meaning no changes are necessary to the language in the decades ahead to meet programmers' needs?

          All you need to do is look at the relative numbers of users of Eiffel, Smalltalk, Simula 67, or whatever other perfect language you care to mention to see how well that would work out. C++ would turn into one of those languages or another version of Objective C - nobody uses them unless they're forced to.

          1. Anonymous Coward
            Happy

            Re: Is C++ becoming too large and complex?

            So you would rather C++ be unchanging, forever, born in a state of perfection meaning no changes are necessary to the language in the decades ahead to meet programmers' needs?

            Better to dump obsolescence periodically and start again.

            That's why we don't all just use Fortran with loads of new features added.

          2. Ian Joyner Bronze badge

            Re: Is C++ becoming too large and complex?

            "So you would rather C++ be unchanging".

            I don't see what you are getting at at all. Languages should be well designed in the first place. Even well-designed languages can undergo change. Even Eiffel has. Alan Kay has said he expected that Smalltalk would have been improved on by now.

            But C++ started in a particularly bad place and has needed massive improvements, and then you have to follow all the machinations of those 'improvements'. The excuse that C++ is evolving and becoming better is a pathetic excuse for what was not a very good language in the first place.

            C++ rode on the popularity of C, and the popularity of C is also misplaced.

            1. Dan 55 Silver badge

              Re: Is C++ becoming too large and complex?

              As far as I can tell there was no other language around as popular as C which could cope with software of operating-system-level complexity, certainly none of the examples of languages given here. So C was probably the best place for C++ to start.

              If you're saying the popularity of C was/is misplaced then you don't understand why C was and is popular. You still can't write operating systems in Pascal, which at one time supposedly was a contender for C's crown.

              C++ built upon this foundation. It was not perfect, but it got results, exactly like C, which is why C and C++ are popular. Other languages might have feature A, B, and C but not all of them. If they're useful then C++ will incorporate them and carry on. The other languages certainly won't be in any hurry to incorporate each other's features, let alone features from C++.

              C++ does not aspire to be some perfect representation of an object-oriented language. There is no perfect language, not C++ and not the rest. The others either 1) have barely managed to get out of the academic stage, trapped in constant debates about the best way to do things with a moribund user base or 2) are scripted or need a JIT VM or 3) are owned by some megacorp. C++ is the only language which is not owned by anyone, compiles into object code, and has new features which the user base are pushing for. The ones that are getting somewhere (Go, Rust) are getting somewhere due to their similarity to C++.

              1. AddieM

                Re: Is C++ becoming too large and complex?

                Can't write an operating system in Pascal? I beg to differ.

                https://en.wikipedia.org/wiki/Classic_Mac_OS

                1. Dan 55 Silver badge

                  Re: Is C++ becoming too large and complex?

                  Pascal could manage a glorified bootstrapper with a GUI, which I guess is an improvement on MS-DOS, but Apple moved away from Pascal with the move towards co-operative multitasking and PowerPC.

                  See also Why Pascal is Not My Favorite Programming Language.

                  1. Ian Joyner Bronze badge

                    Re: Is C++ becoming too large and complex?

                    Dan 55: >>Pascal could manage a glorified bootstrapper with a GUI, which I guess is an improvement in MS-DOS, but Apple moved away from Pascal with the move towards co-operative multitasking and PowerPC.

                    See also Why Pascal is Not My Favorite Programming Language.<<

                    Apple moved to NeXT and OS X. So?

                    And the Kernighan paper – most of it I agree with. But most C people take it too widely and think that it applies to any language with Pascal-like syntax, and that is an excuse to remain in the C black hole. Kernighan's paper is not an excuse just to ignore everything else – it is about some very specific things about Pascal, very notably having array sizes as part of a type, which is wrong.

                    Kernighan's paper is not permission or exhortation to just ignore everything else.

                    Many of Kernighan's complaints were addressed in commercial Pascal implementations.

                    1. Dan 55 Silver badge

                      Re: Is C++ becoming too large and complex?

                      Nope, I was talking just about Pascal.

                      Apple moved away from a toy language as they moved away from a toy operating system.

                      The problem is that there is no standard in commercial Pascal implementations, each one solved the many and varied problems in their own way meaning there was no portability, while the language itself remained pristine and unsullied and unusable so the academics were happy.

                      Meanwhile, in the real world, C and C++ evolved, and all compilers supported the changes (even VC, eventually).

                      1. Ian Joyner Bronze badge

                        Re: Is C++ becoming too large and complex?

                        Dan 55: "while the language itself remained pristine and unsullied and unusable so the academics were happy.

                        Meanwhile, in the real world"

                        You have this false division between academic and real world. While you persist in that false belief and distinction there is no discussion to be had with you. You live in some kind of 'fool's hell'.

                        Programming is an intellectual activity. It is based on computational models and virtual machines, some of which are implemented in electrical circuits.

                        You just want to feel like you can dismiss anything else as not being 'real world'. That is a narrow and silly view.

                        Computing must improve and move on and if that is a move to academic purity, that is a good thing.

                  2. Ian Joyner Bronze badge

                    Re: Is C++ becoming too large and complex?

                    Dan 55 "See also Why Pascal is Not My Favorite Programming Language."

                    There is another thing here that C people treat whatever Kernighan, Ritchie, and Stroustrup have written as some kind of holy scripture and last word on the subject.

                    Like I said, Kernighan has a lot of things right in that paper (which I have read several times over the years), but it is not the last word, and Pascal wasn't where that stream of languages ended. Like most things, people moved on from Pascal. But C and C++ seem to attract the kind of people who refuse to move on.

                    1. Dan 55 Silver badge

                      Re: Is C++ becoming too large and complex?

                      Pascal came up because you were saying the popularity of C was misplaced. I was trying to get the point across that there was no contemporary competition to C which could do the things C could do. At that time, Pascal was C's supposed competition.

                      I don't know, if you don't like C++ and even don't like C but cite languages like ALGOL and Pascal in other posts, it seems it's not me who's refusing to move on.

                      1. Ian Joyner Bronze badge

                        Re: Is C++ becoming too large and complex?

                        " was trying to get the point across that there was no contemporary competition to C "

                        You are living in the narrow world of C. There were plenty of other languages around when C came out. In the 1960s there was much language activity. C people like you like to take the view that C had no contenders. It did.

                        "I don't know, if you don't like C++ and even don't like C but cite languages like ALGOL and Pascal in other posts"

                        Because again that was to contradict your contention there was nothing else. ALGOL was an ancestor of C, and yet, in many ways better. Better and rigorously defined. C was a step backwards. ALGOL was improved by CPL, ALGOL68, Pascal and beyond. C and C++ have become stuck. Even though C++ might seem to have improved with new versions, that is still trying to fix mistakes, not any real improvement.

                        I have more than moved on, but we should not ignore the lessons from the late 1950s and beyond, because those people understood foundations of computing that are lacking in today's practitioners who think programming is down to a couple of flawed languages.

                        1. Dan 55 Silver badge

                          Re: Is C++ becoming too large and complex?

                          C didn't even exist in the 1960s and neither did Pascal! That link was from the 1980s when both were about a decade old and people had worked out each language's weaknesses.

                          Anyway, I should move on, as you say. Do keep carrying on talking about the 1950s.

                          1. Ian Joyner Bronze badge

                            Re: Is C++ becoming too large and complex?

                            "Do keep carrying on talking about the 1950s."

                            What I'm suggesting is we should not forget the language ambitions of those times. They were very far sighted. For example Christopher Strachey's CPL, which could not be implemented at the time, so Martin Richards did BCPL, out of which C came.

                            But C was a compromise at the time both because it was done on limited machines, and because it did not understand the vision of the other languages.

                            So anything based on C is going to fall short of what could be done in languages. C++ tried to fix some of that, but really just added its own level of cruft.

                            Yes, Dan 55, do move on. The rest of us have.

            2. Man inna barrel

              Re: Is C++ becoming too large and complex?

              Languages get messy because life is messy.

              C is actually a very good language, because it provides just enough abstraction from the underlying hardware, but no more than that. If you have ever migrated an embedded software application from device-specific assembler to C, you would appreciate just how useful a C compiler is.

              At the opposite end of the abstraction scale, I have experimented with various higher level languages, particularly functional languages such as Haskell. They are mostly elegant, fantastic at abstraction, and totally useless for any real work. I did manage to get some statistical analysis done in Ocaml, which was quite neat.

              With C++, I am still learning C++11, so I am way behind. I lapped up many of the new concepts, but I still have a lot to learn. This does not appear to be a problem. I could still write code in older style C++ if I wanted to. I could write my code without range-for loops or auto declared types. I am not forced to use these features, but they are handy once you learn about them.

              These complications to the language fix some problems, or simplify writing routine code. I used to write a lot of iterate-over-the-container stuff using the old for-loop-and-iterators idiom. Pretty verbose, and occasionally prone to errors. The simpler range-for syntax makes the code more readable and maintainable. I found the auto type declaration method quite friendly, having got used to type inference in functional languages.
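
              A minimal sketch of the difference being described (the container and values are just illustrative):

              #include <iostream>
              #include <vector>

              int main() {
                  std::vector<int> values{1, 2, 3};

                  // The old idiom: spell out the iterator type and hope begin/end match
                  for (std::vector<int>::const_iterator it = values.begin(); it != values.end(); ++it) {
                      std::cout << *it << '\n';
                  }

                  // C++11 range-for with auto: the same loop with far less ceremony
                  for (auto v : values) {
                      std::cout << v << '\n';
                  }
              }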

              1. Ian Joyner Bronze badge

                Re: Is C++ becoming too large and complex?

                Man inna barrel

                Re: Is C++ becoming too large and complex?

                "Languages get messy because life is messy."

                No, that is absolutely the wrong approach to languages.

                Yes, the problems are complex. The complexity (the essential complexity) should remain in the problems. We should solve the problems with simple tools – paradoxical as that may sound, it is true – not accept that "Languages get messy because life is messy." When the languages get messy, that is accidental complexity, and you end up wasting a lot of time dealing with complexity that is not necessary.

                "C is actually a very good language, because it provides just enough abstraction from the underlying hardware"

                Which underlying hardware are you talking about? This is actually not true, but one of the simplistic maxims of C. I have ported large C systems (from well-known vendors) to platforms that C does not abstract well at all. C needs its own sandbox in that case. Mind you I knew the internals of that C compiler (written in Pascal!) and it was very well implemented.

                "With C++, I am still learning C++11, so I am way behind. I lapped up many of the new concepts, but I still have a lot to learn."

                Well, why this process of learning each step of C++? This is why I say that to understand C++, you actually need to understand its history and evolution, much more than with any other language.

        2. Tom 7

          Re: Is C++ becoming too large and complex?

          I've always found computing to be non-orthogonal. OK, I grew up with lots of different languages, but C and C++ were my main weapons, and C++ has its history writ large in its syntax, but it's all logical as far as I can see. There's a rational use-case for everything in it; it might not be your rational use-case, but you are free to write your own language, which will magically become as complex as C++ once you've added solutions to all the rational use-cases, and as 'ugly' to the majority of users.

          If a language is simple enough for you to know all of it then it's not complicated enough for modern use. If you are using bits of a language you don't understand, RTFM.

          1. Ian Joyner Bronze badge

            Re: Is C++ becoming too large and complex?

            "If a language is simple enough for you to know all of it then its not complicated enough for modern use."

            That is complete nonsense and completely misunderstands computing, programming, and languages.

            Languages and tools should be simple – it is the problems we express in those that may be complex.

            Complex solutions are most often wrong. And that is C++.

            1. Anonymous Coward
              Devil

              Re: Is C++ becoming too large and complex?

              > Complex solutions are most often wrong. And that is C++.

              I am sorry C++ didn't work out for you. You could always try PHP.

    3. Duncan Macdonald

      Re: Is C++ becoming too large and complex?

      AGREED

      The biggest problem with C++ is that it is too easy to make "write only code", i.e. code that no one except the author understands. The overloading of operators often makes it difficult to read a routine and understand it without reading the whole source (including all included files). With more contained languages (FORTRAN, C, BASIC etc.) a subroutine could normally be understood on its own without needing to read thousands of lines in other routines. (When Z=X+Y; can mean anything from a simple addition of two numbers to merging two arrays depending on the types of X, Y and Z, following the source gets difficult.) In my opinion a well written program is one that another programmer can easily take over without relying on external documentation. C++ makes it too easy to write programs that no one except the author can understand. (Sometimes even the original author cannot understand the program!!!)
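
      A small sketch of that point (the Array type and its overload are hypothetical): the same-looking Z = X + Y line does completely different things depending on the declared types.

      #include <vector>

      struct Array {
          std::vector<int> items;
          // Hypothetical overload: '+' here concatenates the arrays, it does not add element values
          Array operator+(const Array& other) const {
              Array merged = *this;
              merged.items.insert(merged.items.end(), other.items.begin(), other.items.end());
              return merged;
          }
      };

      int main() {
          int x = 1, y = 2;
          int z = x + y;              // plain integer addition: z == 3

          Array X{{1, 2}}, Y{{3, 4}};
          Array Z = X + Y;            // same-looking line, calls Array::operator+ defined elsewhere
          return z - 3 + static_cast<int>(Z.items.size()) - 4;   // 0 if both behaved as commented
      }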

      1. Anonymous Coward
        Anonymous Coward

        Re: Is C++ becoming too large and complex?

        I've seen people write C++ code where everything is operator overloaded everywhere, because they wanted to make sure that the person who eventually replaced them would go insane trying to understand what was happening and how.

      2. Someone Else Silver badge

        @Duncan Macdonald -- Re: Is C++ becoming too large and complex?

        Interesting. In your first sentence, you lament that C++ makes it easy to write "write only code"...then only two sentences later, hold up FORTRAN, C and BASIC as counterexamples.

        Don't bogart that joint, my friend....

        BTW, "write-only code" is not a function of the language, but of the language practitioner. Remember, you can write FORTRAN in any language....

      3. Abominator

        Re: Is C++ becoming too large and complex?

        You have not used PERL then.

    4. Tom 7

      Re: Is C++ becoming too large and complex?

      That will be because computing is too large and complex for you.

    5. Anonymous Coward
      Go

      Re: Is C++ becoming too large and complex?

      No language can really be considered complete unless it has at least half a dozen different ways to declare and initialise a variable.

      int a = 42;

      int a(42);

      int a{42}; (but don't use {} when you mean (), and vice versa)

      auto a = 42;

      auto a = int(42);

      auto a = int{42}; (ditto)

      Have I missed any?

      1. Anonymous Coward
        Anonymous Coward

        Possibly...

        The additional complexity with things like std::vector and std::initializer_list?
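
        For instance (a minimal sketch), the same arguments give different vectors depending on whether braces or parentheses are used:

        #include <cassert>
        #include <vector>

        int main() {
            std::vector<int> a(3, 1);   // "count, value" constructor: three elements, {1, 1, 1}
            std::vector<int> b{3, 1};   // initializer_list constructor: two elements, {3, 1}

            assert(a.size() == 3 && a[0] == 1);
            assert(b.size() == 2 && b[0] == 3);
        }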

      2. Dan 55 Silver badge

        Re: Is C++ becoming too large and complex?

        Well, you've missed what auto is for.

        1. Anonymous Coward
          Anonymous Coward

          Really?

          Q: Do the following have the same behavior?

          A a1 { 0, 1 };

          A a2 ( 0, 1 );

          A: It depends on what constructors are provided. They will not if (e.g.) the following signatures are available:

          A::A( std::initializer_list< std::size_t > );

          A::A( std::size_t, std::size_t );

          How does auto help there?

          1. Dan 55 Silver badge

            Re: Really?

            It wouldn't, because a1 wouldn't compile. Your point being that auto is terrible because you can't sprinkle it everywhere or something?

          2. Anonymous Coward
            Trollface

            Re: Really?

            It wouldn't, because a1 wouldn't compile.

            Why do you say that? It works for me.

            #include <cstddef>
            #include <initializer_list>

            class A {
            public:
                A(std::size_t, std::size_t) {}
                A(std::initializer_list<std::size_t>) {}
            };

            int main() {
                A a1{0, 1};
                A a2(0, 1);
            }

            $ g++ --std=c++2a -Wall a.cpp
            $

            1. Dan 55 Silver badge

              Re: Really?

              "auto a1{0,1};" won't compile from C++17 onwards. AC was asking how auto helps.

              1. Anonymous Coward
                Meh

                Re: Really?

                "auto a1{0,1};" won't compile from C++17 onwards. AC was asking how auto helps.

                "auto a1{0,1};" can never compile, because the compiler can never deduce what type a1 is. You'd need "auto a1 = A{0,1};"

                And if the initializer_list version of the constructor has been declared, unwary programmers will then wonder why their programs don't work as expected.

                1. Anonymous Coward
                  Anonymous Coward

                  Re: Really?

                  Exactly - code that doesn't compile is never a problem, but code that does and behaves in an unexpected way is.
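
                  A minimal sketch of that trap, extending the A from the snippet above with output so you can see which constructor actually runs: with both overloads declared, braces quietly prefer the initializer_list one.

                  #include <cstddef>
                  #include <initializer_list>
                  #include <iostream>

                  struct A {
                      A(std::size_t, std::size_t) { std::cout << "two-argument constructor\n"; }
                      A(std::initializer_list<std::size_t>) { std::cout << "initializer_list constructor\n"; }
                  };

                  int main() {
                      A a1{0, 1};   // braces strongly prefer the initializer_list overload
                      A a2(0, 1);   // parentheses pick the (size_t, size_t) overload
                  }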

        2. Anonymous Coward
          Anonymous Coward

          Re: Is C++ becoming too large and complex?

          Well, you've missed what auto is for.

          No, I haven't missed what auto is for. I do a great deal of C++ programming, and given the choice I usually follow Herb Sutter's AAA style because it should be the compiler's job to deal with tedious details like what types to use.

          In fact, I wish that auto wasn't broken in places - for example:

          struct Fred { auto i = 43; };

          Despite all the C++ programming I do, I still think it has become a bloated dog's dinner of a programming language.
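
          For readers unfamiliar with it, a rough sketch of the AAA ("Almost Always Auto") style being referred to - the types here are only examples:

          #include <memory>
          #include <string>
          #include <vector>

          int main() {
              auto count = 42;                                   // deduced as int
              auto name  = std::string{"widget"};                // deduced as std::string
              auto items = std::vector<int>{1, 2, 3};            // deduced as std::vector<int>
              auto owner = std::make_unique<std::string>(name);  // deduced as std::unique_ptr<std::string>
              return (count == 42 && items.size() == 3 && *owner == name) ? 0 : 1;
          }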

          1. Kristian Walsh Silver badge

            Re: Is C++ becoming too large and complex?

            struct Fred { auto i = 43; };

            Why would you ever want to do this? If you're not able to be specific about the datatypes used in a DECLARATION, then maybe look at using Python or something. No typed language with type inference allows this kind of "yah, whatever you're having yourself" declaration.

            If I were maintaining that code, even if that syntax were possible, I would switch it back to int, so that the intention of the programmer is clearly visible.

            Or, to put it another way, what I like about C++ is that I don't have to deduce what could be in a data structure by looking at field assignments.

            1. Anonymous Coward
              Happy

              Re: Is C++ becoming too large and complex?

              Why would you ever want to do this? If you're not able to be specific about the datatypes used in a DECLARATION, then maybe look at using Python or something. No typed language with type inference allows this kind of "yah, whatever you're having yourself" declaration.

              Because that is the whole point of auto - deduction of a type from an initializer. It is for exactly that purpose - to allow you not to be specific. And can you make sure you get the return type of foo() right, even though it is dependent upon the machine architecture and the foo header file/library version - your code has got to run everywhere!

              #include <foo.h>

              int main() {
                  struct Fred { auto x = foo(); };
                  auto y = foo();
              }

              The first auto won't compile, but the second will. Madness.

              1. Kristian Walsh Silver badge

                Re: Is C++ becoming too large and complex?

                No madness, just perhaps a misunderstanding of what a declaration is for, and an attempt to misuse a keyword for its side-effects.

                Because that is the whole point of auto - deduction of a type from an initializer. It is for exactly that purpose - to allow you not to be specific.

                You can't keep hand-waving all the way to the metal. At some point, you have to be specific, and for a variety of reasons, allocated class member types are one of those places.

                Incidentally, you are allowed to use auto to declare a field of a class/struct, but only where that field is both static and const (and the initializer is a constexpr or literal). static const fields are special, though: allocated and assigned just once and shared by every instance (sometimes they don't even get allocated unless your code tries to get the field's address). This single point of initialisation may give you an idea why they can easily be allowed to use auto, while class members are not... hint: consider how you'd declare a constructor that sets your Fred::x field.
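
                A minimal sketch of that distinction (using constexpr, the most portable spelling of the static-and-const case):

                struct Fred {
                    // auto i = 43;                   // ill-formed: an ordinary data member cannot use auto
                    static constexpr auto i = 43;     // fine: one shared value, initialised in exactly one place
                };

                int main() {
                    return Fred::i == 43 ? 0 : 1;
                }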

                Even if it didn't cause problems for code generation, your desired solution would only work for fields that have a default initializer, and thus it is only half a solution for the legitimate requirement of needing a field that can always contain the return-value of a given function. Luckily, the people standardising the language did consider this, and so if you need a field type that always matches the return type of a function, you should be using decltype(), which can be used regardless of initialization status. (auto can be seen as a special-case of decltype() for use when declaring variables and constants)

                struct Fred { decltype("gotcha") x; decltype(foo()) y = foo(); };

                ... and this syntax also allows you to write a constructor, or any other function, whose argument type will match the return-type of a function, or match the type of any other declared constant.

  2. Dinanziame Silver badge
    Go

    An important part of this evolution is that some features of the language should never be used, ever again. They might remain here for backward compatibility, but their use should gradually be eliminated, and if possible be removed altogether, like trigraphs (hallelujah).

    1. mevets

      Supplementary...

      Many of the new features should never be used, ever. Remove the +'s, and you have a competent, core language.

      1. Anonymous Coward
        Anonymous Coward

        "Competent, core language"

        If you mean 'C', then that is far from perfect with one of the main drawbacks being the unwillingness of WG14 to ever break legacy code. For example, the relational operators still return type int even though the language has had bool (though that is just an optional alias for _Bool) since 1999.

        There are many other areas that would benefit from improvements - character handling and I/O, for example, where it is very, very easy to fall into undefined-behaviour (though that happens in lots of other places as well). However, any improvement would break that mythical program that was written pre-standardization that is still in use.

        1. jake Silver badge

          Re: "Competent, core language"

          You are quite correct, C is not a perfect language. There aren't any.

          However, C is a very good first approximation of a systems programming language. In fact, it is such a good first approximation that nobody has bothered with a second, at least not with any great degree of sincerity.

          Why not? Simple answer: Running code trumps all.

          1. Ian Joyner Bronze badge

            Re: "Competent, core language"

            "You are quite correct, C is not a perfect language. There aren't any."

            So you can invent any old garbage you want because you can always say others have small flaws? That is a non sequitur.

            "However, C is a very good first approximation of a systems programming language."

            Actually, C is a terrible systems language. A systems language should only be used for systems, not for applications. Where a systems language is used for applications, it means the systems programmers have failed to do their job. They probably don't even know that is their job.

            "In fact, it is such a good first approximation that nobody has bothered with a second"

            Burroughs system languages blow C out of the water and make C look like the toy it is. I know your response will be 'who uses Burroughs'. But the one thing about Burroughs system languages is that you only use them for the OS, or middleware, not everywhere as is expected of C.

            Also Burroughs languages were around half a decade before C. Did you ever stop to wonder where the idea of writing system software in structured languages comes from, and where #define came from? From Burroughs, from an idea Donald Knuth gave them.

            "at least not with any great degree of sincerity"

            Oh, Burroughs languages are far more sincere than C. There have been many more sincere languages than C or C++, but they get roundly bashed by C/C++ people. Others are getting sick of C and C++ – they are now old and tired.

            1. Dan 55 Silver badge

              Re: "Competent, core language"

              By Burrows system languages you mean ALGOL. And then you complain that C++ is old and tired.

              1. Ian Joyner Bronze badge

                Re: "Competent, core language"

                "By Burrows system languages you mean ALGOL. And then you complain that C++ is old and tired."

                Well, you don't even bother to spell Burroughs right.

                ALGOL is still a more modern language than C or C++. But ALGOL moved on. It moved to Simula, ALGOL-68, Pascal, and then many more. C and C++ are just stuck in the same quagmire.

                The worst technologies lead to lock in. That was the case with IBM, and certainly is now true for C and C++. The others have moved on.

                Even so, the Burroughs languages make C look like a toy.

                1. Dan 55 Silver badge

                  Re: "Competent, core language"

                  Twas my autocorrect, but Burrows as in head in the sand might be appropriate.

                  Pascal is nice to teach programming with, but can't be used in the real world. Delphi and Lazarus are not standard Pascal because standard Pascal is too limiting, but even so you still can't write something like an OS with them.

                  Seriously not getting your complaint about C/C++ being old, stuck in a quagmire, etc... and then going on to quote languages which are fossilised or only of interest inside academia.

                  1. Anonymous Coward
                    Anonymous Coward

                    'you still can't write something like an OS with them'

                    Just because the compilers and linkers were never written with that in mind, thereby they don't output the required file formats. There's nothing in the language forbidding it.

                    Anyway they got 'modules', aka 'units', more than thirty years before C++...

                    1. Dan 55 Silver badge

                      Re: 'you still can't write something like an OS with them'

                      If it's standard Pascal then units aren't compiled separately but end up in one monolithic executable. You might be looking for library (non-standard).

                  2. Ian Joyner Bronze badge

                    Re: "Competent, core language"

                    "Pascal is nice to teach programming with, but can't be used in the real world"

                    Absolutely wrong. I have extensively used Pascal. Now most universities had Pascal IMPLEMENTATIONS that only needed to support students. Both Apple and Burroughs had real-world implementations of Pascal that were extremely useful.

                    That is not to deny that Pascal's type system was a bit severe and better has been done since. But that does not excuse C's almost complete lack of types apart from int, float (over BCPL). But Wirth moved on to take the lessons from Pascal and create Modula and Oberon.

                    Others took lessons from Simula and created Smalltalk and Eiffel. C++ took Simula and corrupted it. I don't think at the time Stroustrup had much clue.

                    The world moved on, but C remained in a time warp along with its followers who believe these myths that nice looking languages are just that way for teaching.

                    "Seriously not getting your complaint about C/C++ being old, stuck in a quagmire, etc... and then going on to quote languages which are fossilised or only of interest inside academia."

                    In that you prove your lack of understanding and narrow view of computing. You need to get rid of this false classification of languages into real world vs teaching or academia. C and C++ have become fossilised even though C++ comes out with a new standard every three years – but that is still trying to fix the mistakes and move towards those other languages that you consider 'fossilised'.

                    And you should speak for yourself. Many people in the industry are now questioning using this complex language where the complexity is accidental – that is part of the language itself, rather than essential complexity, which is part of the problem.

                    Your response and other responses are based on the cult of C-think of 'other languages are just for beginners', etc. It cannot be simpler – you are wrong.

                    1. Dan 55 Silver badge

                      Re: "Competent, core language"

                      None of the languages you cite are getting traction in the commercial world and you miss all the languages that provide real competition like Java or are up-and-coming like Go and Rust. But apparently I am wrong.

                      1. Ian Joyner Bronze badge

                        Re: "Competent, core language"

                        "None of the languages you cite are getting traction in the commercial world and you miss all the langagues that provide real competition like Java or are up-and-coming like Go and Rust. But apparently I am wrong."

                        See that is what always happens, people like Dan 55 fall back on "oh it's not used widely". That is the problem of today's populism. People like Dan 55 can't break out of the mould and too scared to admit the flaws in C and C++ and can only keep beating the same drum.

                        1. Dan 55 Silver badge
                          Coffee/keyboard

                          Re: "Competent, core language"

                          Hilarious. Keep on claiming Smalltalk, ALGOL, Simula, Smalltalk, Eiffel, Pascal, Oberon, etc... are all perfectly designed, that C/C++ is terribly flawed beyond redemption, that C/C++ programmers don't know they can shoot themselves in the foot, that real-world usage which is so little it doesn't get all of these languages combined out of the "Others" column means nothing and certainly not languages which are just of academic interest, and that actual real-world usage to solve real-world problems is populism. Just don't do it while I have my morning coffee please.

                          1. GrumpenKraut

                            Re: "Competent, core language"

                            Dan 55: you are talking to a brick. See his comment history.

                          2. Ian Joyner Bronze badge

                            Re: "Competent, core language"

                            "Hilarious. Keep on claiming Smalltalk, ALGOL, Simula, Smalltalk, Eiffel, Pascal, Oberon, etc... are all perfectly designed"

                            No, I did not say that. However, it does not matter what people think – C++ is a terribly designed language.

                            Dan 55 is descending to ranting.

            2. Doctor Syntax Silver badge

              Re: "Competent, core language"

              "A systems language should only be used for systems, not for applications. Where a systems language is used for applications, it means the systems programmers have failed to do their job."

              I think there's a couple of non sequiturs there.

              1. Ian Joyner Bronze badge

                Re: "Competent, core language"

                Doctor Syntax:

                Re: "Competent, core language"

                ""A systems language should only be used for systems, not for applications. Where a systems language is used for applications, it means the systems programmers have failed to do their job."

                I think there's a couple of non sequiturs there."

                You are wrong. And you don't even bother to note what the non sequiturs are.

                I'll say it again, because it is right:

                A systems language should only be used for systems, not for applications. Where a systems language is used for applications, it means the systems programmers have failed to do their job

                And more than failed to do their job – it probably means they don't even understand what their job is.

                1. jake Silver badge

                  Re: "Competent, core language"

                  Wait, what? If a systems language is used for applications, the systems programmers have failed to do their jobs? How on Earth is it the system programmer's fault if an application programmer chooses an/the incorrect language for the application? And how the fuck does it follow that the system programmer doesn't understand their job, based of what some other person does?

                  Your illogic is mind boggling.

                  1. Ian Joyner Bronze badge

                    Re: "Competent, core language"

                    Jake: "Wait, what? If a systems language is used for applications, the systems programmers have failed to do their jobs? How on Earth is it the system programmer's fault if an application programmer chooses an/the incorrect language for the application? And how the fuck does it follow that the system programmer doesn't understand their job, based of what some other person does?

                    Your illogic is mind boggling."

                    OK, you don't understand what the fundamental goal of system programming is. That is to provide a platform free of the underlying considerations of hardware and 'bare metal', as C programmers love to call it. That is, any hardware timing concerns that might affect the outcome of computations are handled, along with any other foibles of the hardware platform. Those details are abstracted away.

                    Similarly in distributed systems, middleware abstracts from the differences between systems to provide a single consistent view.

                    Put even more simply, the goal of any level of software is to provide a strong and consistent abstraction.

                    When system programmers think other programmers should be using C, exposing the lower level details that should have been handled, it means the system programmers have not done their job.

                    I'm sorry, if I wasn't clear on that before, and I hope that explains it to you.

                    1. mevets

                      Re: "Competent, core language"

                      Bollocks; I've been paid to write kernel code for the past 30 years. It is precisely the ability to have a predictable response from operations that distinguishes a language useful for manipulating low level facilities from one that isn't.

                      Case in point: an Ottawa firm, once renowned as a tech darling for its PBXes, voicemail and lawnmowers, wrote its entire system in a lightly extended Pascal in the early 80s. Hopelessly stuck, by early 2000 its eventual solution was to run a 68k emulator on a SPARC server -- there was no other reasonable path to delivering the functionality on modern hardware.

                      About the same time as that company had started doing its thing, Sun had been rolling out 68k workstations. By the time the emulated solution was delivered, Solaris had shipped on several 68k editions, 4 editions of SPARC (including a 64-bit one), and moved from 1 CPU to 64 physical processors, all written in C plus a small bit of machine language. In 2001, for kicks, I ran /bin/ls from a 1988 SPARC-1 compiled binary on a 64-bit E10k; it still worked.

                      If 'application' developers choose a lower level language, the fault is clearly in the insufficiency of application level languages; be it conceptual inanity, runtime overhead, or quality of implementation - a trifecta best called C++. Mercifully languages like Go may provide an antidote.

          2. Doctor Syntax Silver badge

            Re: "Competent, core language"

            "it is such a good first approximation that nobody has bothered with a second"

            If you count BCPL & B then it's the 3rd. But it looks like that's it.

          3. Anonymous Coward
            Joke

            Re: "Competent, core language"

            "it is such a good first approximation that nobody has bothered with a second"

            Rust!

            1. jake Silver badge

              Re: "Competent, core language"

              I don't trust Rust. It's difficult to write proper Fortran and COBOL in it ... which is not exactly a good sign for a language that claims to be "multi-paradigm".

              Besides, it's intentionally as much of a clusterfuck as C++ ... and for all the same reasons.

  3. Jay Lenovo
    Angel

    ...and that C++ 23 will be better still

    I would certainly hope so.

    Software is seemingly always one update from better,

    until it's not.

    1. This post has been deleted by its author

    2. Anonymous Coward
      Anonymous Coward

      Re: ...and that C++ 23 will be better still

      Are they going to add the 'Kitchen Sink' operator as well then?

      1. Anonymous Coward
        Anonymous Coward

        Re: ...and that C++ 23 will be better still

        I think they added that in 14 tbh. Now they're building an extension with marble kitchen tops and a jacuzzi.

      2. Someone Else Silver badge

        Re: ...and that C++ 23 will be better still

        Are they going to add the 'Kitchen Sink' operator as well then?

        I'm still waiting for the 'comefrom' statement...

  4. A.P. Veening Silver badge

    Three-way comparison - long overdue

    There are many other new features, including the “spaceship operator” <=> which does a three-way comparison and returns less, equal or more.

    As the title already says, long overdue. It was already available in RPG some 60 years ago.
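
    For reference, a minimal sketch of what the C++ 20 operator looks like in use (the Point type is only an example); one defaulted operator<=> also gives the type its ==, !=, <, <=, > and >= comparisons:

    #include <compare>

    struct Point {
        int x;
        int y;
        // One defaulted three-way comparison generates all six relational operators for the type
        auto operator<=>(const Point&) const = default;
    };

    int main() {
        Point p{1, 2}, q{1, 3};
        return (p < q && p != q) ? 0 : 1;   // member-wise, lexicographic comparison
    }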

    1. Ken Shabby
      Headmaster

      Re: Three-way comparison - long overdue

      FORTRAN Arithmetic IF, from the late 1950's

      IF (expression) negative,zero,positive

    2. Tom 7

      Re: Three-way comparison - long overdue

      But others will tell you that's too complex.

    3. Anonymous Coward
      Anonymous Coward

      Re: Three-way comparison - long overdue

      Given that X <=> Y can be written as (X > Y ? 1 : (X < Y ? -1 : 0)) it was hardly a vital addition to the language. It's yet more syntactic sugar C++ didn't need.

      1. Someone Else Silver badge

        Re: Three-way comparison - long overdue

        If you think that eye-and-brain-pollution expressed as (X > Y ? 1 : (X < Y ? -1 : 0)) is good coding, you don't (and will never) get Stroustrup's point about evolving the language to make it easier (and clearer).

        Of course, you can still continue to write that antediluvian garbage, and continue to dislocate your shoulder patting yourself on the back about what fine code that is. It's nice that the framers continue to allow one to rest on his/her nettles laurels.

        1. Anonymous Coward
          Anonymous Coward

          Re: Three-way comparison - long overdue

          Unless you're a complete numpty and/or you're new to C/C++, it is pretty damn clear. What would you prefer, a load of if-else clauses that look pretty? You stick to BASIC or whatever simple syntax language floats your boat and let the grown-ups carry on with C++.

  5. James Anderson

    Nearly 30 years on

    And still trying to get it right.

    Problem : Bloated over complex language ...

    Solution : More complexity.

  6. Ian Joyner Bronze badge

    C++ – never classy

    It has been over 30 years since the original 'C with Classes', which was just a bunch of #defines to emulate the classes of better OO languages (mainly Simula). I did the same with ALGOL (using the Burroughs define … #, which is where defines came from in the first place, from Don Knuth's suggestion), but really it was not a good way to do things.

    The whole idea of merging OO with C was not classy, but tasteless. More than 30 years later they are still trying to get this language right. I contend IT NEVER WILL BE.

    Instead of getting things right C++ just perpetuates the things that are wrong – the things that are wrong with both system programming based on C and application programming (which should have nothing to do with C).

    C made compromises back in 1969, excusably because they wanted things that would work in a small environment. That soon became irrelevant. Other compromises were inexcusable.

    C++ tries to fix some of these compromises but perpetuates others. To do this is not at all classy, but tasteless.

    1. Dan 55 Silver badge

      Re: C++ – never classy

      Then again, a language that doesn't change is dead, that goes for both computing and spoken languages.

      1. Ian Joyner Bronze badge

        Re: C++ – never classy

        "Then again, a language that doesn't change is dead"

        I agree that languages can improve. But C++ changes to fix the mistakes – that is different and not a measure of anything good.

        "that goes for both computing and spoken languages"

        No that is not a good analogy. Programming languages are very different to spoken languages.

        1. Dan 55 Silver badge

          Re: C++ – never classy

          Can't see why. They both express concepts and change to express new concepts... or die from lack of use.

          1. Ian Joyner Bronze badge

            Re: C++ – never classy

            "Can't see why. They both express concepts and change to express new concepts... or die from lack of use."

            Because you seem locked in to the C-think cult. C changes very little.

            https://www.quora.com/Will-C-get-new-features-in-the-future

            C++ just keeps trying to fix the flaws.

            http://trevorjim.com/c-and-cplusplus-are-not-context-free/

            https://cacm.acm.org/magazines/2018/7/229036-c-is-not-a-low-level-language/fulltext

            https://dl.acm.org/doi/pdf/10.5555/1352383.1352420

            Like most bad technologies, they become black holes of lock in. With other languages and technologies, the practitioners seem to be able to move on.

            1. Dan 55 Silver badge

              Re: C++ – never classy

              You seem to have searched for "things I don't like about C" in Google and pasted a bunch of links.

      2. Someone Else Silver badge

        Re: C++ – never classy

        ...and that includes French....

      3. This post has been deleted by its author

    2. mevets

      Re: C++ – never classy

      The logical successor is Go, except for the lowest level machine interfaces, in which C + asm shine. C++ 1.2 (circa 1990) was the last reasonable version, and even that was riddled with 'half thought out ideas, implemented half well', to paraphrase a luminary.

      1. Ian Joyner Bronze badge

        Re: C++ – never classy

        "except for the lowest level machine interfaces in which C + asm shine"

        Not at all. C is now constraining and hobbling processor design in the same way it has hobbled programming for a long time.

        https://cacm.acm.org/magazines/2018/7/229036-c-is-not-a-low-level-language/fulltext

    3. Someone Else Silver badge
      Pint

      Re: C++ – never classy

      (using Burroughs define … #, where defines came from in the first place from Don Knuth's suggestion)

      Well then, I bet you're crying in your beer that C++ has effectively deprecated #defines with modules....

      (Here, have another... -----> )

      1. Ian Joyner Bronze badge

        Re: C++ – never classy

        Someone Else: "Well then, I bet you're crying in your beer that C++ has effectively deprecated #defines with modules...."

        No, the Burroughs defines were argued about when they were first included. They were much more powerful than the weak C defines, with several levels.

        It was very much text-based processing along the lines of the General Purpose Macrogenerator (which one of my language teachers worked on).

        https://en.wikipedia.org/wiki/General-purpose_macro_processor#General_Purpose_Macrogenerator

        Amazing what you find. Here is an article by Andrew Herbert (who I knew at ISO ODP meetings) on GPM:

        https://www.computerconservationsociety.org/software/elliott903/more903/Manuals/CCS%20Resurrection%20GPM%20Article.pdf

        Strachey then developed CPL, which became BCPL, B, and C.

        A couple of us Burroughs people did a language called SDL on the Apple II which was a cut down (because it had to be) of Burroughs and Elliott ALGOL. The main language designer did not have define # in it – he invented a rather neat macro mechanism to directly put in 6502 assembler code for things not worth doing in the language. However, in a bigger system (which we have these days) you would put such systems stuff directly in the language.

        Now here's the thing. People who have used such other languages and systems seem to be able to move on. Sadly the same is not true for C and C++ people who seem to become rusted on to the deficiencies and bad ideas in C and C++.

        1. Someone Else Silver badge

          Re: C++ – never classy

          Now here's the thing. People who have used such other languages and systems seem to be able to move on.

          Yes, and they seem to have moved on to C and C++....

          1. Ian Joyner Bronze badge

            Re: C++ – never classy

            Me: "Now here's the thing. People who have used such other languages and systems seem to be able to move on."

            "Yes, and they seem to have moved on to C and C++...."

            Like all matter gets sucked into a black hole.

            Lock in is the second-worst problem in the industry after security – which C and C++ are also very bad at. It is not a good thing and the state of programming is not in good shape.

      2. Ian Joyner Bronze badge

        Re: C++ – never classy

        Someone Else: "Modules"

        Modules? Isn't that what different files did with separate compilation, or maybe namespaces?

        In another post you said just don't use things if you don't like them and you don't need to know all the history.

        If modules really replace #define then you need to know you should use modules instead and why you should not use #define. It is all history, and you have to know that about C++ – it has this thing, but don't use it.

      3. Ian Joyner Bronze badge

        Re: C++ – never classy

        Someone Else: "Well then, I bet you're crying in your beer that C++ has effectively deprecated #defines with modules...."

        Wait a minute modules replace #include, not #define.

        1. Anonymous Coward
          Anonymous Coward

          Re: C++ – never classy

          Now that there is an alternative to #include, there is no reason to use #defines either, as the last use case, guard defines, is rendered redundant. Obvs.
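
          A rough sketch of the comparison (not a single compilable file; module file naming and build flags vary by compiler):

          // Traditional header (widget.h): needs a guard macro to survive repeated #include
          #ifndef WIDGET_H
          #define WIDGET_H
          int widget_count();
          #endif

          // C++ 20 module interface (widget.ixx, .cppm, etc.): no preprocessor guard needed
          export module widget;
          export int widget_count();

          // A consumer simply imports it; there is no textual inclusion, so nothing to guard
          import widget;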

          It seems to me you know very little about the object of your vitriol.

          1. Ian Joyner Bronze badge

            Re: C++ – never classy

            "It seems to me you know very little about the object of your vitriol.". This is the typical response to try to make legitimate problems of C++ as vitriol on the part of those who raise them,

            "Now there is an alternative to #include, there is no reason to use #defines either as the last use case, guard defines, are rendered redundant. Obvs."

            No, they are orthogonal. You confused #include with #define in the first place.

            1. Anonymous Coward
              Anonymous Coward

              Re: C++ – never classy

              Funny, every other post here you're arguing C++ is non-orthogonal (whatever that is, who gives a toss). And you've confused me with another poster.

              1. jake Silver badge

                Re: C++ – never classy

                To be perfectly fair to Mr. Joyner, all you ACs look pretty much alike.

  7. Ian Joyner Bronze badge

    Coroutines?

    "Coroutines, functions that “can suspend execution to be resumed later,” used for asynchronous programming."

    Simula 67 had coroutines. Yes, back in 1967. More than 50 years for C++ to catch up. Why did Stroustrup try to take the ideas of Simula and apply them to C?
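
    For reference, a minimal hand-rolled sketch of what the article's description means in C++ 20 terms - the generator type and its names are illustrative only, and it needs a C++ 20 compiler (e.g. g++ -std=c++20, plus -fcoroutines on GCC 10):

    #include <coroutine>
    #include <exception>
    #include <iostream>

    // A minimal generator: just enough plumbing to drive a coroutine that yields ints.
    struct IntGenerator {
        struct promise_type {
            int current = 0;
            IntGenerator get_return_object() {
                return IntGenerator{std::coroutine_handle<promise_type>::from_promise(*this)};
            }
            std::suspend_always initial_suspend() { return {}; }
            std::suspend_always final_suspend() noexcept { return {}; }
            std::suspend_always yield_value(int value) { current = value; return {}; }
            void return_void() {}
            void unhandled_exception() { std::terminate(); }
        };

        explicit IntGenerator(std::coroutine_handle<promise_type> h) : handle(h) {}
        IntGenerator(IntGenerator&& other) noexcept : handle(other.handle) { other.handle = nullptr; }
        IntGenerator(const IntGenerator&) = delete;
        ~IntGenerator() { if (handle) handle.destroy(); }

        bool next() { handle.resume(); return !handle.done(); }
        int value() const { return handle.promise().current; }

        std::coroutine_handle<promise_type> handle;
    };

    // The coroutine: execution suspends at each co_yield and resumes on the next call to next().
    IntGenerator count_to(int limit) {
        for (int i = 1; i <= limit; ++i)
            co_yield i;
    }

    int main() {
        auto gen = count_to(3);
        while (gen.next())
            std::cout << gen.value() << '\n';   // prints 1, 2, 3
    }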

    1. Anonymous Coward
      Anonymous Coward

      Re: Coroutines?

      Good luck doing systems or device driver programming with Simula. The whole point of C++ was an OO systems programming language leveraging the advantages of C and providing alternatives for the bits that weren't so good and allowing already written C libraries to be included in a C++ program.

      1. Ian Joyner Bronze badge

        Re: Coroutines?

        "C++ was an OO systems programming language leveraging the advantages of C and providing alternatives"

        Most programming is NOT systems programming. The fact that systems programmers think all other programmers should be using C and C++ indicates the systems programmers have NOT done their job (they probably don't even understand that is their job).

        Simula (like any language) can very simply be extended for system facilities – system facilities are actually very small and they should be limited to small areas of an OS and device drivers.

        C and C++ are completely the wrong approach.

  8. Stjalodbaer

    Object

    Like most programming languages, it has no formal logical or mathematical basis and is just a bundle of nostrums, as it seemed good in someone’s judgement. The lack of definition leads to uncontrolled growth.

    1. Ian Joyner Bronze badge

      Re: Object

      OO does have some reasonable definitions. Particularly Eiffel has a better basis:

      OOSC (Eiffel)

      For a more detailed look at software development for professionals; at the least, the first section will explain what we should be trying to do (which is programming, not coding).

      https://archive.eiffel.com/doc/manuals/technology/oosc/page.html

      Cardelli and Abadi have done a more mathematical basis for OO:

      http://lucacardelli.name/Papers/PrimObjSemLICS.A4.pdf

      https://dl.acm.org/doi/book/10.5555/547964

      Of course relational databases have a stronger mathematical foundation, that of relational algebra and calculus, but with OO you can build your own algebras in a class. So OO is more general than relational.

      Christopher Strachey was trying to give language development a more formal approach. He developed CPL, of which BCPL (Martin Richards) was a temporary cut-down version; BCPL in turn was adopted as the basis of C.

      https://www.cs.cmu.edu/~crary/819-f09/Strachey67.pdf

      However, I see little hope for C++.

    2. Dan 55 Silver badge

      Re: Object

      On the other hand I saw Z in uni. And never again.

      1. Ian Joyner Bronze badge

        Re: Object

        I agree you don't want to program in Z. However, there is a lot to be learnt from it. I met Jean-Raymond Abrial when he was a guest of Bertrand Meyer at an OO conference. He was Meyer's teacher, and much of Z and its ideas made it into Eiffel, particularly Design by Contract.

        Contracts still won't be in C++ until 2023, maybe. A generation of programmers has died waiting, only for C++ to get it wrong (again).

    3. RobLang

      Re: Object

      You can get uncontrolled growth even with defined axioms. Mathematics isn't a static field of study. Programming languages exist for their utility, not for intellectual genitalia waving – which is why I don't code in Z.

      1. jake Silver badge

        Re: Object

        You don't code in Z for one very important reason: Nobody else does.

      2. Ian Joyner Bronze badge

        Re: Object

        "not for intellectual genitalia waving - which is why I don't code in Z"

        Sorry, but programming is an intellectual activity. We reason about and model the real world. So I think that is a silly comment.

        1. find users who cut cat tail

          Re: Object

          Design of algorithms, data structures, numerical methods, … – intellectual activities.

          Programming – making things that do something. This goal can coexist with others, but you are always building a thing that does something.

          1. Ian Joyner Bronze badge

            Re: Object

            "Programming – making things that do something. This goal can coexist with others, but you are always building a thing that does something."

            And in computing that is an intellectual activity. Oh, I get it – C and C++ encourage this kind of anti-intellectual mentality.

    4. Someone Else Silver badge

      @Stjalodbaer-- Re: Object

      Like most programming languages, it has no formal logical or mathematical basis and is just a bundle of nostrums, as it seemed good in someone’s judgement.

      So a language that can be formally defined by BNF "has no formal logical or mathematical basis", and is "just a bundle of nostrums"?

      Wow! Just plain Wow...

      So what constitutes a "formal logical or mathematical basis", in your nsHO?

      1. Ian Joyner Bronze badge

        Re: @Stjalodbaer-- Object

        @Stjalodbaer-- Re: Object

        "Like most programming languages, it has no formal logical or mathematical basis and is just a bundle of nostrums, as it seemed good in someone’s judgement."

        Someone Else : "So a language that can be formally defined by BNF " has no formal logical or mathematical basis", and is "just a bundle of nostrums"?"

        Wow! Just plain Wow...

        So what constitutes a "formal logical or mathematical basis", in your nsHO?

        BNF is only for syntax. A language should also be semantically rigorously defined. The problem with C++ is that it lacked this at the beginning. Strachey (the designer of CPL, the forerunner of C) did work on denotational semantics.

        And insofar as you can shoehorn C++ into BNF, that does not actually prove a formal syntax; that was retrofitted as well.

        http://trevorjim.com/c-and-cplusplus-are-not-context-free/

        Why talk about these language lawyer and formal things? Because if you get them right the languages are simpler for everyday programmers. When they are wrong you get all sorts of problems and headaches. Complexity is a pain and C++ inflicts a lot of that.

        1. Someone Else Silver badge

          Re: @Stjalodbaer-- Object

          I didn't ask you. (Unless, of course, your other handle is 'Stjalodbaer'....)

          1. jake Silver badge

            Re: @Stjalodbaer-- Object

            "I didn't ask you."

            Then take it private. This is a public forum.

  9. Ian Joyner Bronze badge

    Contracts

    "Contracts, a feature once slated for C++ 20, has been moved out of this release and into a study group, as they were not ready."

    Still not there. As with multiple inheritance and generics, C++ is copying contracts from Eiffel. Eiffel has had contracts for 30 years. Eiffel did most things right over 30 years ago; C++ is still trying to do, badly, most of the things Eiffel did cleanly in the first place. Eiffel took a more integrated approach to the whole software development lifecycle. (A sketch of the stopgap C++ programmers use in the meantime follows below the link.)

    The clean and constrained generics of Eiffel became the horror of templates in C++.

    https://www.eiffel.com/values/design-by-contract/introduction/
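
    In the meantime the usual stopgap is a plain assertion; a minimal sketch below, assuming nothing beyond <cassert> (the square_root function is an invented example, and this is not the proposed contract syntax):

      #include <cassert>
      #include <cmath>

      double square_root(double x) {
          assert(x >= 0.0);                                  // precondition: caller must supply a non-negative value
          double r = std::sqrt(x);
          assert(std::abs(r * r - x) < 1e-9 * (x + 1.0));    // postcondition, checked approximately
          return r;
      }

    Eiffel's require/ensure clauses do the same job but are part of the interface and report which side broke the contract; assert() gives neither the documentation value nor the blame assignment.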

    1. Someone Else Silver badge

      Re: Contracts

      Seems like the ISO committee doesn't want to go off half cocked and turn C++ into another Eiffel. Forethought beats willy-waving any day of the week.

  10. Ian Joyner Bronze badge

    Too large and complex

    "Is C++ becoming too large and complex?

    It is an intimidating language"

    It passed that a long time ago. Programming is based on a few simple notions. Those who think it must be complex and build complex languages are going down the wrong path.

    The two languages that are both good languages and define the extremes of OO are Smalltalk and Eiffel: Smalltalk shows how simple you can get with the notion of messaging, while Eiffel is sophisticated and implements everything cleanly from the ground up, whereas C++ kludges everything in.

    "Stroustrup said that C++ 20 is “the best approximation of C++ ideals so far”, and that C++ 23 will be better still."

    The ideal of C++ has always been just to perpetuate C++. The ideals of programming and OO are already there in Smalltalk and Eiffel and people should look at those languages to understand programming and OO.

    1. Doctor Syntax Silver badge

      Re: Too large and complex

      There's a huge body of software written in C and its derivatives (including Java). How do Smalltalk and Eiffel compare?

      1. Ian Joyner Bronze badge

        Re: Too large and complex

        "There a huge body of software written in C and its derivatives (including Java)."

        Popularity is not a measure of quality. Neither is the misguided belief in the industry that C and derivatives are good and all else bad.

  11. Ian Joyner Bronze badge

    Evolving

    A fundamental problem with C++ is that practitioners need to understand the history and evolution of C++. I have two books here on my desk, "The Evolution of C++" edited by Jim Waldo, and Stroustrup's own "The Design and Evolution of C++". While it is interesting to know the background and thinking behind any language, C++ is the one language that demands understanding the evolution – why this or that construct is there, what you should use and what you should avoid.

    Different people have different interpretations, likes and dislikes. Different shops will develop their own style rules based on their understandings of C++.

    C++ as an evolving language is not a good thing. It is an excuse that C++ still has not got things right – things that were got right 50 years ago in Smalltalk and 30 years ago in Eiffel and other languages, that C and C++ people continually dismiss (and not very nicely).

    C++ is like going back 100,000 years and saying there are modern humans, but let's go back a million years and evolve from there. It is time the computing industry lost patience with the approach of C++.

    New programmers should absolutely not be taught C++ (or even C, or Java). The fundamentals of programming and computing should be taught with clean languages – and clean languages with semantic checks are not just beginner's languages with 'training wheels'. Semantic checks are a most advanced aspect of computing, not just to be dismissed as training wheels. Also simple languages can be used for all levels of programming.

    Programming can use the simplest of tools to build the most complex of systems. C++ is a tool that thinks it must have the complexity built into the tool. This is fundamentally wrong.

    1. timrowledge

      Re: Evolving

      Agree with almost everything you’ve written in this thread.

      C++ was a bad idea and too complex right from the start. All it ever did was let lazy C users write a file with a cpp extension and claim “hey, I’m programming objects!”

      It would be nice if any ‘new’ language had actually learned from Smalltalk, which is *still*, after 50 years, the only language good enough to be worth critiquing.

      And finally

      “New programmers should absolutely not be taught C++ “

      I’d argue that new programmers should be taught documenting before any sort of programming language! Along with actually reading in order to understand specifications and requirements.

      1. Ian Joyner Bronze badge

        Re: Evolving

        Timrowledge:

        Yes, I quite agree.

        "C users write a file with a cpp extension and claim “hey, I’m programming objects!”"

        This is a very good observation, and no less a figure than David Parnas said it somewhere as "the worst thing about OO is that people think they are doing good programming just by using an OO language" – maybe not in those exact words. Anyway, he was making the point that it is the clean and enforced design of APIs that is important, that is, defining all possible interactions of objects.

        With C++ you have pointers which can subvert the published APIs of objects.
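
        A minimal sketch of the kind of subversion meant here (the Account class is invented for illustration; the byte-level poke is implementation-dependent behaviour, shown only to make the point):

          #include <cstring>
          #include <iostream>

          class Account {
          public:
              void deposit(double amount) { if (amount > 0) balance += amount; }
              double get_balance() const { return balance; }
          private:
              double balance = 0.0;
          };

          int main() {
              Account a;
              a.deposit(10.0);

              // Poke the private member directly through a byte pointer, ignoring the API.
              double forged = 1000000.0;
              std::memcpy(reinterpret_cast<char*>(&a), &forged, sizeof forged);

              std::cout << a.get_balance() << '\n';   // prints 1e+06, although deposit() only ever added 10
          }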

        "I’d argue that new programmers should be taught documenting before any sort of programming language!"

        Well, a programming language is actually a documentation language, but what makes it better is that a programming language is a living document. That is one of the great things about programming.

  12. Ian Joyner Bronze badge

    C++ fails by its own measures

    "Good C++ code should be easy to understand, according to Stroustrup. “One of the measures of good code is that I can understand it,” he said."

    1. hammarbtyp

      Re: C++ fails by its own measures

      "I have always wished for my computer to be as easy to use as my telephone; my wish has come true because I can no longer figure out how to use my telephone".

      Stroustrup 1990

  13. cantankerous swineherd

    I'm wondering what the difference between a module and a library is.

    1. Dan 55 Silver badge

      You don't need header files, which means you don't need to include the same file multiple times from different places in the code, you don't have to worry about include order, and you don't need to mess around with guard defines to make headers safe to include more than once.
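
      Roughly: a library is the compiled code you link against, a module is how the compiler learns the names in it. A minimal sketch of the before and after, assuming a C++20 compiler with modules support (file and module names are invented, and the three "files" are shown in one listing):

        // --- the old way: a header that must guard against multiple inclusion ---
        // math_utils.h
        #ifndef MATH_UTILS_H
        #define MATH_UTILS_H
        int add(int a, int b);
        #endif

        // --- the C++20 way: a module interface, no guards, no include-order worries ---
        // math_utils.cppm
        export module math_utils;
        export int add(int a, int b) { return a + b; }

        // consumer.cpp
        import math_utils;                  // imported once, order does not matter
        int main() { return add(2, 3); }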

      1. Someone Else Silver badge

        Oh, that we have that in Python....

  14. Sceptic Tank Silver badge
    Thumb Up

    I ♥ C++

    I just had to say that.

    Thank you.

    1. Anonymous Coward
      Happy

      Re: I ♥ C++

      And, in a very complex way, it ♥ you too.

  15. Lee D Silver badge

    God, I'm glad I don't go near C++, it's increasingly getting to become such a nightmare that I don't even understand some of the new features at all. That "concepts" thing just sounds like utter nonsense to me.

    And though the "three-way-comparison" thing sound good at first, I just see a nightmare of overloaded operators in its future.

    All the C++ code I see is generally nothing more than overly complex C code shoved into some paradigm that makes little sense in the circumstances and often generates far more problems for maintainability than it does actually resolve any readability or "object" resolution issues.

    I have literally abandoned contributing to some OS projects when they went from C to C++ for no fathomable reason beyond "Hey, I know C++, so you guys all need to learn it too", and then littered the code with tons of unnecessary furniture while leaving 99% of the actual logic exactly as it always was, but with horrendously complex access to simple variables.

    I get it, I really do, I remember reading my first books on C++ many, many years ago and it all clicked and I thought that it was amazing and wonderful and so full of potential. And everything I've seen since was a disaster of unreadable code with unpredictable consequences. "C with classes", I would pay you money for. C++ I wouldn't give you tuppence for. And as it evolves it just seems to get worse and worse, and that's when the compilers fully support all the new stuff (which they never do).

    There was a time when you could pick up a programming language, learn it in its entirety in a few weeks at absolute worst (for a competent programmer), and then be able to read and understand every bit of code ever written in that language - if not in intent and programmer choice, then at least in syntax and outcome and general gist.

    I find that C++ is just a cryptic cipher.

    And that's from someone who's modified their own Linux kernel code (not just "applied a patch" but actually written code and fixed bugs that nobody else had on obscure hardware), ported others' applications between OSes and architectures, written their own programs, etc.

    Gimme C99 and a decent gcc any day to all this nonsense. No wonder people are jumping on Go and Rust and everything else (though they are increasingly starting to follow the "just shove the feature in and worry about how people are supposed to read it later" philosophy too).

    1. Someone Else Silver badge

      God, I'm glad I don't go near C++, it's increasingly getting to become such a nightmare that I don't even understand some of the new features at all.

      Your peers are glad you don't go near C++, too. For the same reason.

    2. Nick Ryan Silver badge

      I'm glad that I'm not the only one who looks at the code (new paradigms) and just sees unintelligible extended character set spaghetti. It's not necessary. Seriously. Code should be legible, clear and not obscured in as many arcane ways as possible.

      I am more than capable of writing assembler code, and have done for many years, but seeing unintelligible code in any language just makes me sigh. Code reuse is good, compiler hinting is good, but obfuscation, even if it's excused as "you should know every arcane, illogical and backwards obvious operator from the last 30 years", is not good at all.

  16. Ross 12

    Some of these features sound like it's taking ideas from Rust

    1. Ian Joyner Bronze badge

      "Some of these features sound like it's taking ideas from Rust"

      Most of C++ came from Eiffel – multiple inheritance (C++ got that wrong), templates from generics (C++ got that wrong), contracts – oh wait until 2023 for C++ to get that wrong.

  17. _LC_
    Stop

    C++ seems to generate a lot of hate among the people who failed to lean it properly

    Whenever there is something new to be reported about C++ the forums are full of incompetent hate posts.

    This reminds me a lot of people who cannot swim talking to me about the dangers of bodies of water...

    1. _LC_

      Re: C++ seems to generate a lot of hate among the people who failed to lean it properly

      "learn" that is, because I know they will jump on typos like those. :-(

    2. Ian Joyner Bronze badge

      Re: C++ seems to generate a lot of hate among the people who failed to lean it properly

      Don't dismiss exposure of what a terrible language C++ is as 'hate posts'. Pointing out that C++ is a terrible language has nothing to do with hate, because it simply is one.

      "This reminds me a lot of people, who cannot swim talking to me about the dangers of water bodies..."

      No, that analogy does not apply to programming languages. The warnings about problems in languages are technically based and have technical merit – and C++ gives many opportunities for criticism.

      Those criticisms are most likely from people who understand programming much better than you do, like an Olympic gold medal swimmer saying that swimming over Niagara Falls will most likely end in death. Now that is a good analogy.

    3. Nick Ryan Silver badge

      Re: C++ seems to generate a lot of hate among the people who failed to lean it properly

      The same is true of PHP. PHP is a very flawed language, with lots of ridiculous language constructs that appear to have been borne out of the designer hearing about a feature in another language but not understanding enough to implement it properly. On the other hand, its flexibility is why it's so useful at times and the same goes for C and C++.

      The more capable, and often the more flexible, a language is the more unintelligible the code can be made. Generally, code should never be written this way and should always be as clear as possible even disregarding "clever" language constructs. In the end, clarity wins.

  18. TripodBrandy

    Backwards compatibility

    > He has advocated starting with the “simple and elegant features” and not worrying about parts of the language which may be there only for backward compatibility

    Why not define which features in the language spec are there for backwards compatibility only, so you can use only the subset of "good" features? Then you could have a compiler switch to disable them, e.g. -std=c++20only, which would fail to compile programs that use the old/"bad" features.

    1. nintendoeats

      Re: Backwards compatibility

      I am currently learning C++. I have wished for this. For example, I only just learned why Visual Studio keeps trying to replace #define with constexpr.
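
      For anyone else at the same stage, a minimal sketch of what the IDE is suggesting (the names are invented): the macro is untyped text substitution, while the constexpr constant is typed, scoped and visible to the debugger.

        #include <cstddef>

        #define BUFFER_SIZE_MACRO 1024                    // preprocessor text substitution: no type, no scope

        constexpr std::size_t BufferSize = 1024;          // typed, scoped compile-time constant

        static_assert(BufferSize == BUFFER_SIZE_MACRO,    // both usable at compile time
                      "sizes agree");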

      Though that said, part of my code also needs to compile in C++98. This really highlights one of the strengths of the language; if you write a C++ program that just uses POSIX and the STL, you sure do have a lot of valid build-targets out there.

  19. poohbear

    Please keep it away from flying machines.

  20. Anonymous Coward
    Anonymous Coward

    The old problem of the programmer periscope view of the world..

    Pretty much all the comments above seem to fall into the common fallacy: my professional experience is writing a very specific, very narrow type X of software, therefore my opinion is universally applicable. Well, because.

    Well, my opinion is based on shipping mass-market, high-volume commercial software in an exceptionally wide range of market segments for many decades. So I am only interested in low-failure, high-performance, highly maintainable codebases.

    If you want fast and simple it's C. Because C is generic Macro Assembler. At least well-written C is. The stuff that won't fail in the field. If you are not happy writing Macro Assembler for a wide range of processors you will never be comfortable writing C.

    C++ code will only be stable and maintainable if the codebase (minus the container classes) can be compiled by CFront. All the other stuff is not only not needed but has no complete grammar or fully formalized specification, so you will have compiler bugs. Unless you have not only isolated but fixed a compiler bug in your codebase you are not really a C++ programmer. Found and fixed my first C++ compiler bug in 1989, most recent one last year. If you don't know about compiler bugs you are not a serious C++ programmer.

    If you have learned OOP in a proper OO language like Smalltalk or Java or even CLOS Lisp you can write OO software in any language. I've been writing and shipping C-- (C with OO architecture) for decades. When needed. It is very small and very fast. You can write complete, stable and shippable codebases in very disciplined CFront C++. Not one single bell or whistle added to the C++ spec in the last 30 years has made writing clean stable OO code any simpler. But they have made writing complex, convoluted and unreadable codebases a lot easier. With some wonderfully obscure compiler bugs.

    The only advantage to the use of all the C++ language trivia is that it makes identifying marginal and incomplete programmers a lot easier. If you inherit a buggy codebase with C++98/C++13 etc. "features", you know to just throw it away. Because it is faster to rewrite something that will work than to spend many times longer trying to fix code that will never work properly. Courtesy of all those language "gadgets".

    So for programmers who don't have to ship commercial software that works in high-user-volume, high-resource-stress environments, you are welcome to yet more language trivia which will give you hours of idle enjoyment. To those of us who have to ship stable software it's just one more layer of crap that has to be ripped out sooner or later. Because we know how those language "gadgets" work. What the compiler is actually trying to do. On the bare iron. And sometimes fails. Which is why we don't use it. Why add another layer of potential failure to a codebase when it can be easily avoided.

    1. Dan 55 Silver badge

      Re: The old problem of the programmer periscope view of the world..

      If you have learned OOP in a proper OO language like Smalltalk or Java

      I stopped there. Multiple inheritance is too scary I take it? Fine, but then don't talk about proper OOP.

      1. Anonymous Coward
        Anonymous Coward

        Re: The old problem of the programmer periscope view of the world..

        > Multiple inheritance is too scary ...

        One. Do you actually know how MI is implemented by a compiler? If so you would never ever use it in code you want to ship commercially. And yes, I actually have written commercial compilers. That shipped. For OOL's. So I actually do know how it works. It makes you very wary and very careful.

        Two. I have never ever seen any implementation scenario in well-architected *commercial* code bases in many decades of professional programming where MI was not a completely f*cking stupid idea. Rip it out and use one of the several very robust alternatives. Which have a much lower run-time overhead. Again, if you knew how the compilers worked you would have known this. MI is the type of stuff written by inexperienced amateurs.

        So I suggest you go back to your academic textbook MI examples. Because that is the only place you will ever see them. Because in real-world codebases the only time you will see MI-style theoretical "OO" academic stupidities is in unstable, bug-ridden codebases where the only practical solution is to throw it out and start again. Because if they don't, the product eventually dies. Seen that happen in real life on multiple occasions. Starting with the great C++ STL 1.0 templates fiascos in the mid 90's. That either killed or caused the rewrite of a lot of commercial codebases. It took them over a decade to get compilers to compile templates stably in a consistent fashion and even today it is still very easy to get junk results as the end product. Disassembled much .exe output in your professional career? Quite a revelation.

        So how did you learn your "real" OOP skills? From some TA who never wrote a line of shipped code? Because in my experience those are usually the only people who think stuff like MI is either important or relevant. Those of us in the trenches shipping product most certainly don't. We have product to ship and most academic OO bright ideas are just utterly useless shite. Purely for the amusement of the language anoraks.

        1. Dan 55 Silver badge

          Re: The old problem of the programmer periscope view of the world..

          Er, no, my experience of MI is in the commercial world. And it compiled and ran and passed QA and was accepted by the customer and everything.

          1. Someone Else Silver badge

            Re: The old problem of the programmer periscope view of the world..

            In other words, Mr. Holier-than-thou AC(oward), Dan 55 has written working code. In C++. Multiple times1.

            You might want to look at this to get some perspective..

            1Well, to be fair, I do not know this for sure, but given the number of Dan 55's posts I've read over the years, I believe this is a very good assumption. I'm sure Dan 55 will correct me if I'm wrong....

        2. Ian Joyner Bronze badge

          Re: The old problem of the programmer periscope view of the world..

          "> Multiple inheritence is too scary ...

          One. Do you actually know how M"

          I'm not sure who 'Anonymous Coward' is addressing here. However, the fundamental thing wrong in this answer is the ad hominem attack on 'academia'. This is a problem in C-world thinking: 'oh, we deal in the real world, those academics only give small examples'.

          That is because academics see a problem and boil it down to a small example that demonstrates the problem, so it is easily understood. They should not feel that they need to apologise for that. In fact, academics have considered programming in the wide and wild, far wider than most real-world practitioners. So I am very against this faulty C thinking that 'oh things are very nice in academia, but we know what really goes on'. That is garbage thinking.

          "So I suggest you go back to your academic textbook MI examples. Because that is the only place you will ever see them."

          Textbooks also boil it down to small examples. Orthogonality means that many elements can be combined. Non-orthogonality means things combine in surprising ways – I don't mean with pleasant outcomes, quite the contrary. Now these 'real-world' practitioners might think that non-orthogonality is a necessity in their 'real world', but it is not.

          In fact, saying there is this difference between 'real world' and academia is wrong. Programming is not in the real world – it is in the virtual world of electronic circuits. Virtual worlds are what software is about. Software does not deal with physical things – that is why it is really powerful and easy to change and flexible.

          Those 'real world' difficulties and complexity are because someone did not understand this along the way, did not guard against non-orthogonality.

          "So how did you learn your "real" OOP skills? From some TA who never wrote a line of shipped code?"

          Again I don't know who 'AC' is addressing the remarks to, but this again is an example of the false attitude. As an example and general refutation: my first exposure to this world was being taught Simula by a professor who had worked in the UK with some of the great minds in computing (he did not say it at the time, but I have found out since). But the term OO was not mentioned. I then became one of the first adopters of OO in this country, have worked on languages and compilers and many large software projects, and have studied OO deeply.

          Alan Kay noted that when he coined the term 'OO' he did not have C++ in mind. So accusing others who criticise C++ of just being taught by some 'TA' is nonsense.

          "Because in my experience those are usually the only people who think stuff like MI is either important or relevant. Those of us in the the trenches shipping product most certainly don't."

          You don't understand MI, or have seen it badly applied (it is easy to abuse MI). But MI allows you to break a system down into even smaller classes and then to recombine those abstract classes in very flexible ways. That means a concept is not buried in some larger class whose code you then have to repeat somewhere else. (A small sketch follows below.)

          As for 'trenches' – again, that attitude coming up because you are working within badly designed systems.
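
          To make that concrete, a minimal sketch of recombining small abstract classes (Printable, Serializable and Order are invented for illustration, not a recommendation for any particular codebase):

            #include <iostream>
            #include <string>

            struct Printable {
                virtual std::string describe() const = 0;
                virtual ~Printable() = default;
            };

            struct Serializable {
                virtual std::string to_json() const = 0;
                virtual ~Serializable() = default;
            };

            // One concrete class combines two independent interfaces.
            class Order : public Printable, public Serializable {
            public:
                std::string describe() const override { return "Order #42"; }
                std::string to_json() const override  { return "{\"id\": 42}"; }
            };

            int main() {
                Order o;
                const Printable& p = o;      // usable through either interface
                const Serializable& s = o;
                std::cout << p.describe() << ' ' << s.to_json() << '\n';
            }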

          1. Someone Else Silver badge

            Re: The old problem of the programmer periscope view of the world..

            "Look! Actual code!"

    2. Lee D Silver badge

      Re: The old problem of the programmer periscope view of the world..

      I love the way that you destroy your own argument. So as long as you're the kind of programmer who wants to spend their time finding and fixing obscure compiler bugs, not to mention stripping out all mention of anything in the standards newer than 22 years old, and who considers anyone who doesn't do the same to be a twat whose code should be thrown away, then C++ is a great language.

      But "my professional experience is writing a very specific very narrow X type of software therefor my opinion is universally applicable".

      You - and the guy above with a similar concern - miss one point.

      At the end of the day, it's a language. Languages have to be understandable. Computer languages, especially, need to be understandable and unambiguous. When people start using niche features in ways that modify the expected outcomes of simple statements drastically, understandability goes out the window, and you end up with a billion "accents", but not just ones where certain words are inferrable from context or where you could compensate via the grammar for not understanding a particular word.

      No. Accents that totally and radically modify the meaning of the code and are unfathomable, untranslatable, and yet perfectly valid code.

      It's like saying that English and Italian are the same language because we have some Latin roots in both.

      The irony being that all mainstream programming languages are written in English (in terms of keywords etc.) and yet C++ appears to be written in gibberish. You literally stand more chance of understanding a random coder plucked from anywhere in the world, speaking whatever language they speak, than you do their C++ code.

      And the "fix" from the expert that tells you not to program in small isolated niches is "throw all that junk out and program in a version of the language that's 20-years-old, because that's the last time it worked properly - apart from all those compiler bugs that I have to fix myself".

      Now I'm a great fan of C99, I consider it "definitive" in terms of C coding, and coding style documents for all kinds of projects tend to agree with me (not to mention very specific compiler switches). But when we have to stick with something 20+ years old to make a language serviceable, yet developers are taught and grow up with features like this that are in all mainstream compilers and expect them to be present, and people continue pushing more junk into it, it's time to call it something else and move on.

      Quite literally: "C18 addressed defects in C11 without introducing new language features" and C11 only added things that weren't present in the core language (e.g. multithreading primitives), standardising a few de-facto headers (e.g. complex number types), and removing stuff that was dangerous (gets). 19 years of evolution consisted of the kinds of things that didn't affect any existing code whatsoever.

      I've yet to see C18/C11 code in the wild in any significant capacity. Hell, there's more talk of Rust in the Linux kernel than in C18/C11.

      19 years of C++ just turned it into a further abomination from whence it started (shortly after standardisations started using years).

      Call it something else, so you can at least differentiate between "C++" and "C++ with new knobs on every year". Then you can see if people are actually using those features rather than just saying they're coding "C++" and not specifying a version.

      Seriously, a language that's only viable when you ignore 90% of the language development of the last 20 years isn't one you want to support, maintain, compile or analyse (I think you'd struggle to even write a parser from scratch that could say definitively what version of C++ was required to compile it, without basically implementing a full C++21 compiler).

      And, no. I never, ever, ever, ever want to have to "fix" my compiler against bugs nobody else has run into. And I certainly wouldn't put my professional reputation on the line against such a compiler making my code run as intended. Debugging is often a difficult enough task as it is, without having to consider that the compiler might have hit an untested edge case.

      1. Anonymous Coward
        Anonymous Coward

        Re: The old problem of the programmer periscope view of the world..

        > And, no. I never, ever, ever, ever want to have to "fix" my compiler against bugs nobody else has run into.

        So I would guess you have neither the professional experience nor the technical expertise to identify or fix those very, very obscure application bugs in very, very large codebases for shipping products from major ISVs that defeated their dev teams for years? Most of these bugs, if ever fixed, are fixed inadvertently, as changes in the code produce just enough change in the compiler codegen path to "fix" the bug.

        Most of these compiler bugs usually turned up in porting codebases to other platforms. Or rather my fixes did. The usual scenario was that, in testing cross-platform parts of the codebases on both platforms, I would find a bug in the original codebase and the dev team would say, yeah, that happens every now and then but we could never find out why, even though we had QA on it for a long time. I would go in with a source-level debugger, look at the actual asm being executed and more often than not very quickly find a malformed sequence of machine code. A quick edit of the source, a look at the codegen output to see if it was correct, then a comment in the source code, and back to the task at hand. And another mystery bug that had been crashing people's applications on a semi-regular basis for no obvious reason is fixed.

        On one platform, with the toolsets available at the time, if templates, exceptions or RTTI were at all involved in the C++ source code, and there was some mystery bug that was unreproducible by the dev team or QA but crashed customers' applications on a regular basis, and the source of the bug could not be tracked down in 15 minutes using the usual techniques, then 95%-plus of the time it was a compiler bug. After about 5 years the situation stabilized. But still not great. Then the platform was obsoleted and the replacement toolchain was still buggy after 15 years. Very basic compiler bugs. So bad the vendor eventually abandoned the compiler. But to the people using the compilers there would just be these mystery bugs in their code that were unreproducible and would come and go with different version releases. Sometimes it was just a bug in the application codebase, but quite often it was the compiler generating junk code. The true bastard bugs.

        I have found it a really good test of just how senior a "senior" guy is on a dev team: mentioning compiler bugs. If they have never heard of the idea and find the very concept outlandish then you know they really don't have what it takes. The good guys have either heard of it or are at least intrigued enough to want to know more. Those are the guys I want on my dev teams.

        My job is to get product out the door. I have often inherited code-collapse / code disaster projects. With discipline those products can be turned around and shipped. Part of that is having a very clear-headed view of just what does and does not work. And what can end up wasting huge amounts of time. Keep it simple and events will rarely turn around and bite you in the ass. That's management's task.

        Looking through the rest of what you wrote, all the stuff you find so important is actually irrelevant in my world: architecting, implementing and delivering very big, high-end or high-performance applications that often push the limits of the platform. But then again I have only used C++ as just C with classes. Because in my world speed, performance etc. is paramount. Not some academic purity. Which is all the C++ ANSI committees have added over the decades. If you keep it simple it will work. And other people have a sporting chance of actually understanding what you wrote. And the end user's machine will not crash mysteriously on a semi-regular basis.

        1. Robert Forsyth

          Re: The old problem of the programmer periscope view of the world..

          From my 41 years of limited experience

          Usually when a previous programmer has commented that their hack was to work around a compiler bug, they had not understood what they were trying to do or what the machine could do for them, and the compiler was fine.

          I suppose the compiler should protect the programmer when they use a binary floating point variable as their loop counter and increment by the binary approximation of 0.1.
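
          For anyone who has not been bitten by that one, a minimal sketch of the pitfall: 0.1 has no exact binary representation, so the accumulated counter never compares equal to 1.0.

            #include <cstdio>

            int main() {
                int iterations = 0;
                for (double x = 0.0; x != 1.0 && iterations < 20; x += 0.1)
                    ++iterations;
                // x drifts past 1.0 without ever equalling it, so only the guard stops the loop.
                std::printf("iterations = %d\n", iterations);   // prints 20, not the expected 10
            }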

  21. James 47

    C++ is great

    The best thing about it is that if you don't like it you don't have to use it!

    It pisses me off sometimes, but I still like it.

    1. Ian Joyner Bronze badge

      Re: C++ is great

      "The best thing about it is that if you don't like it you don't have to use it!"

      It is much more than that. You need to learn what to use and not use and that means understanding why it was there, or still is there. More than any other language you have to understand the history of C++.

      C++ is non-orthogonal which means you might end up using a feature in a way you didn't anticipate.

      https://dl.acm.org/doi/pdf/10.5555/1352383.1352420

      1. James 47

        Re: C++ is great

        I'm not going to read that, and I don't know what non-orthogonal means or why it's a problem, but I do know that I've been successfully shipping products with C++ for many years now.

        1. Ian Joyner Bronze badge

          Re: C++ is great

          The problem in computing is that programming is really powerful and you can do anything in just about any language, no matter how bad it is. BASIC programmers used to extol how good BASIC was and would not move on.

          You must learn what orthogonality is – it will greatly help build better software.

          Basically orthogonality means the inclusion of features that are independent of one another and that don't interact in bad ways. It is the independence of 90º. Non-orthogonality means things interact in surprising and unpleasant ways. Complex languages are more likely to suffer from non-orthogonality. This makes it difficult to predict what will happen.

          http://searchstorage.techtarget.com/definition/orthogonal

          Orthogonality is fundamental to good software design. Mind you, I do sympathise with you: because of all the pretentious academic-speak around 'orthogonality' (the kind I hate, and usually its user does not know what it means either), it took me years to understand it – but the definition is simplicity itself (see what I did there!).

      2. Someone Else Silver badge

        Re: C++ is great

        It is much more than that. You need to learn what to use and not use and that means understanding why it was there, or still is there.

        No, you miss the point (and now I finally understand why you whine so much about C++ and revere academic oddities like Eiffel). You don't need to learn what to use, based on some abstruse history lesson. Rather, you need to learn when to use a feature, and by induction, when not to use it.

        It's a different mindset, and one that may not be available to you.

        1. Ian Joyner Bronze badge

          Re: C++ is great

          Someone Else (yet another cowardly anonymous poster)

          >>It is much more than that. You need to learn what to use and not use and that means understanding why it was there, or still is there.<<

          "No, you miss the point (and now I finally understand why you whine so much about C++ and revere academic oddities like Eiffel). You don't need to learn what to use, based on some abstruse history lesson. Rather, you need to learn when to use a feature, and by induction, when not to use it."

          It is you who miss the point. C and C++ are self-obsessed languages. You absolutely have to know what there is and why it might be marked deprecated or something softer. To understand C++, you must understand that.

          "revere academic oddities like Eiffel" That comment is no misinformed, it does not even deserve a response.

          "It's a different mindset, and one that may not be available to you."

          No, it is a mindset I consciously reject because it is a wrong mindset – one stuck in the mistakes of the past. One that is locked in and unwilling to see that there have been other much better things.

          1. Someone Else Silver badge

            Re: C++ is great

            It is you who miss the point. C and C++ are self-obsessed languages. You absolutely have to know what there is and why it might be marked deprecated or something softer. To understand C++, you must understand that.

            Complete garbage. I believe I have a firm understanding of C++ (I've been using it successfully and commercially since '92, so were I to not have a firm understanding of it, I doubt that I would have been commercially successful with it). Yes, I do have to know the syntax of the language to use it, as would you of Eiffel, or whatever your chosen expression of perfection might be. But understanding the history of the changes? Bah. I just have to know how to use the language to achieve my goals. And I really don't give a flying fig about how it got there. YMMV, of course, but this way, I get my job done in a timely fashion, and my blood pressure remains manageable.

            Oh, and "self-obsessed"? Puh-LEEZE! You have a lot of damn gall to project your personal opinion on the entirety of the ANSI and ISO standards committees. Methinks you overreach, my friend.

            1. Ian Joyner Bronze badge

              Re: C++ is great

              So I said "It is you who miss the point. C and C++ are self-obsessed languages. You absolutely have to know what there is and why it might be marked deprecated or something softer. To understand C++, you must understand that."

              Someone Else (SE) responds: "Complete garbage. I believe I have a firm understanding of C++ (I've been using it successfully and commercially since '92,"

              That proves my point: SE has been following this for nearly 30 years – about as long as it has been since I was assigned to a large X.500 C++ project. People who have been involved with the language for that long don't realise how much of its history and changes they have absorbed along the way, but for someone new to the language, trying to understand all of that is difficult. That is why I said C and C++ are 'self-obsessed' languages.

              "But understanding the history of the changes? Bah" you just don't get it. You have been doing it for 30 years. Perhaps you understand not to use pointers and rather references or smart pointers or whatever. It is not just a case of syntax (although C++ is peculiarly convoluted and ugly). To understand C++ and why it is the way it is you need to read the history and evolution books, more so than in any other language. And that is for regular programmers, when it should really only be for the language lawyers. In other words C++ exposes all that when other languages abstracts that working away.

              "Oh, and "self-obsessed"? Puh-LEEZE! You have a lot of damn gall to project your personal opinion on the entirety of the ANSI and ISO standards committees. Methinks you overreach, my friend."

              OK, I'll address that again. Complexity becomes self-obsessed. Complex technologies create lock-in. Note how obsessed C and C++ people are, always leaping to the defence whenever anyone criticises them. There is an obsession around C and C++ that is almost cult-like – say anything is wrong and they abuse those who point it out.

              I've been involved in ISO myself and there is a lot of posturing that goes on. The observations are not overreach. But there are probably some good minds trying to fix C++ and get it right. But it still is not in that position in 2020 and they still have things left out to fix in 2023. And those are things that have been around for 30 years. The problem is trying to kludge them into a language that was not suitable in the first place.

  22. Mike 125

    Babel

    Languages.

    Different hardware architectures => assembly.

    Thin layer abstracting most of those differences => C.

    Thinnish layer abstracting common OS features => probably Rust, Go etc., but nothing 'unsafe'.

    Everything above => whatever maps best to the requirement. And that's the hard part.

    But it's not C++.

    As always, horses for courses.

  23. jake Silver badge

    @Ian Joyner

    Quite frankly, you are coming off as a jaded, disillusioned academic.

    In light of that, instead of writing reams about how bad the programming languages in use today (languages producing code that people use productively[0]) are ... would you care to tell us all what language(s) we should be using?

    I, for one, await your answer with bated[1] breath ...

    [0] For massively variable values of "productively" ...

    [1] I'd have used "baited", but I'm not trolling regardless of appearance.

    1. Ian Joyner Bronze badge

      Re: @Ian Joyner

      "Quite frankly, you are coming off as a jaded, disillusioned academic."

      You descend to ad hominem. I just wrote this in another answer: my first exposure to this world was being taught Simula by a professor who had worked in the UK with some of the great minds in computing (he did not say it at the time, but I have found out since). But the term OO was not mentioned. I then became one of the first adopters of OO in this country, have worked on languages and compilers and many large software projects, and have studied OO deeply.

      So you can take your false assessments and you know what to do with them.

      As for giving you (Jake, whoever you are) examples – well you can do your own research. C++ people just bash everything.

      1. jake Silver badge

        Re: @Ian Joyner

        That's not ad-hom, it's an observation based on available evidence.

        Whoever I am, I am not Jake. That's one of a couple other ElReg commentards. Kindly keep them out of it. Ta. (Or does your mythical "perfect" programming language ignore CaSe?)

        So you're not going to allow us into your world, where the best language is obvious? Somehow I'm not surprised.

        Nowhere have I so much as hinted that I am one of the "C++ people", quite the opposite in fact. Kindly retract that heinous jibe.

        1. Ian Joyner Bronze badge

          Re: @Ian Joyner

          Jake: "That's not ad-hom, it's an observation based on available evidence."

          Oh, yes it is, you said:

          "Quite frankly, you are coming off as a jaded, disillusioned academic."

          You used the word 'you' referring to me and then made a remark, which is just wrong. Instead of addressing the subject which is C++, you attacked the person. That is ad hominem.

          Jake: "Whoever I am, I am not Jake."

          So you don't understand (or hopefully you do now if you read my last post) that system programmers should be hiding the details of the platform so that applications programmers don't all have to deal with those details (that is the aim of software at all levels), and yet you want to hide your own identity.

          I can only think you have come here to troll, and this discussion has certainly declined into ranting.

          1. jake Silver badge

            Re: @Ian Joyner

            "You used the word 'you' referring to me and then made a remark"

            The remark was modified by the phrase "coming off as". Don't be disingenuous.

            "I can only think you have come here to troll, and this discussion has certainly declined into ranting."

            Projection is an ugly thing.

            I'm still not Jake. Jake is one of at least two other entities here on ElReg.

            1. Ian Joyner Bronze badge

              Re: @Ian Joyner

              "I'm still not Jake. Jake is one of at least two other entities here on ElReg."

              Well, don't post as Jake.

              1. Anonymous Coward
                Anonymous Coward

                Re: @Ian Joyner

                I'll put you out of your misery. He obviously isn't posting as Jake. He's posting as jake.

  24. swm

    All languages have their good and bad points. Generally people favor languages that they are familiar with:

    LISP - lexical closures, precise garbage collection, can mimic most other languages

    JAVA - type erasure, precise garbage collection

    C - "machine language" with better syntax

    C++ - C + separation of algorithms and containers through iterators

    HASKELL/ML - precise garbage collection, list comprehension

    Your Favorite Language - stuff, more stuff

    What C++ needs is a precise garbage collector, although careful coding using values instead of references can eliminate most memory problems.
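
    A minimal sketch of the "values instead of references" point (the function is invented for illustration): types that own their storage clean up deterministically, so most code needs neither a garbage collector nor a manual delete.

      #include <string>
      #include <vector>

      std::vector<std::string> make_names() {
          std::vector<std::string> names{"alice", "bob"};
          names.push_back("carol");
          return names;              // moved out to the caller; nothing to free by hand
      }

      int main() {
          auto names = make_names(); // destroyed automatically at the end of scope
          return static_cast<int>(names.size());
      }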

    YMMV

    1. Anonymous Coward
      Anonymous Coward

      PYTHON - makes you even sexier than Errol Brown covered in baby oil.

  25. Doctor Syntax Silver badge

    "'Best approximation of C++ ideals so far,' ... but is it too big and complex?"

    I thought too big and complex were the ideals.

    1. Ian Joyner Bronze badge

      Doctor Syntax:

      ""'Best approximation of C++ ideals so far,' ... but is it too big and complex?"

      I thought too big and complex were the ideals."

      No, sophistication is the ideal – and sophistication results in simplicity. It is the complexity of the problems we should handle, not the complexity of the tools. Problem complexity is essential complexity; complexity in tools is accidental and self-inflicted complexity. That is the problem with C++.

  26. Someone Else Silver badge
    Gimp

    Back to the Future!

    There are many other new features, including the “spaceship operator” <=> which does a three-way comparison and returns less, equal or more.

    It took over 50 years, but C++ finally came up with a way to implement the equivalent of FORTRAN II's (in)famous 3-position 'if' statement.

    For those of you who are not retired, and have therefore never had the pleasure, the FORTRAN II 3-position 'if' looks something like this:

    IF (expr) 10, 20, 30

    where 'expr' is evaluated, and if it is less than 0, control jumps (as if via a GOTO) to the labeled statement represented by the first entry in the comma'd list (statement 10, in this case). If 'expr' is exactly equal to 0, control jumps to the second labeled statement (20, here), and if greater than 0, the 3rd label is jumped to (30, here).

    What fun!
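
    And, for symmetry, a minimal sketch of the C++20 counterpart (classify is an invented example): operator<=> yields an ordering that can be branched on three ways, minus the GOTOs.

      #include <compare>
      #include <iostream>

      void classify(int expr) {
          auto cmp = expr <=> 0;                                   // std::strong_ordering
          if (cmp < 0)       std::cout << "less than zero\n";      // the "GOTO 10" branch
          else if (cmp == 0) std::cout << "equal to zero\n";       // the "GOTO 20" branch
          else               std::cout << "greater than zero\n";   // the "GOTO 30" branch
      }

      int main() {
          classify(-5);
          classify(0);
          classify(7);
      }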

  27. Throatwarbler Mangrove Silver badge
    Trollface

    The great thing about The Register's comment section is that it's possible to find an obsessive crank for literally any topic.

    1. Dan 55 Silver badge

      I'm so looking forward to discussing C++23 in three years.

    2. DrBobK

      I'd upvote this 100 times if I could. So many loonies. They may know tons of stuff about programming, but still clearly loonies.

  28. This post has been deleted by its author

  29. diego

    I don't get all the hate for C++. It's not a good application language for most applications, but it's very good for applications that need to run REALLY fast while doing complex stuff. If that's not your use case: pick another language. It's as simple as that. I've written UEFI code that was beautiful C, and compiled to very small sizes. I've written desktop applications in Java and C#, for which C would have been a pain in the ass.

    I'm currently in a team writing new features for graphics processing software that uses C++ 14 after 10 years of having written C++ code, and I was surprised by how comfortable the language has become. It is definitely evolving.

    The right tool for the right job should be the mantra for every programmer.

    1. Ian Joyner Bronze badge

      "I don't get all the hate for C++"

      You don't get it because it is not hate. All programming languages are technical artefacts and should be analysed for what they are good for, where they are a match to some problem domain more than others. But they should also be criticised for their cross-domain flaws.

      What computer people should aim for is simplicity. C++ is the antithesis of that. That is not hate on the part of people pointing that out – the problem IS WITH C++.

      "If that's not your use case: pick another language. It's as simple as that"

      No it's not as 'simple as that'. Many are stuck with having to suffer C++ because of its flaws. Some can see the flaws – others just don't know that is why they are having a hard time programming.

      "It is definitely evolving."

      Evolving towards what other languages were 30 years ago. Still contracts are put off until C++23. Many people who know other languages roll their eyes and say 'C++ finally got that'. But then they look at it and the C++ way is so much more obscure and then you find it interacts badly with other parts of the language in non-orthogonal ways.

      "The right tool for the right job should be the mantra for every programmer." The C and C++ world is full of these false mantras. Yes, there is some truth in that. but in contrast C++ tries to be the tool for all jobs and in that becomes overly complex. The right tools are simple and sophisticated. C++ takes the unsophisticated and complex path.

      1. Anonymous Coward
        Anonymous Coward

        Mate, it's your students I feel sorry for, not you. Imagine learning nonsense like this for four years then finishing university, starting your first IT job, and being hit square in the face by the real world.

        1. GrumpenKraut
          Happy

          Where I work he would be the target of very many ballistic tomatoes should he spew nonsense like his comments here to students. He would not be able to finish one single lesson and lesson two would be done by someone else.

  30. AlexO

    Why make it Java?

    Literally not a single feature since C++11 makes C++ easier to read, easier to program, or improves performance (the main reason people use it). It seems they are trying to turn the beautiful language into ugly Java (which already exists and does what it does); I cannot figure out why. Bjarne created a beautiful thing 30 years ago – is he (and others) trying to make sure it does not outlive him?
