Re: The old problem of the programmer periscope view of the world...
"> Multiple inheritence is too scary ...
One. Do you actually know how M"
I'm not sure who 'Anonymous Coward' is addressing here. However, the fundamental thing wrong in this answer is the ad hominem attack on 'academia'. This is a recurring problem in C thinking: 'oh, we deal in the real world, those academics only give small examples'.
That is because academics see a problem and boil it down to a small example that demonstrates it, so that it is easily understood. They should not feel they need to apologise for that. In fact, academics have considered programming in the wide and wild, far wider than most real-world practitioners have. So I am very much against this faulty C thinking of 'oh, things are very nice in academia, but we know what really goes on'. That is garbage thinking.
"So I suggest you go back to your academic textbook MI examples. Because that is the only place you will ever see them."
Textbooks also boil things down to small examples. Orthogonality means that many elements can be combined freely and the combinations behave predictably. Non-orthogonality means elements combine in surprising ways, and I don't mean with pleasant outcomes, quite the contrary. Now these 'real-world' practitioners might think that non-orthogonality is a necessity in their 'real world', but it is not.
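Here is a minimal C++ sketch of the distinction (all the class names are my own invention, purely illustrative). Orthogonal features compose without surprises; non-orthogonal ones interact behind your back:

    #include <iostream>

    // Orthogonal: Named and Counted share no state and no base,
    // so mixing them into one class is unsurprising.
    struct Named   { const char* name = "unnamed"; };
    struct Counted {
        static int count;
        Counted() { ++count; }
    };
    int Counted::count = 0;

    struct Widget : Named, Counted {};   // combines cleanly

    // Non-orthogonal: Serializable and Loggable secretly share a base.
    struct Persistent   { int id = 0; };
    struct Serializable : Persistent {};
    struct Loggable     : Persistent {};

    // Document now contains TWO Persistent subobjects; the two
    // features interact in exactly the surprising way described above.
    struct Document : Serializable, Loggable {};

    int main() {
        Widget w;
        std::cout << w.name << " " << Counted::count << "\n";  // unnamed 1

        Document d;
        // d.id = 1;              // error: ambiguous; which Persistent?
        d.Serializable::id = 1;   // you must disambiguate by hand
        d.Loggable::id = 2;
        std::cout << d.Serializable::id << " " << d.Loggable::id << "\n";  // 1 2
    }

C++ does offer a cure here (virtual inheritance, struct Serializable : virtual Persistent, which restores a single shared Persistent), but the point stands: the non-orthogonal design forced the question in the first place.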
In fact, saying there is this difference between the 'real world' and academia is wrong. Programming is not in the real world; it lives in the virtual world of electronic circuits. Virtual worlds are what software is about. Software does not deal with physical things, and that is exactly why it is so powerful, so easy to change, and so flexible.
Those 'real world' difficulties and that complexity arise because someone along the way did not understand this and did not guard against non-orthogonality.
"So how did you learn your "real" OOP skills? From some TA who never wrote a line of shipped code?"
Again, I don't know who 'AC' is addressing these remarks to, but this is another example of the same false attitude. As an example and a general refutation: my first exposure to this world was being taught Simula by a professor who had worked in the UK with some of the great minds in computing (he did not say so at the time, but I have found out since). The term OO was not mentioned then. I went on to become one of the first adopters of OO in this country, have worked on languages and compilers and many large software projects, and have studied OO deeply.
Alan Kay has noted that when he coined the term 'OO' he did not have C++ in mind. So accusing those who criticise C++ of merely having been taught by some 'TA' is nonsense.
"Because in my experience those are usually the only people who think stuff like MI is either important or relevant. Those of us in the the trenches shipping product most certainly don't."
Either you don't understand MI, or you have only seen it badly applied (it is easy to abuse MI). MI allows you to break a system down into even smaller classes and then recombine those abstract classes in very flexible ways. That means a concept is not buried inside some larger class, forcing you to repeat its code somewhere else. A sketch of what I mean follows below.
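A minimal C++ sketch, again with names of my own invention: each concept lives in its own small abstract class, and concrete classes recombine exactly the concepts they need.

    #include <iostream>
    #include <string>

    // Each concept gets its own small abstract class.
    struct Drawable {
        virtual void draw() const = 0;
        virtual ~Drawable() = default;
    };

    struct Persistable {
        virtual std::string serialize() const = 0;
        virtual ~Persistable() = default;
    };

    struct Clickable {
        virtual void click() = 0;
        virtual ~Clickable() = default;
    };

    // Concrete classes recombine only the concepts they need.
    struct Button : Drawable, Clickable {
        void draw() const override { std::cout << "[button]\n"; }
        void click() override { std::cout << "button clicked\n"; }
    };

    struct SavedImage : Drawable, Persistable {
        void draw() const override { std::cout << "[image]\n"; }
        std::string serialize() const override { return "image-bytes"; }
    };

    // Code written against one concept works for any class that mixes
    // it in; Persistable is not buried inside some fat Widget class.
    void save(const Persistable& p) { std::cout << p.serialize() << "\n"; }

    int main() {
        Button b;
        SavedImage img;
        b.draw();
        b.click();
        img.draw();
        save(img);    // works: SavedImage chose to be Persistable
        // save(b);   // correctly rejected: Button chose not to be
    }

The design point: save() is written once, against the Persistable concept alone, and any class that mixes that concept in gets it for free. Nothing has to be copied into a larger class and repeated elsewhere.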
As for the 'trenches': again, that attitude comes up because you are working within badly designed systems.