Reply to post: One problem with the article.. it's mostly fantasy..

Any fool can write a language: It takes compilers to save the world

Anonymous Coward

One problem with the article.. it's mostly fantasy..

Just how familiar is the author with C compilers in the 1980s? Did he use any of them? Ship any product with them? Was he even alive at the time? Because his description bears not the slightest resemblance to the world of C software development for commercial shrink-wrap software back then.

Shipped my first commercial mass-market C application in 1984. Compiled with a custom in-house C compiler. K&R compilers are easy to roll your own. After that I shipped lots of products written in C over the next decade, until C++ compilers became stable enough around 1992/93. C compiler quality was variable. The Green Hills C compiler that shipped with MPW 1.0 in 1986 still has the best codegen output I've seen. Beautiful output. The Watcom C compiler, which was the (non-MS) x86 industry standard back then, was also very good. It's open source now, so you can see how to write a great small fast C/C++ compiler. Watcom C/C++ was put out of business by MS spending large amounts of money to "buy" their best customers. Long story. Sad ending.

So by the late 1980s it was Watcom or Borland for very good C compilers in x86 land. Green Hills for superlative (but expensive) ones. On the Mac there was the OK compiler shipped with MPW 2/3, and the very good Think C, which was the de facto standard in Mac-land. And a whole bunch of honorable also-rans.

Microsoft C chugged along in MS-DOS land but was only used when there was no other choice. Very serious linker bugs with large symbol tables. Hence MFC. And no one in my experience used GCC 1 or 2 for mass-market software development because it was garbage. It just never came up as a platform. The first time I saw GCC in a commercial dev project was for an embedded system. In 2001. And GCC was eventually dumped because it was so buggy. After GCC 3 was released it was no longer terrible, but it was still usually only used when there was no alternative. Or when people did not know any better.

LLVM got such fast uptake because GCC was/is so terrible and LLVM is easy to port to new platforms. After 15 minutes of reading the GCC source code you just want to start hitting your head with a hammer. LLVM source code is not so toxic.

The dirty little secret of usable compilers that produce the best code output is that their internals bear little resemblance to what you see in those academic compiler books. Even the optimizers. I can think of two compilers with unusable optimization features (important ones, like instruction scheduling) because they let some grad student try to implement his thesis. Same goes for front ends. I just finished up an IR retargetable codegen. Very small, very fast. And pretty much the same code as the roll-your-own compilers of the 1980s. Because back then there was not a large body of academic literature that tried to make what was very straightforward look very complex.

That's how it actually was in the trenches back then. For those of us who had to ship product.
