My thoughts on this subject.
OK, ignoring the person who said the author had gone off on a bit of a tangent.
Having done a modern computing degree... I might have a thought or two on this subject.
I did first year around 11 years ago part-time, then had to quit because I changed jobs. I came back into second year and completed my degree a few years ago.
Now, not all universities will be the same, and even within a given institution folk will have their own opinions. However, let me make this clear about my university.
1. In first year you learn Hello World and pseudo-code. You also learn things like RAD (rapid application development), where you will often create things using off-the-shelf products like Access, combining the basics of databases and programming to create a record collection. Messing around with Access (or similar) is a mere practical application of the theory they are filling your head with. Most first years will include hardware courses telling you how the memory connects, what a CPU cycle is, and so on.
2. From second year you're expected to know the fundamentals of programming, and they begin to explain two very important things. The first is Software Development: client/server relationships, and depending on your course you may do HCI (human-computer interaction) modules, networking, web development, and so on and so forth. Needless to say, if you are being trained as a programmer you will do more SD than everybody else, plus other important modules on managing errors and other such stuff (don't ask me, I did networking).
The second important thing they teach you, and this is really from day one, is that programming is not knowing a language. Teaching somebody a language to program in is a dead end, as languages change much like jobs, and being too focused on one language leaves your skills sadly difficult to transfer. They teach the theory of programming, how to figure out and break down all the pesky challenges that will face you... then they grab the programming language of the day and throw you in.
As it happens, back when I did my first year (knowing programming languages such as Pascal and COBOL was an entry requirement), C++ was flavour of the day; however, when I returned it was Java. Now, I have heard whining about people using graphical interfaces to program in Java. I, for one, and I only completed my degree a few years back now (and I happen to know it's still the same), did all my programming in Notepad, or a Notepad-like programming environment.
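For anyone who has never seen that workflow: it really is just a plain .java file typed into Notepad and fed to the standard JDK command-line tools. A minimal sketch of the sort of thing I mean (the file name is my own choice, not anything from my course):

// HelloWorld.java - typed into Notepad or any plain editor.
// Compiled and run from the command line with the standard JDK tools:
//   javac HelloWorld.java
//   java HelloWorld
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, World!");
    }
}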
Do I think Java is the best language to learn during a degree? Well, if it was the only one I learned or knew... maybe not. HOWEVER, nobody I know comes out of a degree knowing only ONE language; all are exposed to web, databases, scripting and all sorts of other stuff. I mean... I learned Perl...
I find Java a good language once you already know one or two others, simply because you can teach advanced and complex things in Java that would take you all week just to program in C or C++. Remember, the code is not important; knowing how to program is not about the code.
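To give a rough, made-up illustration of what I mean (not an actual assignment from my course): something like counting word frequencies in a text file is a handful of lines in Java, because the standard collections do all the bookkeeping, whereas in C you would be writing your own hash table and memory management before you even got to the interesting part.

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.HashMap;
import java.util.Map;

public class WordCount {
    public static void main(String[] args) throws Exception {
        Map<String, Integer> counts = new HashMap<String, Integer>();

        // Read the file named on the command line, one line at a time.
        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        String line;
        while ((line = in.readLine()) != null) {
            // Split on anything that is not a word character and tally each word.
            for (String word : line.toLowerCase().split("\\W+")) {
                if (word.length() == 0) continue;
                Integer seen = counts.get(word);
                counts.put(word, seen == null ? 1 : seen + 1);
            }
        }
        in.close();

        // Print the tallies - the library has done all the heavy lifting.
        for (Map.Entry<String, Integer> e : counts.entrySet()) {
            System.out.println(e.getKey() + ": " + e.getValue());
        }
    }
}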
Thus, having done my round of programming in Java and messing around with arrays, Swing and other stuff, I moved on to more network-orientated work. However (I don't know, maybe as a joke), on the network modules list was a 3D course... To my horror (I am not a fan of programming and can be considered reluctant), I had to learn to program 3D stuff in Java. Now, in hindsight I think it was a good thing to include, but at the time it was murder, as programming is like riding a bike: you never forget, but that does not mean you will not fall off a few times anyway.
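(For anyone who has not met Swing, the sort of "messing around" I mean is roughly the below; a tiny sketch of my own, not anything from the actual module.)

import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JOptionPane;

public class SwingDemo {
    public static void main(String[] args) {
        // One window with one button - about as small as a Swing program gets.
        final JFrame frame = new JFrame("Swing exercise");
        JButton button = new JButton("Click me");

        // Pop up a dialog whenever the button is pressed.
        button.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent e) {
                JOptionPane.showMessageDialog(frame, "Hello from Swing");
            }
        });

        frame.add(button);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setSize(300, 120);
        frame.setVisible(true);
    }
}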
I have now started gibbering, so in summary: at uni I was taught how to program, I was not 'taught' a language. Passing the degree was considered proof that I was capable of programming, and it happened to teach a language (or four) that might be useful to an employer. However, as it has always been, graduates are taught theory; employers are meant to take a given quality of person and theory and 'show them the ropes'.
All the software engineers that I still know from my course do what I would consider 'real' programming. One mainly does JavaScript for a large oil multinational, and the other, whilst Java certified by Sun (he did a placement there), has gone on to program in some (from what I hear) horrific in-house language. My pal doing the JavaScript did in fact say to me the other day that when he came out of uni he thought he 'knew most of JavaScript'; now, over two years of using it every day later, he thinks 'maybe' he knows most of what you can do with it. (For the record, he does do more than just JavaScript.)
Now, I took networking, and I have had to program in VBA... yuck. Still, even having never learned VB or any other BASIC, I managed to get the job done. Whilst I hope never to have to poke my head into 'proper' programming again, I do use my knowledge of programming and apply it to other things, such as routers and phone systems.
All in all, I do not think you can make a general statement about the teaching of programming in computing degrees, other than to say it's a little different from yesteryear, because everything is a bit different from yesteryear.
Anyway, I am sorry for going on so long, and I am sorry that I was too sleepy to spell-check this post... I know my spelling sucks. Most of all, I am sorry for not proofreading this before posting, as I am sure there are some awesome gaffes.
However, of the posts I read before putting fingers to keyboard, I agreed with these:
Bollocks - By Greg
Snobs, bigots, and venal professors - By Morely Dotes
And at:
@Greg - By Joe M
The main market for programmers in the UK is currently in higher-level languages. Low-level programmers, whilst in demand (because there are not all that many of them), are mostly trained on an electronics degree, where they must master both the hardware (creation) and the control software.
Thus, an Electronics degree student may create a device and produce low-level drivers, whilst a Computing degree student will design a program which requires a code monkey to create some libraries to allow it to be plugged into an already existing web application, which somebody who has done a Web Development degree created.
The IT field has grown very large, even since I first joined it a decade or so ago. With so much to learn, it is no surprise that there are now specialisations. Low-level programming is one such specialisation. If you want somebody with both awesome program design skills and machine-code abilities, you will need to find somebody with one of the skills already and train them in the other.
Such is life.
-Ano