Mostly well-written
But when a programmer has some consideration of ethics is that really "trash?" I don't think so.
Recently a director at a huge bank asked me “Do British students learn algorithms?” At first I thought he was joking, but even though he was paying three times what the average new grad gets paid, he felt despair. Because of similar experiences I was surprised to read that only 17 per cent of CompSci grads from last year haven’ …
You're quite right. It's a decent article but the comment "and other nonsense where you write essays rather than think" is just stupid.
Why? Because if there's one thing I have consistently found it's that CS grads with no social skills are very limited in their choice of profession. Those who can be let loose on customers win over every time.
Yes, there was a point to going out, getting drunk and getting laid at University ... in fact, for CS students there was an exponentially greater benefit than for those who were naturally more sociable anyway.
I've been programming since I was 14; I started on a PDP-11 with punched cards, so it was a while ago :) I've done many flavours of assembler, Cobol, Fortran, C++, Java, etc. - everything from smart cards to mainframes. But new grads know at most one or two languages and have no clue how a computer actually works. It's a magic box to them. They also don't know how to communicate, either in writing or verbally.
Despite my many years of programming experience I spend less than 15% of my time programming; most of my time is spent interacting with customers, giving talks at events and so on. That ability has put me in the top 2% of income earners. Comp Sci students definitely need to learn more hardcore, in-depth programming skills, but they also need to learn how to communicate what they know to others.
This is unbelievably accurate. As a senior programmer for a big mobile house, I've been doing recruiting for graduates to take on as junior iOS programmers.
I used to teach CS in the USA - studied it there too. My intro language was C. When I started getting CVs through here, I was surprised to find a lot of people applying for iPhone positions without any C/C++ education. Since Obj-C on iOS lacks garbage collection, having some experience of memory management is really rather important, and yet it's a skill completely lacking from many coming out of UK CS courses.
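To give a flavour of the gap - a minimal sketch in C++ rather than Objective-C, purely to illustrate the habit I mean, not anything from an actual course:

    // The kind of manual bookkeeping a platform without garbage collection forces on you.
    #include <string>
    #include <vector>

    struct Track { std::string title; };

    void leaky() {
        Track* t = new Track{"demo"};   // heap allocation: this function now owns the object
        // ... an early return or exception here and it is never freed ...
        delete t;                       // has to be released by hand, on every path
    }

    void safer() {
        std::vector<Track> tracks;      // RAII: ownership tied to scope
        tracks.push_back({"demo"});     // freed automatically when 'tracks' goes out of scope
    }

A graduate who has only ever seen a garbage-collected language has usually never had to form that habit at all.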
I have singularly failed to find a single graduate who formally studied C or C++ for their bachelor's. I'm honestly surprised *more* CS graduates aren't unemployed. Plenty of them deserve to be.
Yep, I don't doubt there is a part of Winnebagos that needs programming. The EMS...
Anyhow, TBH, when I did my Comp Sci degree we started off learning Modula-2. Why? Because, according to my lecturer, it's a good language for learning the basic structures and thought processes involved in programming. It's also a language that isn't used in industry, so we'd be forced to learn something more useful. I think the idea was that if you know one language, it's easier to learn a second. If you know two, it's easier to learn a third, and so on.
2nd year, we learned C++, eventually going on to use the MFC to program simple Windows apps. We also had to knock up simple apps using Gnome on Solaris. In C++. In fact, the bulk of my programming throughout my degree was in C++.
The students now use all sorts of essentially scripting languages, the hardest of which seems to be Java...
Eh? I think you were thinking of "memory management". "Garbage collection" should certainly not be alien when using Java... it's handled for you, certainly, but you still configure it with VM tuning and should at least be aware of it. And if you've never run into memory leaks from unintentionally retained references or similar pitfalls of garbage-collected environments, then you're probably not using Java as much as you think you do.
The languages don't matter. I studied at a top ten university, graduating seven years ago and we didn't do a single class designed to teach a language. From memory, there was:
• principles of programming; an introductory course with submissions in Scheme but the point being to understand standard programming constructs
• computer architectures; involved an invented stack-based assembly for the practicals, but a written exam on instruction set architectures and design was the main part of the course
• computer graphics and visualisation; rasterisation algorithms mainly, with the necessary toe dipping into light number theory — coursework submission required in C and OpenGL
• algorithms and data structures; big O issues mainly, work submitted in Ada95
etc, etc. And, as a Maths & CS student, I spent only half my time in the CS department. I don't think we touched Java at any point, but I don't think anybody in the department there would consider that a failing. Though I guess they might, for example, shuffle Ada95 out for Java if a general shift in toolchains made it more reasonable.
This does, inevitably, make for graduates who finish still a few months short of being fully up to speed on any specific job if they've not done anything for themselves outside the course, but it's much better than having explicit language courses, even though it allows for cheap shots that there are no universities teaching languages other than Java. That's possibly true, but it doesn't mean that Java is all universities teach.
I'm guessing you studied at the University of York, which in the last year or two has switched almost exclusively to Java. It's rather saddening as, while it was a bit of a struggle to learn a new programming language with each module, it was rather insightful to see how things can be done in different ways.
Yep, York it was. And I'm an AC just per the standard rules of healthy separation between online and real life. I quite liked the place and, at least in its early 2000s incarnation, was impressed by the academic approach taken. Not always so keen on my colleagues though; on one of the whiteboards on the way into the department at some point someone had written one of those mathematical 'proofs' that 1 equals zero based on fatuously factoring out a multiplication by 0. Someone else had written 'this is a divide by 0, the result is undefined' or something like it by the side and a third person had cleverly added 'this is computer science, not maths'. Hmmm. Still, one of the big advantages of a campus university is that you end up socialising a lot outside of your department...
Really sad to hear that they've decided to give implicit approval to a concrete technology over just teaching the principles, especially as they're still 6th in the country for Computer Science and 17th in Europe in general, per the latest lists that The Times will furnish. I'd still be surprised if they're churning out that many graduates that can't pick up C++ in a week. Compiler construction was still a popular item on the roster for second year students in my day — that and the emphasis on algorithms and how things work at the machine level elsewhere can't leave people too stranded, can it? And I was under the impression that there was quite a lot of actual hardware engineering stuff for solely Computer Science students? I definitely had to help someone with z80 assembler at some point, having covered it at A Level.
Sorry, my first language was 6502 assembler and a couple of different flavors of basic.
Add to this Fortran, Cobol, C, all in college. (OK, this was before Java, and C++ was still relatively new.)
But my point was that the language didn't matter. (There's a class for that.)
Java, Objective-C all came later.
Yeah, I do know C++, which is why I can make the statement that C++ blows. Sure, it's my opinion, and there are others who would disagree with me.
The key is that I can defend my opinion. Any decent programmer should be able to defend their opinion. When you interview for a developer, you should ask questions based on their stated experience that require a detailed response. Not only will it show their technical expertise, but also their communication skills, which are just as important.
PS. Sorry, I dislike C++ because, as a consultant, I'm called in to fix projects that have gone wrong. Cleaning up bad C++ is a hell of its own... ;-)
As a young teenager in the 80s I started playing first with a commodore PET at school and then my own BBC Micro. So I picked up BASIC. My 'O' level was written in 6502 machine code. My 'A' level was written in BASIC. They had taught us Pascal, which was an option, but I was pissed at them for not letting me use machine code for the 'A' level.
I would give Pascal another chance a few years later.
Even doing the 'A' level I started to become disillusioned with what we were being taught, so when I saw the degree syllabus I decided I'd had enough of this education rubbish. I still remember the computer science teacher calling me out of a geography lesson to help him set things up when I was 13!
From there the big old world of work started. COBOL, PCs, 8086 machine code, DOS TSRs, windoze, bit of OS2, VB, Delphi, C etc. Intarweb and TCP/IP.
Delphi (Pascal++) is probably my favourite for writing Windows apps. It's just so quick. How C became so dominant when it seems designed to make things unreadable after you turn away for five minutes is beyond me.
I never ceased to be amazed at the number of grads I worked with who just had no fundamental knowledge of computers. Sure, I could forgive them for not having built a computer from spare bits aged 15 - not everyone was as geeky as me - but to look astonished when I open a command prompt and type a few magical incantations is unforgivable.
Typical example - "I have a load of music mp3s, mpeg movies and word docs all together in a directory, how can I move just the tunes and movies?"
Errr, move *.m* <destination>
Reclaim all the drive space windows has cluttered up with temporary files?
del "%temp%\*.*" /s /f /q
Simples.
These days I program Windows apps and microcontrollers. It's interesting juggling between Delphi Windows apps and C/ASM on the MCU, and yes, fixing other people's C code.
There is always a sigh and relaxed smile on my face when I return to Delphi and have native string handling once again! Not to mention the inherent security aspect of having a language which knows how to handle strings, and isn't just forcing you to throw random characters about in memory!
I'm gay and a geek, so I'm doubly F....
But back to the subject of the article, it definitely echoes my experience of Comp Sci Graddies. They think they know all there is about IT, but sadly the truth is very different. It is rare to find one who can roll straight into the job without having to be hand fed for the first six months.
Fail, because Universities appear to be doing just that.
I'm not for a moment suggesting that universities are not failing, but having to train up new graduates for six months before they are ready to start work is not in itself an indication of that - it is not the purpose of university to mould students to the needs of a particular business.
I found Java the dullest part of my CompSci course 10 years ago - we used Haskell, Pascal, Java, C++ and Matlab, and spent a good deal of time on algorithms, operating systems (minix - written in C), hardware architecture, and concurrency.
And now I code in C# most of the time - but when I need to use other languages, I can and do, and my understanding of the computer as a whole stack (hardware/OS/software) gives me an advantage in writing better software compared with people who seem to have just spent 3 years learning Java, and no "computer science" whatsoever.
It is after all the most sought after skill by employers. But if all they've learned is Java then they are only one step above useless. They need a proper grounding in the basics of computer hardware, OSes and algorithms. And you can't properly do that without at least some C, hopefully some C++ and maybe even assembler skills as well.
"Computer Science" education of today is hardly any better than a mail order "learn to program in 90 days" course.
Early on in my degree I was told that we should trust the OS and write for that. That being Windows 3, and the course being real-time systems engineering design. Having been a mature student with a couple of years doing stuff like reverse engineering control system code, that depressed me somewhat. So did having to learn Z, but I appreciate that more now. Unis seem to churn out people that think they can code, but often with no idea about why they're doing it and how it relates to the business they're in.
I did my secondary schooling in the late 1990s, and was so utterly put off IT by my teachers (and by the woeful syllabus they had to teach against) that I didn't actually do my CS degree until five years later, having basically dropped off the map during that time.
The first year of my CS degree was similarly dismal; being taught what integers were, and how to perform boolean evaluations, and the like.
There seems to be a (not inaccurate) assumption within the university system that, unlike any other subject I know of (except for Art, which requires a foundation year), their first-year students will know NOTHING AT ALL about their chosen subject.
You wouldn't just walk into a university one day and ask if they had any places available on a Chemistry degree course, and yet that is exactly what I did when starting my CS degree. I just walked in, they checked for empty places, and signed me up on the spot.
When I asked what I'd need to know before starting, they explained to me that there wasn't any requirement beyond basic reading and writing skills (and some UCAS points, I suppose). This is a problem that starts in schools at the (utterly, utterly woeful) ICT GCSE level. It wastes university resources, and valuable time that students should be spending at the end of their degrees on advanced subjects.
Sounds like the same experience I had in the late 80s.
All universities said "no prior knowledge" required, and at one university, the person I went to see said it was actually better to have no prior qualifications.
Seeing as I had an A at O level, then got an A at A level, and a 1 (distinction) at S level, I had no intention of doing a course with people with no knowledge, so I fell back on my second love, electronic engineering.
I still went into the computer industry, and was employed on my knowledge rather than qualifications.
After a while we developed a "unix test" for future employees. It wasn't hard - any seasoned unix programmer / systems guy should have easily got 100%, but some of these so called experts were getting 20% or less
This is exactly right. I agree almost entirely. There is no merit in a CS grad if all they know is one language and no operating system internals. Java is OK, but then so is C#. It should be mandatory to study some part of an operating system, and since Linux is open source and coded in C, it makes sense to use C. Let them learn how pointers work. Defensive programming. Database work at the lowest level.
Has no-one understood why all the best CS grads in well paid jobs were not educated in English Universities?
Of course, this starts at school. Kids are crammed to get good A levels. Just this weekend my daughter had a friend staying over from a good English university reading Maths. She is in her first term. She said the course is hard work - "They don't teach us everything, we are expected to find things out ourselves and have to use the library". She was almost equally horrified when I suggested that research is not looking things up on Wikipedia.
Personally, I don't care what languages you have or what DBs you have used. But you must have more than one language and know when to use different ones; and at least have good SQL. Know that and I can teach you how our shop works. But without that you are a drone.
We used to have an education system that was the envy of the world. Where did it all go wrong?
Adrian (AC so I don't totally hack off my team during milk rounds)
"We used to have an education system that was the envy of the world. Where did it all go wrong?"
Politicians, politicians and politicians.
Most Britons are uninterested in politics (and who can blame them), so the ones we have are basically the lowest common denominator, elected by those who don't understand and don't care.
So we're well on our way to hell in a handcart.
Unless enough people remember that rebellion is the right of every citizen - and sometimes a moral duty.
<quote>Personally, I don't care what languages you have or what DBs you have used. But you must have more than one language and know when to use different ones; and at least have good SQL. Know that and I can teach you how our shop works. But without that you are a drone.<unquote>
What barbarous nonsense. Ted Codd studied maths and knew nothing of SQL when he invented the relational model for databases, without which SQL would never have existed. I was a lot younger than Ted, but was doing research in databases and information retrieval long before SQL was invented. SQL is just another language (actually a badly screwed up version of Ted Codd's idea for a relational calculus based language) and saying that someone who doesn't know that particular language is a "drone" for that reason is total garbage (and I say this as someone whose last two technical director/VP level jobs were based partly on my SQL expertise, not as someone who wants to claim SQL doesn't matter because they don't know it).
<quote>We used to have an education system that was the envy of the world. Where did it all go wrong?<unquote>
When they began to let idiots who think some particular computer language is an essential part of CS education have some influence? (Unfortunately that really has happened, and at a large number of Universities that language is Basic, and - o tempora, o mores - at an even larger number it is Java; but the real bad news is that it's yet more often C++.)
What I have noticed a lot of, is the people who are working in IT are no longer Geeks, in the true sense of the word.
I hear a lot of: "I'm a programmer, I don't care about computers." Just the other day our SQL DBA called me a geek (not that I minded), just because I have a NAS at home and multiple laptops and computers in the house. When I first started out (early 90s, professionally), only geeks worked in IT - now it seems everybody does, but most don't actually have any passion for it (apart from us old timers)...
There are several hundred times more people employed in the computing industry than in the 1990s, so it's not surprising that there are proportionally fewer geeks, but I doubt there are fewer geeks in total. They're probably more concentrated in the challenging jobs, such as assembly programmers (life expectancy 30 years, sanity expectancy 3 years) and those nutty hardware guys (just guessing).
The number of bad IT staffers who never diversify within their chosen career path is a limiting factor. Oftentimes having knowledge of C/C++ has enabled me to try things with Perl/Java/Javascript that most others wouldn't even think of. Knowledge of a good OS (Unix) has allowed me to understand the failures/workarounds of others (MS/BeOS/OS-2).
Sadly, since things have moved to a 'mobile' world (iOS/Android/web-based tools), there is a belief that there is no longer a need for high-level languages/OSes. WRONG! I call this the 'University of Phoenix' approach to education: Teach the 'Hot' tool and get them edumahcated.
Epic FAIL because students have become as lazy as the so-called institutions that are teaching them.
Same thing over here. When I mentioned that I had bought a used Cisco Catalyst 2950 switch, a lot of former college friends asked me why I would want something like that. I responded that it was so I could separate my home network into 3 VLANs. That brought another ton of "why?" questions.
It really, really seems that "because I want to tinker with this stuff" is no longer accepted, even among CompSci grads. :(
Funny thing: When I told my Mechanical Engineering student peers that I wanted a few VLANS (Bought that cheesy Netgear managed gbit switch,) they were halfway interested.
More comedy: At my university the CS students had to ask an EE or ME student to fix their computer when it broke. Their interest in computers ended at the window border of their IDE.
Even more: Thanks to a mid-year shift in the standard syllabus, my class of ME students learned 2 languages in the first year (C and Matlab,) while CS students only learned Java.
Also: ME and EE students were required to take an intro to Linux course. CS students were not.
The Best Part: With my ME degree and barely out of university, I've now got a programming job (Long story) and I interview CS grads for non-programming positions. And they all suck. 1/3 of a page (Out of at least 3!) of your resume should not be a bulleted list of every version of Windows you've ever used, especially when you're applying for a Linux-only position. And spelling and grammar do count, though I've had to thoroughly give up on judging by resume layout, since they're all pretty much based on the same tragic MS Word template.
It was intended in a more-or-less friendly way, at least I choose to interpret it as such.
I was a mere deployment tech (put PC on desk, turn on, copy user's data, remove old PC). But my boss tended to sneak me out to do the curlier support tasks as she had no faith in most of support to fix more than someone's word file. Most IT people today seem to treat a PC as a magical box. They have rote-learned some arcane gestures to manipulate the icons on the magic window to keep the godlet in the box mostly happy but have no real knowledge of how it really works.
I'm old-skool - I started with digital electronics after dropping out of a completely pointless senior high school experience back in the 80s. I can build (and have built) an 8- or 16-bit computer from chips up. I can pick up the basics of any language in a few days (interestingly it took exposure to Java back in the 90s to get my head around object orientation before I could finally manage C++). A few years ago I spent a boring lunch break drawing a pure-logic router circuit on the whiteboard while doing a Networking course at trade college. It became very obvious why all the masks are inverted - a significant saving in transistors, which back when these protocols were created was important.
Most of my knowledge is not formal and so not attached to a piece of paper, so finding a job I liked was a bit of a slog, but by starting at the entry level and showing what I had, I think I am finally there now - I now get to help Digital Media Creative Arts students make electronics/microcontroller-based art, amongst many other things - heaven.
My experience of CompSci was one of thoroughly awful lecturers. While there were a few truly decent ones at my Uni, it was the mediocre to downright appalling ones that were teaching some of the hardest subjects on the course. I've no doubt that these guys were leaders in their particular research field, but generally they couldn't lecture themselves out of a paper bag.
At least arts lecturers get into the field expecting to be teaching, or at least addressing a crowd. CS lecturers however seem to have no social skills and often are brought in from various corners of the world and as such have either very poor or heavily accented English.
The fact is though, if you have decent social skills, an engaging personality and are knowledgeable in the subject, then you have far better employment opportunities than as a University lecturer. Most of the lecturers I found were either end-of-career types, whose understanding of the basic fundamentals of computing is excellent but who really lack any modern experience; those who lack the personality to make it in the job market; or those who just want to do research, for whom lecturing is something you're forced to do rather than any kind of calling.
Why do they get the post grad students to teach?
I had to take Z80 assembler in my first year of IT at Leicester Poly (in the days when IT consisted of analogue and digital communications, analogue and digital electronics, assembler and C programming, plus the woolly social implications stuff). I had been writing games and the like from scratch on the ZX81 and Spectrum for a few years whilst I was at school, so I knew Z80 assembler inside out. (And I had blown a few up interfacing to them!)
The lecturer was a post grad student who was reading from a book. About half an hour into the first lecture, after I pointed out for the third time some glaring mistake he had made, he told me to f**k off and not bother him again. Others on the course passed the assignment questions to me through the year and I passed back my answers via the course leader (so they could not get lost). I reckon that I spent more time teaching that course to the students in the pub than the lecturer ever did. And the bast*rd only gave me a 99% mark for the year, dropping me a mark for not attending the lectures!
There were a couple of other post grads who also had little idea about teaching, but at least they had more of a clue than the one old guy who was only employed because he had a large research grant in tow!
The best lecturers all had a passion for the subject.
Now, when I interview applicants, I get them to write (with a pen and paper) about their journey to the interview. If they can't write a coherent, legible description, then I'm not even going to waste my time on them. I can teach them a programming language, but they need to show an aptitude in their own native language first!
So, bad lecturers and bad students. Who is surprised by the unemployment! Not me!
All of my database related courses were taught by post-grad students, who were also overseas postgrad students!
The fundamentals of database design and architecture, passed on by someone who didn't seem to understand the subject that well, couldn't express himself very well, due to limited English, and was unable to answer questions to any degree, also due to struggling to understand.
Object oriented software engineering, taught by someone who completely failed to get across any concepts of object orientation. Later an 'old fashioned' lecturer went off on a tangent, in a completely different subject, about data types, and built it up from simple data types to the concept behind object orientation and how classes work, making it all fall into place.
We had "tutors" in the programming labs to go to with any issues, but most of them lacked even the most basic understanding of debugging, often, even how to sort out compilation errors. The basic skills and understandings were overlooked in selecting most of the people who were there to teach and support.
I now work with a lot of people who did computer science at Uni, most know how to program, but not a one really knows how to design and construct an application. They generally learned by looking at the existing code and applications and duplicating it, cut & paste coding, tweaking bits here and there, never actually knowing what they are really doing in any great detail.
As the article states "a good programmer knows that it is how you think, not the language you code in, that determines your ability." Unfortunately, recruiters aren't good programmers!
With a wife who has a PhD, I find the issue of postgrad lecturing difficult. The problem is that many universities will not consider someone highly, even as a basic lecturer, who has not got some teaching experience. I think the real problem is that there's no formally accepted 'teacher training' system in universities, but you can see the reasons for this. PhD students have already spent at least 5 years as a student and are looking at at least another 3 years on top of that, so what do you think would happen if they were required to spend another year or two doing teacher training to become a lecturer? It's practically impossible to fit proper training into a Masters or PhD schedule without increasing the length (for some people I could see how this could be done, but it wouldn't be fair on those with heavier-content doctorates, and it might also affect private funding if funders thought that some of their money was going towards teacher training instead of benefiting the private sector). You would end up with more and more postgrads going into research and private jobs and fewer and fewer lecturers, which in turn would lead to less qualified lecturers as universities tried to fill positions with the people who were willing to learn to teach rather than spend time on their own subject.
I can fully understand that it's not a perfect process, but teaching experience is vital to PhD students if they wish to pursue lecturing. What I think could be done is to make sure that postgrads begin by teaching tutorial groups instead of full lectures. But, being bluntly honest, when I was at uni I had my fair share of bad full-time lecturers (who seemed glued to just reading out their notes) as well as plenty of very good ones, and I think one thing you should take away from uni is that you won't always get the best person communicating the content, but that it's the content that's important. If you can't grasp this, I can see it being very difficult to operate in a business setting.
My uni teaches C++ to compscis (and "C with elements of C++" to physicists), together with Java, MIPS assembly and SML, and expects students to learn any language they may need for a chosen project on their own; the majority of people do pretty well at that. Programming is considered a useful skill rather than the goal of a Computer Science degree, and people are generally more concerned with the formal approach (algorithms being one aspect of it).
I, therefore, fail to see any problem.
At least in Cambridge.
Likewise Imperial,
I hardly see F# being a useful end point for Computer Science graduates. Maybe for a bitter IT hack... But a computer scientist, with a knowledge of more than just this month's flavour of language, is infinitely more useful.
The idea that "only Queen Mary's" produces students able to write C++ is frankly ridiculous, perhaps the author should try contacting some other universities.
And bemoaning the lack of IT skills in Computer Science graduates shows nothing more than a distinct lack of knowledge of the field. If I end up stuck in an IT job after graduating as a scientist/engineer I will hang myself.
I read this with genuine interest assuming it was an employer, until I saw it was a recruitment agent.
Now it makes perfect sense: you ask for a Java developer and they send you someone with a CV full of "skills" you need to look up on wikipedia.
Meanwhile they tell graduates they can be on 100K.
... teachers aren't paid enough! If you're good enough at CS, do you a) go and teach for 21k a year, with all the crap that comes with working with kids, or b) start on a similar wage with fairly easy and clear routes to far more cash if you have half a brain cell, not to mention a preferable working environment?
The trouble at school level is that computing is, in a lot of schools, a subject led by business; the curriculum and resources are business-oriented.
Oh and, I'm a teacher who teaches programming... starting at the age of 10. Can you stand to call me a teacher?
...until after you've had another career first. Otherwise it is purely regurgitation/recitation and either a) a waste of good talent (if the teacher has skills) or b) a complete waste (if he/she does not).
Old b@stards who've retired should be recruited back to teach and everyone 'teaching' who has less than 6-10 years working in their field should be canned. IMO.
I blame GCSE computer studies. When I started learning it for O-level, they taught you about computer chips and the various parts and what they did; then they changed it for GCSE, which meant, basically, word processing and teaching to the lowest common denominator.
Computer studies in the West is now aimed at mouth-breathers rather than anyone with a modicum of intelligence. Seriously going to hell in a handcart!
It always pays/amuses to ask stunningly basic questions in an interview - "What does HTML stand for?", "Where did Google get the name?", "How do you get into the BIOS set-up on a typical PC?"
I doubt most IT managers I've worked with in the UK could answer any of those questions - the majority of developers couldn't answer half of them.
But how much of this is down to the educational system, and how much to computers having gone from cool and mysterious to accessible, everyday tools?
I did a degree in Network Computing; Java was thrust down my throat and was the only module that was mandatory in my 1st year.
I didn't care about java, didn't need it, didn't want to do it.
C++ I did at college and I loved it. Apart from my Cisco CCNA course (that was the 'network' computing part of the degree), I don't use anything else I learnt at uni.
Couldn't agree more, and I witnessed the decline of CS "education" first hand. When I started college 10+ years ago as a CS major we started off with the down and dirty "guts" of computing (assembly, C++, algorithms etc). By the time I graduated 4 years later the program had morphed into "let's just teach Java". My entire graduating CS class signed a letter to the governing board of the school basically telling them that their program was sh*t and we wanted our money back (which we didn't get, but it still felt good to demand it). Luckily I had had a couple of good internships in the "real world" that let me develop real skills, since I wasn't getting any help in class....
"and other nonsense where you write essays rather than think"
In my days getting a B.A., one was expected to do a bit of thinking along with the writing. Having said that, I do think one could leave the ethics courses to the Philosophy Department, and perhaps computers and society could be handed off to Sociology.
But I also think that an enthusiasm for arguments on iPhones v. Android does not imply ability, maybe not even willingness to take on the details of programming. I had much rather see the secondary schools make sure that the students can handle math and can write clearly. Even a devotion to coding in high school proves only so much. Years ago I shared a room with other contractors. One spoke of summers spent writing 10-thousand line C++ programs without any documentation. Another, who clearly did not think highly of the guy's skills, allowed that he spent his own teenage summers hanging out at the pool.
As for colleges, I don't know what they should teach. It appears that some of the better US schools have a thing for ML these days--Penn's compilers course uses ML, and I think Harvard teaches its intro course with it. In general, I'd say that you need to find a problem and then a tool to use for it; a compilers course taught me more about C than the C class did; the scripting languages that I have learned, I learned first for specific purposes. Perhaps one could insist on a practicum.
Finally, the CS professors you refer to sound like the head of the French Department in Nabokov's novel Pnin, whose qualifications were that he had no French and hated literature. Any but the most ruthless schools will accumulate some of these folks.
Haskell. Useless language which teaches "the theory of programming" to people without actually teaching what you are programming. Or why.
When I did my non-IT degree (Biology!), we learned how a computer worked, what the bits were for, and were given an enthusiasm to tinker and break things to see what would happen. This isn't just an "in my day" statement, but today's CS students really are the worst level of IT tech - not even call centre material. I regularly hear CS students ask what an IP address is, or a proxy. Some even call their base units "hard disks". It is pathetic, and the only answer is to stop funding CS departments - merge them back into Mathematics and Physics where they came from in the eighties. Maybe then they will get back to teaching enthusiasm for technology.
Anon - I work at a uni.
I totally agree with everything said here. Unfortunately IT within schools is usually focused on using computers only as a glorified calculator or typewriter, so even though computers are used a lot more widely than they used to be, they are also used in a lot more narrow ways. When I was at school in the 90s we were all taught LOGO, which is a great introduction to computer programming and the logic needed for understanding algorithms.
By the time I came to my GCSE & A levels I personally had dropped IT as an official subject, as it was by then more about using computers than really understanding them. At university I studied electrical engineering which did have a fair amount of programming, as well as requiring understanding of the underlying principles of computers, namely how logic works.
I feel your pain and frustration in finding people to fill IT roles. It can be very hard to find someone to fit an open position, regardless of the level of salary being offered. Formal education is quite a poor indicator of suitability as I find many great candidates have never studied IT/CS formally, but instead have done engineering or mathematical degrees. Equally a lot of those with CS degrees just don't understand the subject well enough and have only learned very narrow skills, such as programming in a particular language (and then generally only well enough to pass coursework assignments) rather than having the much wider understanding and skills which are a lot more useful.
I would much rather employ a "general" programmer who has sufficient breadth of understanding & drive, who can learn the specifics of the language/environment I'm needing than someone who only knows Java or PHP and would be totally lost if you asked them to look at something written in a different language.
Having recently interviewed a number of CS grads for a networking post, I was dismayed at how little they knew about the fundamentals of networking.
One applicant had Networking and Advanced Networking modules on their CV but could not explain the difference between a router and a switch.
I also agree that unlike some subjects a CS grad has to have a genuine interest in the subject. Maybe when Universities start having to publish REAL employment stats they will worry about having relevant course content.
1950. You leave school at 16 with the obligatory 2 O-levels and a budgerigar. You go up to the local factory and learn the mechanical skills that allow you to be a fitter, or an electrician or a welder. That's all you learn and spend the rest of your working life performing that one function.
2010 You leave school at 18 with 4 A levels and the obligatory iPhone. You go to the university and learn the mechanical skills to write code in Java or C#. That's all you learn and you spend the rest of your working life flipping burgers, as there are 100 million far-eastern programmers who can do exactly the same, but for one-third the pay.
The conclusion is that simply learning a computer language means nothing - anyone can do it and it's not especially hard to do. Spending three years in a classroom and getting into debt is a particularly inefficient way to do it - especially when you consider that the stuff you learned in your first year will be almost obsolete by the time you graduate and really, it's only your final year's worth of learning that comes close to what the grown-ups in business will want from you. As it is, the distinction between a programmer and an office worker who has read "Excel for Complete Idiots" is somewhat slight, and it's often difficult to tell which one is more valuable to a business. So it should come as no surprise that recruiters are less than impressed by the words on your CompSci degree certificate and care more about how you answer real, practical questions in the job interview.
All the degree does is get you in front of "the guy", although not as effectively as a funny handshake, or the ability to play golf.
This article opens with a comment about algorithms, but then goes on to discuss programming languages. Programming languages, and the relative usefulness and popularity thereof, are not what computer science is about (or at least weren't when I got taught).
Programming languages are merely instances of programming paradigms (of which I learnt several). Algorithms span languages (and most of the time paradigms). Most real-world problems can be reduced to one or more well-understood algorithms, and each of these algorithms offers different overheads. All of this can be worked out without ever turning on a computer.
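A throwaway illustration of those overheads - the same membership question answered by two textbook algorithms, sketched in C++ purely for concreteness, though the point is language-independent:

    #include <algorithm>
    #include <vector>

    // O(N): inspects elements one by one; no precondition on ordering.
    bool contains_linear(const std::vector<int>& v, int x) {
        return std::find(v.begin(), v.end(), x) != v.end();
    }

    // O(log N): far fewer comparisons, but only correct if v is already sorted.
    bool contains_sorted(const std::vector<int>& v, int x) {
        return std::binary_search(v.begin(), v.end(), x);
    }

Choosing between them is a pencil-and-paper decision about the data, not a property of whichever language they happen to be written in.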
What the author seems to be complaining about though is not people merely missing this knowledge (and a lot are missing it). He seems to be complaining about the lack of ability to apply this theoretical knowledge, and the lack of experience of mucking about with computers just for kicks.
I don't blame the universities for that - I blame the sealed slick computer experience that is Macs, Windows, Facebook, Google etc. It is entirely analogous to the mechanical knowledge that is disappearing due to cars being only fixable by the garage, and consumer electronics so cheap that there is little point finding out how they work. If a 9 yr old knows how to program then good on em, but the days of the bedroom programmer are gone.
We (at the University of Groningen, the Netherlands) teach algorithms, and different programming paradigms (imperative (in C), OO (in Java), functional (Haskell), etc). This teaches them how to learn a new language. Thus, if parallel computing can be done more effectively in FORTRAN90 on some machine, or a project is coded in C++, we expect our students to get to grips with those languages in their own time (and they do). We also teach the inner workings of computers, and networks, databases, you name it. Typically, about 95% of students find a good job within one month of graduating. The rest first take a holiday.
We do however include presentation and writing skills, and ethics as integral part of our "hard core" computer science courses, because it IS important that CS grads express themselves well, and have some sense of ethics, as applied to CS.
In terms of algorithms, Java (and its extensive class library) and C++ (with the Standard Library) are equally culpable in allowing programmers to wield powerful algorithms without any understanding. Hell, I didn't even learn a real language when studying algorithms (and those all-important data structures) back in 2001; all we used was the meta-language Hoare created.
Anywho, let's get back to blaming these damn youngsters for not having the skills to create jobs and nurture talent before they have even got a foot in any doors anywhere.
My experience is that Macintosh and Windows users (if that's all they use) do not see "computing" as a verb, which is fine -- unless they're looking for a career in I/T.
Couple this w/ a general dumbing down of the populace and a "lowering of the bar", not to mention (oh, someone already has) a total focus in secondary education on training students to get jobs rather than educating them, and the end result is CS graduates that can't spell "IP address" much less statically assign one properly.
As a working class kid who paid every cent of my own books and tuition, I understand the importance of having marketable skills when you graduate (particularly these days), but many of those courses they made us take because they were "good for our character", actually turned out to be. (Even if I did rename one PHI 102 class "Intro to Critical and Useless Thinking" at the time).
OTOH, a lot of that old-school CS curricula were also overly theoretical and in 30 years in the field I've yet to find a use for some of it.
I manage a group of four IT people, none of whom (myself excluded) have a CS degree, and I wouldn't trade them for a PhD in CS w/ a string of certs.
It's the "esprit de geek" that makes the difference.
"Although I’m cynical about all this, I still was shocked by the revelation that CS isn’t actually a popular subject among teenagers smart enough to do it properly."
I blame Dilbert. Every day there's a cartoon detailing in excruciatingly accurate detail exactly why a job in IT is a pointless endeavour.
Seriously though, teaching standards do leave a lot to be desired. I was surprised when I asked my five year old son what he had done at school and he told me "We learnt about relational databases." I didn't know what a relational database was until I was about 16.
But I was even more surprised to find out that they've made no attempt in the last 4 years to take his knowledge past the concept of using computers as tools to do boring stuff. There's no explanation of how they work, or how these magical programs appear on a computer in the first place.
I've started teaching my son to program - he can write reasonably complex stuff in C, C++, and Java - and he's shown huge improvements in other areas because of it. For instance, he's now a year ahead in mathematics, and the improvements in his logic and reasoning skills have drawn lots of comments from his teachers. The best part is that he loves doing it (he's a bit of a control freak and, unlike people, computers don't get moody when you tell them what to do!)
While I don't agree that everyone should be taught programming skills, I do think that teachers should be spotting kids with the necessary skills and motivation and giving them differently structured lessons.
Up until a couple of years back I did a lot of hiring. I would always choose the people with the right attitude (aka the bushy tail factor), a natural curiosity and the ability to learn. I hardly ever cared what languages someone knows.
Someone with the right ability to learn can easily pick up a new language.
Someone that lacks the ability to rapidly assimilate a new language etc is a waste of time. In this industry everything is changing, and if you can't handle change you are useless.
I once had to hire someone to do embedded firmware development in C, writing drivers etc. The guy we eventually chose was a business grad with Visual Basic skills. No familiarity with C. He learned everything really quickly. Within a month he was reading schematics and writing device driver code in C, better than many people who had been doing it for years.
We did (and UoD still does):
C/C++ (up to programming a binary tree - see the sketch just after this list)
Java for GUI stuff
a bunch of web authoring (sadly no PHP)
C++ OpenGL stuff
Networking (looking at data packet structure, parity etc)
Computer architecture (registers, caches, buses, queues etc)
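For the curious, the sort of thing "up to programming a binary tree" means - a minimal sketch in C++, illustrative only and not the actual coursework:

    #include <memory>

    // Bare-bones binary search tree: insert and lookup only.
    struct Node {
        int value;
        std::unique_ptr<Node> left, right;
        explicit Node(int v) : value(v) {}
    };

    void insert(std::unique_ptr<Node>& root, int v) {
        if (!root)                 root = std::make_unique<Node>(v);
        else if (v < root->value)  insert(root->left, v);
        else if (v > root->value)  insert(root->right, v);   // duplicates ignored
    }

    bool contains(const Node* root, int v) {
        if (!root)            return false;
        if (v < root->value)  return contains(root->left.get(), v);
        if (v > root->value)  return contains(root->right.get(), v);
        return true;
    }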
I found no difficulty picking up C# during my 3rd year placement and using it in my honours project (along with php) and even iOS programming wasn't too difficult to learn.
Lots of courses offer a final year project and students can generally use this as an opportunity to pick up a language they haven't been taught.
Wikipedia fails to mention that nobody except sheltered academics actually uses the *bibyte terms for anything, because they sound like something a dim three-year-old would babble. Here in the actual IT industry, I find people generally prefer instead to go on as we've always done: talking about kilobytes and megabytes and meaning 2^10 and 2^20 bytes respectively, specifying quantities on the rare occasions when it's actually worth asking "excuse me, is that a 2^10-byte kilobyte or a 10^3-byte one?", and laughing ourselves silly at people who actually try to get terms like "kibibyte" and "mebibyte" and "gibibyte" into live usage.
No it isn't. Not more so than in the days of yore, anyway. So a HD 3.5" "stiffy" has 2880 sectors of 512 bytes each. How many megabytes, base 10 or base 2, would that be? Go on, do the math.
What _is_ lacking is enough knowledgeable people insisting that base 2 units are used where appropriate. With or without SI silliness--which I don't see being used except by some annoying fanbois. You can run and hide among the ibis, but soon a marketeer will take notice and use the terms, only with the base taken to be 10. What was true in the sixties is true now. Marketeers lie. "For marketing reasons."
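Worked through, for anyone who doesn't fancy doing the math in their head - and it rather makes the marketeering point:

    2,880 sectors x 512 bytes = 1,474,560 bytes
    1,474,560 / 1,000,000     = 1.47 MB   (base 10)
    1,474,560 / 1,048,576     = 1.41 MiB  (base 2)
    1,474,560 / 1,024,000     = 1.44 "MB" (the 1000 x 1024 hybrid actually printed on the label)

The only figure anyone remembers - 1.44 - uses neither base consistently.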
> http://en.wikipedia.org/wiki/Kibibyte
>
> vs.
>
> http://en.wikipedia.org/wiki/Kilobyte
SI units are meant to be more practical and convenient than their traditional counterparts. They aren't meant to be just some other random fiat handed down from on high. If that's all that they really are, then there's no point in using them versus traditional units.
If you think that a base 10 definition of a computing quantity makes sense then you suffer from the same very superficial level of understanding that the article was complaining about.
Anything that increases the number of significant figures I must use is loaded with fail.
...you can take your 'kibibytes' and shove them up your arse.
Any word, technical or otherwise, in any language only means what everyone thinks it means. The meaning doesn't change because the OED, the IEC or bloody wikipedia decides it means something else.
You can take my 1,024-byte kilobytes out of my cold dead hands (thinking about that earlier post, maybe I have spent a little too long writing assembler after all).
Students seem to be told that there is a set mechanical way of getting through life, and then they're shocked when life actually comes and bites them on the IT.
University has become what it was never meant to be. Rather than being a place where those who had learned how to think during the A-level years started being innovative, thinking up wild ideas and dreams and trying to make them happen, it has become a place where young people can learn how to doze away a few more years. Universities, educational laws and the touchy-feely PC society don't exactly help: where students should be shoved into a scary arena and whirlwind of ideas, the system now fits them into a nice, dull and straightforward regime of exam-learning - which is the sort of thing that should have been done, and left behind, at the GCSE stage.
My computer science course included, inter alia, FORTRAN, COBOL, ALGOL 68 and LISP as well as C and machine code (entered using toggle switches and paper tape) for a variety of different hardware architectures - though most of the course was taught in a language not much used outside the department. All exercises done using a line editor and batch compilation. And it lasted only two years and managed to throw in a fair bit of hands-on hardware lab time building circuits on breadboards as well as looking at algorithms, data structures, numerical analysis and compiler and operating system design.
I'm not sure how a three-year course with modern development tools manages to fit in so little - but I have had the living proofs sitting in front of me bereft of insight when Intellisense has failed to come up with the right answer.
Of course, this is what comes of treating Computer Science as an IT training course. I can understand some former polys being demand-led by students wanting to come out equipped with the buzzwords that vacuum-headed recruitment drones will understand. However, I'd have hoped the more reputable universities might have held out and actually taught principles before practice.
FYI: Algol 68 is available at sourceforge. (Including an easy interpreter and the ELLA Algol68RS translator)
It is interesting to see how much has "advanced" in IT since the Apollo 8 crew Borman, Lovell and Anders first eye-balled an Earthrise in December 1968!
Enjoy NevilleDNZ
I guess I'm still a young(ish) comp sci grad - 26. Fortunately my dad was a geek and I grew up programming BASIC, VB, telnetting to random Korean servers etc etc.
Computer Science A-Level at school was pretty decent, far better than any of the ICT qualifications (we actually did assembly), but the first year of university was so basic. An hour's lecture on accessors and mutators in the first week meant I skipped the majority of lectures and just got down to completing assignments the night before they were due.
We did the majority of the course in Java, had small experience of C++, did operating system internals with OCCAM, algorithms in Haskell, some SQL, systems architecture etc. I guess it is a reasonably good course, with some interesting options, but probably not enough emphasis on what happens in industry or developing each skill past 1-2 short assignments. I skipped the year in business, probably a mistake!
Then I went into teaching... Yes all the ICT courses love office applications, little opportunity to do programming in the school's syllabus. Bored me (and the kids) to tears so I got out.
Annoying part is, most IT ads ask for 1-3 years' experience in a language/technology. Good comp sci grads should pick up the fundamentals of a language in a few days, so is a year's experience really required? I don't match most of the requirements for most jobs but I'm pretty sure I could do them.
Just joined a company under the grad scheme so hopefully I can start building up all this 'required' experience.
Another student here who has had the opportunity to learn plenty of different languages and algorithms.
I won't deny, we did do some Java in our first semester of Level 1 but by second semester we were already deep into C/C++. We're currently coding on Playstation 2 Linux and next week begin Low Level programming on them (Roll on the frustration). The only Java I've seen in the last three years was during my brief flirtation with Android last summer.
I have no complaints and neither did my employer during my internship. Seems like Staffordshire is doing a good job then.
Was just wondering what had become of the teaching there.
I did a Bus/IT mix in mid 90s, but remember the bewildering (to me as a non-programmer) array of languages the CS guys had to learn. Wasn't my forte at the time (tho did a load at A Level in the 80s), but those who studied CS relished it.
Now, having abandoned Bus & Marketing for IT, wish I'd maybe chosen some programming modules. tho my course did me no harm.
Have one for me at the bar. :)
nK
Where do you start? Scratch (http://scratch.mit.edu/) seems like a reasonable place to start these days, and I was happy to see that it is used at my children's primary and secondary schools. I did also try teaching my daughter some perl, which she found interesting, but she would generally rather be playing Club Penguin...
You do need the passion in programming and tech to keep on doing it well. I know it is a bit unfashionable to be a “geek” but for certain jobs being a geek is a good step up ( though possibly not in UI design).
I started dicking around with computers in my teens while still at secondary school. Lucky enough to have a Dragon 32 with a DASM Demon cartridge, I wrote my first assembly invert-screen program. From then on I was addicted.
Enjoyed hacking Lords of Midnight on my Spectrum for lots of magical items and instant teleportation (I got a military victory before hacking it). Later on soldered extra RAM into my Atari ST (remember Blood Money?).
Later on did some electronics, then a diploma in systems software, leading on to real-time C, then C++.
Now I do C++ and C# in the day time job.
But last night I was soldering relays to the Panda Fez, progressing the automated cat flap project ( though doing it in C# just seemed like cheating ). Sometimes you are just hooked!
The tone is bang on; grads who *only* know java are going to be pretty useless regardless of how highly (or not) you value the language.
How about grads that have *never* used an SCM?
When I'm recruiting (doesn't happen often; I'm lucky enough to have a pretty good retention ratio):
a) Anyone who doesn't know how to use styles in MS Word goes in the bin
b) if you pass that, then you get a go-away-and-do-it test. I don't need people who know the syntax of Java; I want people who can construct a solution based on a statement of the problem.
c) Then I interview you down the pub; because after a+b, it's all about the fit in the team.
In my four years at university, there has only been one lecturer who has taught a technical subject knowledgeably and articulately and kept it interesting; the rest have mumbled their way through lectures and sent half the class to sleep by the end of the first half. It isn't good enough, not when people are spending £3k a year on fees.
In four years, we have had a whopping 3 modules on programming. There is no algorithms class (though half of an AI class is devoted to algorithms), nothing on web programming, nothing on operating systems. A core module is entitled "Organisational and social aspects of computing", which teaches us practically nothing of real-world value.
We were taught Database management using Access as the DBMS in the practicals.
In defence of my fellow students, despite only being taught Java (inc. Groovy), almost no-one I know is using Java to code their final projects.
"In defence of my fellow students, despite only being taught Java (inc. Groovy), almost no-one I know is using Java to code their final projects."
You should use the tool appropriate for the job.
The light-saber may be an elegant tool, but in order to avoid scarcity of time or falling into the trap of application details, a more proletarian blaster may be needed.
This article seems a tad exaggerated. I finished my CompSci course 3 years ago at Essex and whilst, yes, Java was the main language of the course, we started with C first and had options to also do C++, C# and Ruby as part of the syllabus.
I didn't do the C# module but now write it professionally, due to the fact that it's near identical to Java. So this idea that somehow learning Java doesn't prepare you for Microsoft jobs is a bit silly.
At my company, though, we do have major problems finding new developers - mainly when trying to find seniors; we get a lot of people who claim to have 10 years' experience but can't do anything useful.
The main problem at the Unis though is the lecturers. Much like most Project Managers are failed Developers, so are University lecturers (on the whole). Why would you get paid ~£25k to teach programming when you could easily get twice that doing it?
"The main problem at the Unis though is the lecturers. Much like most Project Managers are failed Developers, so are University lecturers (on the whole). Why would you get paid ~£25k to teach programming when you could easily get twice that doing it?"
Why? Because I am passionate about science and do not want to be told what to investigate (besides, a lecturer earns quite a bit more than £25k down here). My brother works in (non-ICT) industry in R&D, and is sometimes exasperated at the fact that management almost always looks at short-term gains, not the long-term benefits of research. I can suddenly decide to see if I can improve someone else's algorithm by an order of magnitude, without EVER having to explain this to anyone. Provided I get interesting results that I can publish in leading journals, nobody complains.
Just recently I decided to branch out into satellite image analysis, and collaborate with some guys in Italy. When you get the wall-clock processing time of a rubble-detector for earthquake relief efforts, for a 1.5TB data set from the Haiti earthquake, down from 34,000 years to 1.5 hours (by first going from a naive O(N^2) algorithm to O(N log N), and then parallelising it), this is REALLY cool. I could not make such snap decisions in industry.
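(A rough sanity check, with numbers that are purely my own illustrative guesses rather than anything stated above: 34,000 years is about 3 x 10^8 hours, so the overall speedup is roughly 2 x 10^8. If the detector works over something like N ~ 10^9 items, going from N^2 to N log2 N buys a factor of N / log2 N ~ 3 x 10^7 on its own, and a further factor of ten or so from parallelising across cores gets you the rest. The big win really is the algorithm, not the hardware.)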
Many lecturers here have a similar passion for research. In many, this passion spills over into teaching.
Spot on. Have been looking at senior schools for my daughter and I was horrified to learn that not one of them (including some very good grammar schools and top independents (to cover the full bursary angle)) teaches CompSci AT ALL - well, one does for A-level only, and the teacher was very proudly showing some of their coding work... space invaders!!! The sort of coding I was doing on my Speccy at the age of 12 in 1984. They only teach ICT (Word, Excel, Adobe). Excuses ranged from "we don't want to teach a subject that is hard because it will affect our league table ranking" to "we'd love to but we can't get decent teachers" to the most gob-smacking "we're not interested in teaching programming because we think Word is enough".
When I left Uni, some of the freshers coming into comp sci had no interest in or experience of computers, as mentioned above.
Commenting not specifically programming, but technical jobs in general:
I was a lecturer and the biggest problems I have seen are related to motivation. I have countless stories of students who "can't be bothered", they aren't self-motivated to work hard and get results. Yes, it is the duty of a lecturer to inspire, but when good students aren't even bothering to go for decent qualifications in the first place, or drop out within the first week because it is a bit difficult getting up in the morning, then what are you to do?
Yes, you could blame funding, but funding will always be tight and you have to make do with what you have to hand. In my experience senior academic administration doesn't help, being administratively disorganised and unappreciative of industry requirements. A great many of the "employability" courses that are added to the curriculum are useful; however, sometimes a course leader will end up putting things in there to meet some obtuse requirement that made it into the validation documents, and also to avoid the hard grind of teaching any one subject in depth.
Mainly though I think we need to look more at family and the messages that society brings than the courses themselves. British society doesn't tell you to aspire and to work hard, it tells you to win something and/or become a 'famous' 's'leb'. Betterment is seen as discriminatory and deprivation is a right for individuals as much as a burden on society. People shouldn't be forced to be better, they should be allowed to stay squalid. Nobody should have to actually do anything for the money they *deserve*; they should just turn up for a few hours, surf Facebook and get paid an executive rate. *Of course* the poor *deserve* their iPhones, designer trainers and lifetime supply of JJB sportswear!??
I highlight the bottom of the market because that also reflects all the way up to the middle class and beyond. The British *CBA* attitude needs to be tackled as much as anything else. Then perhaps students would be self-motivated to tackle the big challenges and to make more of the wealth of opportunity that being in Britain has to offer.
"Mainly though I think we need to look more at family and the messages that society brings than the courses themselves. British society doesn't tell you to aspire and to work hard, it tells you to win something and/or become a 'famous' 's'leb'."
Personally I always took away the lesson that if you have money/connections you will go far; if your parents can pay for tutors and access, guilt you about the amount they are spending, and reward you with a brand new car to take you to the uni flat that they bought and which you will "rent"... then yes you will go far since you either find motivation or top yourself over the pressure.
The professions are now more of a closed shop than ever despite years of Labour government, so unless you have a close relative able to look out for you, you might as well debase yourself before the Cowell, because all the motivation and brains will only get you so far under your own power, and will likely bring you close enough to see the ass-masters browsing Facebook at an executive rate.
The good news for compsci grads is that IT is barely a profession any more thanks to the jaw-dropping efforts of the BCS, but it is still bad news for those wishing to break into law, accounting, banking, academia, medicine, etc.
Teachers seem to think that Microsoft is the be-all and end-all, and as a result tend to teach kids how to get an MCSE, rather than an MSCE ... Unfortunately, the former consists of learning to pass a series of tests, instead of actually learning the subject matter, as in the latter.
"and who are thus choosing between a degree in CS or accountancy"
I wish I had done Accountancy instead; I would have been chartered by now and earning a small fortune for doing something which only required diligent, albeit boring, work.
Instead, I'm now doing stuff which requires diligent albeit boring work for far less than the accountancy monkeys with similar number of years in the job.
"It therefore attracts a lot people who cannot write English well enough to do an arts subject"
Like, for instance, the author of the article? (And since any speling or grammatical flame should itself have misteaks, I'm doing it deliberately.)
Not that real programmers have ever been noted for their ability to use English. About the best you can say of their writing is that it's probably more legible than that of doctors, probably because it's more likely to be typed (or in my day in capitals so that the punched-card girls could read it).
I graduated 32 years ago (ye gods!), and even then most "CompSci" courses were tending to focus on applications rather than algorithms, and many seemed to be assuming that hardware and operating systems were magically "just there" and could be ignored (Pascal was at that time the CompSci language of choice, a language which by design read all its input before it started and wrote all its output after ending, thus making it impossible for any interactive or real-time work; the extensions to get round that are really horrible). Having seen the type of graduates since I'm not at all surprised that they can't get work.
Even worse, they tend to lie about what they know on their CVs. They "tick the boxes" for Java, C, C++, etc., but when you actually ask them simple things (like what is meant by an abstract class in an OO language) they don't even know stuff which was supposedly in the syllabus. Never mind 'hard' things like memory management...
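(For anyone rusty on that particular 'simple thing': roughly speaking, an abstract class declares an interface with at least one method it doesn't implement, so it can only be derived from, never instantiated. A minimal sketch in C++ - the question works just as well in Java, and the class names here are invented purely for illustration:)

    #include <iostream>
    #include <memory>

    // Abstract class: it has a pure virtual function, so it cannot be instantiated.
    class Shape {
    public:
        virtual ~Shape() = default;
        virtual double area() const = 0;   // pure virtual - derived classes must provide it
    };

    class Circle : public Shape {
    public:
        explicit Circle(double r) : radius_(r) {}
        double area() const override { return 3.14159 * radius_ * radius_; }
    private:
        double radius_;
    };

    int main() {
        // Shape s;                                  // error: Shape is abstract
        std::unique_ptr<Shape> s = std::make_unique<Circle>(2.0);
        std::cout << s->area() << '\n';              // dispatches to Circle::area at runtime
    }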
Due to it being too hard for most of the plebs to do.... and replaced it with a course which culminated in creating a Flash button that played a sound on click.
Then fired the academics who actually knew comp sci and replaced them with some media wonks who encouraged people to add them on Facebook and follow them on Twitter.... WTF, bugger muppets is all I can say.
Bear in mind I graduated with people who still struggled with anything more complex than an if-else statement. But from my experience that is actually quite a high degree of competency for a graduate.
[graduated from a uni just off the m6 between Birmingham and Manchester but south of Keele....]
Whatever University it is, I'm completely shocked by their description of their "Computer Science":
"Computer science is the study of computers – the physical components (hardware) that make up a computer, the operating system that allows programs to run and the software (programs) that control what a computer does. You will learn how these work together and interface with the world outside the computer, to make computer systems such as desktop systems, games consoles, computer networks and intelligent controllers for production lines and aeroplanes."
That sounds more like a cross between (basic) computer systems engineering and ICT. Where are algorithms? Discrete Maths? Data Structures? Functional languages? Compiler design? Formal language theory, including grammars and parsers? Number theory?
Dammit, proper CS is pretty much a maths degree, not a blimmin' "How a computer works" course.
I did software engineering rather than CompSci, but there was all sorts in it, a large amount common with CompSci, but in much more depth. It may have been the uni, but compsci was seen as the simpler option compared to the other IT courses :)
Maths and logic were a large part, but it generally covered the whole process of creating software, not just programming and computer hardware. I studied programming concepts (languages were something you had to just go away and learn if you wanted to; you were just expected to know several, no specifics), how they are implemented, algorithms and so on. I also studied a large number of ancillary subjects: psychology and the human cognitive model (user interface design), law (data protection etc), speech (language processing and simulation) and so on.
Maths should be a part, but not necessarily the majority; if you want to be a hardcore low-level programmer then yes, maths is a large part. But if you realistically want to create anything in a development team, there is a lot more varied knowledge required.
2 modules on algorithms, one is required.
2 modules on processor design, one is required.
1st year programming taught in Haskell.
2 modules under the title Logic (natural deduction, lambda calculus etc.), both required.
Database first principles as a required course.
2 modules on OS design, one required, including practical coursework requiring you to re-write the Minix debugger.
I'm pretty sure that even without a "C++ programming" module IC CS grads think in the right way.
Yes, Concurrent Systems is taught with Java as the implementation language, but I'm pretty sure that the principles of avoiding deadlock and using mutexes work in most languages.
Since most of IC's CS grads work for banks in the City, I should think they are doing something right w.r.t. employability, so maybe other universities should take note?
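(Quite so - and to underline that the principle really is language-neutral, here is a minimal sketch in C++ rather than Java. The account/transfer scenario is invented for illustration; the point is simply that acquiring both mutexes in one operation, or always in a fixed global order, is what stops two opposing transfers from deadlocking. std::scoped_lock does the former using a deadlock-avoidance algorithm:)

    #include <functional>
    #include <mutex>
    #include <thread>

    struct Account {
        std::mutex m;
        long balance = 0;
    };

    // Locks both accounts' mutexes together, so two concurrent transfers in
    // opposite directions can never each hold one lock while waiting for the other.
    void transfer(Account& from, Account& to, long amount) {
        std::scoped_lock lock(from.m, to.m);
        from.balance -= amount;
        to.balance += amount;
    }

    int main() {
        Account a, b;
        a.balance = 100;
        std::thread t1(transfer, std::ref(a), std::ref(b), 10);
        std::thread t2(transfer, std::ref(b), std::ref(a), 5);
        t1.join();
        t2.join();
    }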
The author seems to contradict himself, bemoaning the lack of a broad set of languages, and then highlighting the fact that "a good programmer knows that it is how you think, not the language you code in, that determines your ability." What does the author want?
There are a number of problems with teaching CS in higher education. The author highlights a number of these, but then blames the universities for not handling these problems, instead of looking to the source of the problems in the first place. For instance, universities can't assume that any new entrant has any foundation in aspects of computing, and must spend significant amounts of time bringing them up to speed. The effect is less time teaching the higher-level concepts that the author wants. The true solution would be to ensure that secondary education is up to par for entrance into university in the first place, to allow more time for more languages, deeper concepts and further specialism, for example.
The secondary education systems (there are more than one in the UK, after all) need to address the deficits in their curricula for CS, both in terms of showing the opportunities there are for careers, and the wonders of the science itself, to get that passion. The universities can only work with what they are given. Whilst yes, they could probably work -better- with what they are given just now, ultimately there are more parties at play all the way from the first steps in education that are just as important, if not more so due to the length of time someone spends in pre-higher education.
Lastly, 'Ethics' and 'Computers and Society' are some topics that are dictated by the BCS for accreditation.
I agree in principle with everything the author decries about graduates currently, but the universities are not the only ones to blame - the image of CS as a whole needs a facelift from everyone involved and at all levels. Industry needs to highlight their needs to universities and the BCS, the BCS and universities to secondary education, and secondary education to primary education. And any shift will take a decade to really see.
Two things:
1. He wants employers to think, "this is the conscientious, uncompromising recruiter for us".
2. He wants those graduates who feel slighted by his article to get in touch to "prove him wrong", with the fortunate side effect that he has lots of bums to stick on the vacant seats owned by the employers mentioned at 1 above.
You expect people taking non-vocational courses to have vocational skills? How about *gasp* on the job training? Rather than moan about the lack of skills people have, you could do something about it.
There's nothing more depressing for a graduate web developer to be faced with the following list of 'essential' skills: HTML, CSS, Javascript, JQuery, Flash, PHP, ASP, c#, SQL, active directory, Linux, Exchange, active directory. That's for a junior position (2 years minimum experience needed naturally).
Chances are, several of those won't be used or will barely be used; the rest can be picked up fairly quickly if you've a decent set of base skills. However, requiring someone to be skilled at all of those before they start is stupid (unless you're offering a highly paid position).
Advertise that same position with the bare minimum base skills (markup stuff + the main language they'll use), you'll get 3 times the number of applicants to choose from and you'll find it far easier to find someone not only willing to learn, but willing to do the job for £5-8K a year less than what you'd pay an experienced coder.
Plenty of workplaces are perfectly willing to employ people with a minimal knowledge of Office and train them up; a car garage wouldn't expect someone just out of college to have a huge wealth of knowledge. Why, when it comes to the 'pure' IT roles, is there such an unwillingness to provide the training? It can take 3-6 months for someone to gain decent Excel and Word skills, so why is it so much more unreasonable to allow 3-6 months for a Java programmer to pick up C skills?
"There's nothing more depressing for a graduate web developer to be faced with the following list of 'essential' skills: HTML, CSS, Javascript, JQuery, Flash, PHP, ASP, c#, SQL, active directory, Linux, Exchange, active directory. That's for a junior position"
TICK THEM ALL, THEN B.S. YOUR WAY THROUGH. LOOK UNFAZED BY ANYTHING. LEARN ON THE JOB AS NEEDED.
That's how french candidates do it, personal experience.
The interviewer will have no idea what these things are anyway.
Will anyone even admit to being a "computer programmer" these days? I use this term routinely as it most accurately reflects the skill which defines my job, whether or not I'm actually doing it or planning/managing/talking about it. I've met people with similar jobs say things like "programming is not the most important skill for us, it's communication".
When even the geeks are running away from themselves we're in big trouble.
And don't start me on "Developers, developers, developers"...
w.r.t the whole language debate, whilst I agree that CS students should be exposed to memory management, from what I recall from the early nineties there actually wasn't that much space in the syllabus once they'd shoved in all the esoteric stuff that the academics loved (functional languages, formal logic, ...). Are we teaching computer science or software engineering?
...maybe 5 years ago now. I finished my BSc in CompSci without any real distinction. The lecturers at my chosen University, as mentioned above, were fantastically knowledgeable in their subject areas. Sadly, none of this came with any real ability to present the information they knew in any other medium other than 'Here are some Powerpoint slides. Let me read them out to you.'
As a consequence, I didn't really garner any sort of love for the subject, or in fact end up caring about it that much at all, wondering whether I'd made a horrible mistake. This is both my fault for not really engaging with the lecturer in the first place, and theirs for making the delivery of what is a fundamentally interesting topic so mind-bummingly tedious that no-one ever listened to half of what was going on.
It was only when we started doing project work that it became necessary to know what was going on under the hood. And that's when stuff got interesting (for me at any rate). I found I enjoyed looking up stuff off my own back far more than I should've done. I learnt how to put good, efficient, readable code together, not due to the diligence of my tutors, but because I read shitloads of online articles, books, and forums. As such, when I finished my BSc course with a Desmond, and my compatriots flooded into the Job Market, I stayed on, studied an Msc in a specialisation I was interested in, and within a month of leaving had a well-paid job at a multi-national, doing interesting work in multiple languages.
The point here is twofold. One, it doesn't matter how interesting the topic looks on paper, if lecturers can't deliver in a well presented and interesting style, then they may as well just email out the Powerpoint slides. You can slam the Grads all you like, but they are a product of a broken system. Make it worth their while to do background reading, and research around a topic, and you might find that suddenly, they're not so biblically apathetic. Two, at the end of the day, you really learn to develop and engineer in your first job. What is on your CV before that point is enough to get you through the door and to a desk. Exposure to professionals who've been doing the job on a day-to-day basis for years is when you really start to learn how to get to grips with the subject. I work with fantastic people who know their stuff inside and out, and I've learnt more from them in 4 years than I ever did in a university hall. However, without the years spent in lectures I never would've got through the door in the first place.
Being one of these geeks that is a product of the post-1990s ICT (my fingers feel so dirty from typing those three letters!) education system... I couldn't agree with you more! The education system needs to be reformed so that REAL members of the IT community can teach the next generation, rather than an English teacher who simply fell into IT because the school had too many English teachers! Those were some terrible times I had!
I decided to skip uni purely for this reason and start from the ground up, making tea for the IT department. Six years on I'm managing my own IT department, and the only thing I put it down to is my passion for IT (and some truly great IT mentors). I still spend my spare time browsing Amazon for IT books to learn from, and have broken my home servers countless times!
I still get CVs from people (I'm too young to even say kids!) who are the same age as me but have nothing in their CVs that truly makes them stand out! I have got to the point where I now just sit in the interviews and ask them if they do anything IT-related outside of their working time... if they don't, they are out of that door faster than the French barricading a petrol station!
I am almost ashamed to be a member of the next generation's IT department.
(Grenade, as that's what I feel like posting to each CS grad after typing this)
The problem is the differences between requirements and expectations.
Today's "computer" courses, from the primary school all the way up, are for learning to use them - the driving lesson approach, if you like. That's great, but they are not what the industry needs.
The need is for people who can build, maintain and fix the "cars" - the engineers, if you like - and those courses are simply not taught on a formal level. Even in the late 80s, doing a BSc in "Computer & Communications Systems", most of us were self-taught through tinkering and BBS life.
Yes, I did an Electronics degree with a Computing side module. (Which in electronics tends to mean embedded systems, or at least it did in the early 1990s.) During this, I learned computer internals, two variants of assembly (6809 and Z80), Pascal, C and a little C++. And it was fun! This was one module! As an aside, because I found it interesting, I did some VAX assembly language and some sockets stuff in my own time. Ah, fun!
The latest batch of CS graduates appear to have either a bit of Java or C# and, if you're very lucky, occasionally some SQL. Most of them seem to have skipped the bits on things like proper exception handling, and some of them can't even debug properly. I am honestly at a total loss as to what some of them did for three years!
AC to protect the guilty.
Much of the spirit of this article is correct but it's also narrowly scoped and fails to apply to most of the job market. There's more out there but let's stick with jobs for which the main candidate background is Computer Science. Not all of these are about algorithms or maths. I'm barely competent at either but short of the Google recruitment process, I've never had much use for them in my line of work.
A lot of jobs, like mine, are all about engineering *within a business* - the ability to work in teams, interface with customers, produce usable systems and be able to write English in a business context. Provided you can show the aptitude to learn, then all of this is immeasurably more important than the specific programming languages or technical detail that you know right now. It's this where so many of my peers fall down, and the bar you have to pass really isn't that high to begin with. In my opinion, this is why there's a lot of grads unable to find work.
I went to a 'good' university, and of the more 'social' ones at that. I felt the technical content was fine but very little of the material - aside from a year in industry - prepares for what a functional business is looking for from you. Placements should be mandatory and the softer side of it all should be emphasised, because there are very few jobs aimed at the purest of geeks.
As a CompSci grad myself now working as a Java developer I agree with Pete 2's points. If you want to be an engineer as I now am, go learn engineering - don't go to university to learn Computer Science. The key's in the name - it's Science. A programming language is an engineering tool, it's just a way of getting machine code written, and nothing to do with science. When I left university I was expected to be able to program in any language, be it object-oriented or functional or procedural or whatever, as it's just a question of learning syntax and implementation details, and not relevant to the solution to the problem at hand.
The problem is not the universities, or the teachers as the author claims - it's the students. On my course (Computer Science) of 200 students at a red brick university I'd guess that 95% of them wanted a job in IT rather than to become Computer Scientists, so of course the university provides what its customers want - Software Engineering. I had to learn about design patterns and stakeholders and Extreme Programming, none of which is anything to do with computer science. There were precious few courses on computational complexity, algorithms or computer algebra, and in none of them were there more than 15 students.
Of course, for me it turned out that I wasn't smart enough to do a PhD in intractability, so I became a developer, but I value what I learned on those courses because it fascinates me; I have no colleagues who could write an efficient sorting implementation.
...Alright, the dictionary definition of 'computer science' may well be algorithms, but that's not where the jobs for 'computer scientists' come from, and thus why unemployment is so high. Those jobs can be anywhere from electronics to consultancy & management.
I would never employ anyone who could tell me all about sorting algorithms and yet had no idea what Agile/XP/patterns etc are. That's the CS equivalent of the Lone Ranger and has no place in my environment. Actually, I would jump at the chance to employ a software engineer who could produce a compelling essay from their ethics course.
I consider myself an engineer because what I do is predominantly engineering, and not IT - but it is high level nonetheless. Plenty of jobs are available here, we offer an attractive package, and yet we struggle to find enough appropriate candidates every single time.
He always has to throw in how rich he is.
Anyway, the industry has to share some of the blame here too. They give grants to universities to teach students how to use their products. Except, by the time the student graduates these skills are probably no longer in demand, and whatever demand there is can be met by someone in India or China. I have a degree in CS, I didn't run away from the hard stuff. I have Knuth's books. I doubt some grads these days would know who Knuth was. Tell you what though, I'd hire someone who had no CV other than a cheque for $2.56 from Mr. Knuth himself, over some chump who got a first in Java. I was delighted when I was asked in the interview for my current job about what happens when a CPU receives an interrupt. "This is the place for me", I thought. I turned down writing Excel spreadsheets to take this job even though the salary was probably a fraction of what Excel guys get.
I despair at the lack of knowledge that programmers with years of 'experience' have. Not a clue about bit flags, RPC, how to write a TCP server etc.
The thing these folks often have in common is Java. It seems that by abstracting programmers away from the low-level stuff, we abstract them away from proper programming techniques.
Make them all learn assembler on 8 bit micros in their first year, build them up to C, then C++ and Java etc. in their second year. Final year should be useful algorithms and programming patterns.
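(Of those gaps, bit flags are probably the quickest to demonstrate - a minimal C++ sketch with made-up flag names, the sort of thing a first-year C module ought to cover in an afternoon:)

    #include <cstdint>
    #include <cstdio>

    // Each option occupies one bit, so a whole set of them fits in a single byte.
    enum : std::uint8_t {
        FLAG_READ    = 1u << 0,   // 0b0001
        FLAG_WRITE   = 1u << 1,   // 0b0010
        FLAG_EXECUTE = 1u << 2,   // 0b0100
    };

    int main() {
        std::uint8_t perms = 0;
        perms |= FLAG_READ | FLAG_WRITE;                   // set two flags
        perms &= static_cast<std::uint8_t>(~FLAG_WRITE);   // clear one
        if (perms & FLAG_READ)                             // test one
            std::puts("readable");
        if (!(perms & FLAG_EXECUTE))
            std::puts("not executable");
    }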
I also happen to know a lot of CS grads that have a wide range of relevant extra-curricular interests, and for whom a degree was simply a foot in the door for a job they already wanted to do.
I can't think of a single field where there is not a wide mixture of attitudes amongst its professionals. Yes, things could be better. No, the world is not going to end.
The problem starts in school, long before university: programmers are not created by teaching how to use MS Office or answering questions like "what is a spreadsheet?" You need to get them writing simple code and getting the satisfaction when the damn thing works. An IT course without programming is not an IT course.
Sadly, the way of thinking for IT seems to be better served by music syllabuses involving composition than IT courses. (That may sound daft at first, but music is about form, structure and obtaining a satisfying result using the syntax available: Anyone who enjoys harmonising Bach chorales can make a decent living as a programmer.)
IT needs engaged problem-solvers: turning them off the subject dilutes the talent available for the future job-market.
graduated when GST was being done in COBOL. (wend)
they took 'personalities' that 'fit' the team.
have coded in C ever since, with passing swipes at JavaScript, Java, CSS and whatever else was necessary to debug.
with promotion hiring practices like these (ask Nokia + the homeboys)
get used to private consultants 'cause the hires can't.
packrat
Slightly opinionated article, but then, I agree with it.
I dropped out of a CS degree as I realised in the first year that it wasn't teaching me anything that I would use in the real world, and it was a waste of time and money for a piece of embossed paper.
The three years of industry experience I gained instead set me up MUCH better for my career, and I'm doing what I love. I feel sorry for my fellow CS students who are now stuck in minimum wage IT support jobs.
This article is an incoherent mix of sweeping generalisations "When I find no alternative but to speak personally to CS grads, I find them a miserable lot who require all my skills to engage in any conversation about anything" and contradictions about whether computing is or is not a good industry to get into.
For all that, I'll fashion a response because publication of these articles seems to be tied to the regular movement of the Earth around the sun.
Here's the thing - studying Computer Science does not prepare young people for the workplace - it's not designed to. Universities are (mostly) places where kids go to prove their overall aptitude - so it's no wonder they are ill-prepared for your specific banking job. In the absence of candidates fitting your precise requirements (e.g. knowledge of algorithms) it has to be your place to train them up, and you have to accept that they have the aptitude to learn based on the 18 years of schooling they've just had and the interview you just gave them. They don't know how to code a binary search in C? So what! They got a 2:1 from a red brick and that's good enough for me, and if they need to know how to code the binary search I can point them at a book!
To summarise my first point - to dismiss graduates that don't know algorithms to the level you want *is absolutely fine* - if you can find the graduates with the requisite skill, it's your duty to go for them. But in a scarce market (which the market will always be, because there's a whole lot more to Comp Sci than the algos you are concerned with), to do so is to dismiss candidates' overall aptitude (IQ / ability to learn / whatever) as irrelevant, which seems absurd. And to do so publicly is just bewildering.
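(For reference, the binary search mentioned above really is only about a dozen lines - a sketch here in C-style C++, where the classic trap is the off-by-one in the bounds rather than anything deep:)

    #include <cstddef>
    #include <cstdio>

    // Index of 'key' in the sorted array 'a' of length 'n', or -1 if absent.
    long binary_search(const int* a, std::size_t n, int key) {
        std::size_t lo = 0, hi = n;               // half-open range [lo, hi)
        while (lo < hi) {
            std::size_t mid = lo + (hi - lo) / 2; // avoids overflow of lo + hi
            if (a[mid] < key)       lo = mid + 1;
            else if (a[mid] > key)  hi = mid;
            else                    return static_cast<long>(mid);
        }
        return -1;
    }

    int main() {
        int xs[] = {2, 3, 5, 7, 11, 13};
        std::printf("%ld\n", binary_search(xs, 6, 7));   // prints 3
        std::printf("%ld\n", binary_search(xs, 6, 4));   // prints -1
    }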
Briefly, my second point.
The annual UK Comp Sci graduate cohort is ballpark 30,000 people (ref. 1). Say you run a bank and you will only consider applicants from "certain universities". That limits you to say (being generous); 10000 graduates. Now, say there are around 30 major banks in the UK (2). That means that there are 333 graduates available per bank, if we assume an even spread. But you're only interested in the top 30% so thats a round 100.
Now banking is only a single vertical market, there are a thousand others in the UK, all using the same pool of graduates, so that figure diminishes somewhat.
But wait... you offer 3 times the salary of other companies? Ah, but you're based in the City, so you really pay exactly the same. And you don't pay for continuous training, you don't have a system of mentorship (yes, I know you probably say you do), you use legacy technologies, you don't pay attention to the physical environment the developers are working in, and you expect heroic efforts as a result of poor management while paying no overtime?
Actually, I think I'll study Economics.
(1) http://www.hesa.ac.uk/index.php/content/view/1541/161/
(2) http://en.wikipedia.org/wiki/List_of_banks_in_the_United_Kingdom
Please not the crap that one is taught in school.
We have enough 'tards like this in the Real Economy, bullshitting themselves and the people around them.
Start with "Economics in One Lesson" and work from there.
Of course, if you actually want to work in a Bank, it's better to know the wrong stuff.
Pretty similar to Computer Engineering actually.
In the days when COBOL was the be-all and end-all of business computing, my mother trained programmers at a certain large company whose name, before it changed, was well known the length of the land.
She said that the Computer Science graduates were lacking and that better programmers came from other walks of life - one of her best used to be a monk.
I left university 3 years ago with a degree in software engineering.
Frankly, I found the whole of year 1 redundant. Having grown up modding games and creating various programs/websites, spending 6+ lectures on conditional logic was rather pointless. The whole of the first year seemed aimed at people who hadn't touched a computer before let alone people with some amount of programming experience.
Years 2 and 3 were marginally better but I probably increased my overall programming knowledge by 10% at best... and only because I went out of my way to choose a dissertation which stretched my knowledge.
I can see why it was this way, though. There were around 60 students doing computer science courses while I was there. Probably around 75% had little to no programming knowledge prior to the course.
I'd say that the universities probably shouldn't allow people with zero experience on the courses, but they need funding and the few of us who had knowledge of the topic probably wouldn't have been enough to make it financially viable for the university to run the course.
Saying that, I'm sure that a degree in software engineering looks good on my CV even if I would have learned far more sitting at home making stuff for fun.
I studied "Technische Informatik" in the 90s and the institution really taught that. TTL chips, FPGAs (on own initiative, though), Z80 assembler, C, C++, Sockets, Unix, X11, Motif(true horror !), Compiler construction, SQL & relational theory, Operating system design, some math, Physics, Transistors, Frequency Modulation, Channel Capacity (Shannon), simple Amplifiers, Objectoriented Analysis/Design, stupid SA/SD nobody understood, E/R Modeling, "Project Management" which was also quite a mystery, Effective Presentation (some usefulness), Prolog, Smalltalk, Data Structures and Algorithms, Using Oscilloscopes, Logic Analyzers, GPG, RSA, DES. knowing different Capacitor types. And yes, I also learned Java, which has become useless for me.
Nowadays I hear from Students that they no longer learn Compiler Construction, but "using DOM parsers". Not the same uni, but the crap uni here in Frankfurt. I guess too much money damages the bystanders, so to speak.
People who can't solder together a useful TTL circuit (say, a mux for the parallel interface to drive an LED display or to control a robot), who don't know what LL(1) and LR(1) are - these are simply not "Computer Scientists". They are the products of a Funny Show.
And don't tell me that "industry does not need people who know YACC, they have an urgent need for DOM/XML". If you don't know the basics of computer science (and that includes scanners/parsers), but are loaded with the currently fashionable Project-Management Blabla and the currently fashionable Data Format, you are a Monkey, not a "Scientist".
So many universities nowadays create Computer Monkeys - or should we call them "XML/Java/Hibernate Monkeys"?
And yes, I know how to calculate how many liters of petrol it takes to accelerate 16 monkeys into space.
Are you referring to the program that I provided students to test whether their implementations of algorithms were correct? They found that very useful.
Maybe you are posting anonymously because you were amongst the 1 in 4 students who cheated on their coursework by copying. Or perhaps you objected to being asked to think, which was the point of the original article.
/* RANT
The grads we employ tend to be one trick ponies, they have a capability that it is up to the employer to exploit, however quite often they want to stay in the Java or .Net or Ruby camp because they have a tribal loyalty to it, and will turn down opportunities to work in other technologies.
Grads, learn as much as you can about IT, and welcome it; don't run away just because you need to learn something else. IT does not begin and end with one technology: there's one hell of a lot of difference in performance between updating a database on your PC, and entering a transaction on the web which then passes through a security gateway and an SOA bus or two before hitting an application server that actually stores the data on the database.
Learn just what those interim systems actually do as your transactions pass by, and what happens when it all goes wrong. Oh, and by the way, learn Basic, Java, C, anything, PL/SQL, COBOL - not hard. I learnt them all in the first 8 years of my IT career (well, not Java), alongside PowerBuilder (the last one), PL/M, Fortran, five dialects of Basic, Neat/3, DCL, Unix shell (3 dialects, three vendors each). Oh, I also learnt business analysis, database design, systems design - god, I could go on.
Oh and Mr IT Director, train your graduates, rather than bitch about what they can't do.
*/ RANT
So many points to pick at, so little time today. Let's start with "OMG THEY TEACH SHIT IN JAVA!".
Good. I'm glad. Language is an implementation detail; when you need to learn the fundamentals of something, you don't need implementation details biting you in the ass. Want to learn the fundamentals of OO? C++ is not for you - you'll hurt yourself. Learning it is hard(er) and most of the language has fuck all to do with the OO paradigm. Ditto OS internals.
I certainly think that CompScis should have at least a passing familiarity with assembly and C, but these are suited to modules covering machine architecture.
Which brings me to my next point. CompSci is not, ought not to be, ought not be expected to be, and ought not to be taught or promoted as though it were, a good choice of degree for people who want a career in IT or who want to be programmers. 90% of typical workaday IT stuff is tedious drudge work, 90% of programmers will never be asked to implement (say) a soundex algorithm or a quicksort algorithm, never mind analyse one. And the proportion of codemonkeys who will ever need to implement an OS, a compiler or a VM as part of their day job is vanishingly small.
The clue is in the name: Computer _Science_ is a good choice for people who want to be computer scientists. Or it ought to be. That dead-eyed look, mopeyness, lack of interest and social awkwardness the author sees in CS grads is most likely because they thought they were training for a career doing all those neat things, but once they hit the recruitment stage they find out that they've actually qualified themselves for just another boring IT job. They may be sitting in a cubicle in a merchant bank earning 100K (before tax, don't even get me started), but their day-to-day existence will be no different from the VB monkeys in the insurance brokers down the street or the VBA jockeys in the small office round the corner.
What a fucking waste, right? So yeah, it probably should be harder. But on the other hand, neither should the author be trying to hire CS grads to IT positions. If you want to work in IT, go do an IT degree - of which there are many - or even better do a Business and Computing degree. It will be more use to know about profit margins, accounting practices and management techniques than compiler backends.
I'm a first year CS student at a middle-range uni (poor A-levels after I got bored out of my mind teaching my fellow students VB, because that's all our school taught and our teacher didn't know it), and I've been disappointed to find out the truth of this article first-hand over the past month. We only seem to learn Java, so I'm trying to learn C++ in my spare time.
I consider myself a geek, and love playing around with all things computer-related - always have. Does anyone know what a poor hapless guy like me can do to avoid falling into these traps and becoming the kind of grad fretted about in the article?
"And once you understand pointer arithmetic you can conquer anything computing."
Yeah, but try doing triple indirection*** after half a bottle of cheap Portuguese pish from the corner shop and you'll see why PHP was invented (and coincidentally, how it was coded).
*** fuck you, COM, seriously.
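(For anyone who hasn't had the pleasure, a sober-ish illustration of what's being cursed at - plain C++ here, nothing to do with COM's actual interfaces:)

    #include <iostream>

    int main() {
        int x = 42;
        int* p = &x;        // pointer to int
        int** pp = &p;      // pointer to pointer
        int*** ppp = &pp;   // pointer to pointer to pointer: triple indirection
        ***ppp = 7;         // follow all three levels and modify x
        std::cout << x << '\n';          // prints 7

        // Pointer arithmetic moves by whole elements, not bytes,
        // which is exactly why arr[i] and *(arr + i) mean the same thing.
        int arr[4] = {10, 20, 30, 40};
        int* q = arr;
        std::cout << *(q + 2) << '\n';   // prints 30, same as arr[2]
    }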
From the students I've known, those that choose Maths or a science (Chemistry/Biology/Physics) know far more about computers and programming than anyone I've met who does a Computing course. Which is a pain in the arse, as in my company having a comp degree == an extra £3k+ on your starting salary.
So my year back when was the first to go totally java because it was hip and cool and similar lies. I flunked and washed out partly because even back then the university course oozed stupidity. So I became a unix sysadmin, and I did a bit of big open source project core development, some consulting, and so on. Still no recruiter will talk to me.
Why? Well, if the anecdote is anything to go by - someone replaced his comparable CV with a fake containing the crappiest imaginable qualifications and a between-the-lines spelling of "incompetent", and it caused a deluge of recruiter calls - then having a selection process depend on recruiters is a good way to filter out the people that can, in favour of having some sort of formal qualification regardless of what that actually means.
Thus we turn even big name universities into diploma mills. It's what the market "wants".
And my experience mirrors that. I have worked with about half a dozen unices, not counting "distributions", and run exactly no micros~1 at home, and still I get some halfwit recruiter gal telling me "not enough unix". You'd rather have an MCSE then, wouldn't you? I'm sure you would. This seems to be worse with the bunch that claim to specialise in "IT recruitment". Clearly getting the tech terms wrong and writing misspellings into a job advert requiring "excellent English" is just that much more icing. I'm not a native English speaker, yet I spot error after error.
Note also, tangentially, that it's the Unix crowd that traditionally has a much stronger affinity with the written word than found in certain other areas of computing.
Beyond the clue deficits, I also get straight-out lying. "You failed to submit your CV in word" for a Unix-only job where the advert failed to mention "word" at all in the first place is one of the more benign lies I've seen. No, I don't care that your industry breathes proprietary crap. I filter against that so I won't spot the advert if you're at least being honest about your misguided expectations. Fscking up my CV into your system is _your_ job, not mine. If you want good Unix people, you deal with open formats. Require them to use micros~1 software and you get what you deserve.
All it did for me was gain me a deep-seated boiling hate for all things recruiter-y.
If the industry truly wanted good people it would put effort into finding and nurturing them. It would make it pay to be good and not pay to be mediocre or worse. If you rely on the clueless&witless to do arguably the most essential part of management for you, then your acts belie your claims.
Ah gotcher sympathy roit heer, guv.
Singapore: I get resumes from all over the place -- including local folks -- that are poorly formatted. To me, that's a serious problem when I'm hiring a 'jack-of-all-trades' technician. Word makes it darn near impossible to take control of formatting, but the doofuses who apply for my positions don't seem to be able to find any alternatives.
Worse still is the number of people who, with a perfectly straight face, tell me that "internet surfing" is one of their skills.
And I once tossed an otherwise horrible resume because the guy used a @hotmail address.
I always ask the applicants, "What's the weirdest tech stunt you've done with a computer?" If they can't come up with an answer that lights up their eyes, they go to the end of the line.
Back in my day, we did this work because we loved it. Heck, we HAD to love it in order to suffer the indignities of the crap we used. These days, it's all kids thinking it's a job -- nothing but a job.
Sheesh. In the absence of an "old fart" icon, I'll just have to have a beer.
I had to move some mini tower components into a bigger case so chose a beastly PowerEdge tower chassis for the new host...
Halfway through removing the components I realised I had no mounting screws for the mobo, and through my MacGyver instincts I drilled bigger holes in the chassis and mobo and used some server cage nuts I had lying around!
The machine worked for a month before dying an overheated death... that was a good weekend!
In my experience of recruiting programmers, around 90% of candidates at interview cannot implement a simple algorithm in code on a blackboard. Some just do not know the language, so cannot cope without IntelliSense. Some do not understand how to create a basic algorithm and then translate it into code. That's candidates claiming 2-3 years' experience. Graduates have other additional weaknesses, not just in coding skills: they will typically not have experience of using a debugger to step through code, and they are unlikely to have used source control. Any knowledge of unit testing will be theoretical.
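(To be clear about the bar being described: the "simple algorithm" is usually something on this level - a hedged sketch in C++, reversing a string in place - and the interesting part is watching whether the candidate can reason about the two indices at all:)

    #include <cstdio>
    #include <cstring>
    #include <utility>

    // Reverse a NUL-terminated string in place by swapping characters
    // from both ends and walking the indices towards the middle.
    void reverse_in_place(char* s) {
        std::size_t i = 0;
        std::size_t j = std::strlen(s);
        while (i + 1 < j) {
            --j;
            std::swap(s[i], s[j]);
            ++i;
        }
    }

    int main() {
        char buf[] = "graduate";
        reverse_in_place(buf);
        std::puts(buf);   // prints "etaudarg"
    }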
Then, when you do find a graduate that's worth employing, you will just be one of 10 job offers that they receive. Thankfully, the HSMP visa program meant we could import the developers, rather than outsource the development.
Left school with GCSEs, taught myself to program C, C++, Java, Perl and SQL. Local colleges wanted to waste my time teaching dBase at A-Level CS and no real programming. I dropped out for 2 years and worked as a groundsman for a sports centre!
Finally got a break installing Novell Netware (at the sports centre!) and worked my way up to become an Oracle DBA. I see so many planks who can barely spell C, let alone write any form of structured code in any language, even shell. Granted, being self-taught you pick up nasty habits, but at least my CV shows drive and determination to make it happen for myself and not rely on a bit of paper to get my foot in the door. It's been tough with no letters to my name, but I have had the good fortune to find people who have seen through that and given me a chance to prove I can work in IT.
I have always found that learning a new language (or re-learning an old one) is the easiest part of starting a new job - learning the ins and outs of a new business is usually much more challenging.
I came out of Uni with a good degree and (in my opinion) a very broad set of I.T. skills, and have gained fairly diverse professional experience since, yet in the past I have felt that I was rejected at an early stage in the recruitment process because I didn't have specific experience in something trivial (e.g. VisualStudio). I can only imagine that this hurdle is even more difficult for new CS grads to overcome!
Maybe there is too much emphasis on hard skills and not enough on general aptitude?
As a programmer I agree with those who say that choice of language is irrelevant; I spent last Thursday programming Powerpoint in VBA for money, as it happens.
But as a headhunter I care a lot about how smart you are, and being "as good as most other people" is a bad place to be (been there, didn't like it), even in good times. Java does not provide evidence you are smart. I was one of the earliest people to write code in Java (and VB and C); *that* was good evidence, because there wasn't anyone to help me.
I half agree with those who say CS courses are not supposed to be prep for jobs. To me the best prep for a good career is understanding WTF is going on, and being able to quickly grasp new developments because you understand the core ideas that permeate your subject.
As it happens I don't only target people from "certain universities", but have my own more complex prejudices, which quickly filter to <5%
A couple of people blame recruiters, and as the person who coined the term "pimps" for agents on CIX in the 1980s, I am happy to spread the blame for many things in this direction, but I genuinely can't see why you should blame us for this particular problem.
So I have an IQ high enough to get into Mensa - from several properly administered in-person tests evaluated by a proper academic psychologist, not the "from the internet" crap. And I have an email archive filled with several hundred recruiters that never even replied, often enough even after mailing _and_ calling to have a bit of a chat; or, if they did, they simply said no or lied their pathetic little lies, sometimes provably so. No sir, recruiters simply no longer do.
Instead, get yourself Linux, GCC and a decent book on C++. Start an open-source project which does something you deem interesting.
Maybe a secure telephone system for Android (using NDK) ? One that can't be rectalized by the gubmint ?
Then look into job databases yourself and talk about your OS project. The project will give you *real* exposure to systems development (including debugging nightmares and the creativity which follows from that). Experience is what employers look for. And don't believe those Java fanboiz that "all new stuff is done in Java". If you want to do something revolutionary you need to use C, C++ or Assembly language. Get your fingers dirty while your CV is being processed by potential employers.
If you don't like to go the hard way of developing and debugging a 20000 line C++ program, you better go flipping burgers or become a cleaner.
Learning "Project Management", "effective Presentation" and painting nice powerpoint slides you can learn later. Don't believe this "soft skills" shite.
Certainly "commerical development experience" has more weight than "open source development experience". But both are *much* better than "little development experience".
Also, it definitely helps your career to do an OS C++ project if your job exposes you only to PHP, Java or C#. Strategically speaking.
Many moons ago I was senior dev at an ISV and I needed to recruit a new codemonkey. Fairly standard process, get the candidates in for an interview then sit them down with the team for a tailored aptitude test.
One candidate sat down with the aptitude test and ignored it. He proceeded to program something else entirely because he didn't think that the aptitude test we'd given him would be enough to show off how smart he was. And boy was he smart, according to him.
Funnily enough I didn't hire him, because while smarts are good, so are not being a dick and doing what I fucking tell you.
"But as a headhunter I care a lot about how smart you are".
That, and nothing else. Note that you are not a recruiter, not even a "hr person", which is fairly important in context of the argument. Though you do strike me as a tad obnoxious for having missed the salient point, intentionally or no.
When I were a lad, there was no "CS". I had lectures on "Logical Design of Digital Computers", wherein we learnt Boolean Algebra and how to simplify equations. Our first programming projects involved using machine orders (in binary) keyed in using front panel toggle switches, and advanced projects were done using assembly language. Memory management? Ha!
Passion for the work was a must, otherwise the tedium would drive you bonkers!
Does anyone study Venn diagrams any longer? Do any here know what they are? The world is going to hell in a handbasket. Sorry if I sound like an old geezer, but such I am.
This is one of the best articles on the Reg in ages and I completely agree with Dominic.
I came out of Uni in 1993 after studying Comp Sci with a solid grounding in C, Ada, shell, 8086 assembly, Cobol and SQL under my belt. As a newbie grad I soon found out I had a lot to learn, but I never felt out of my depth when I started work as I could quite easily "talk the talk".
If all they are teaching these days is Java then I'm not surprised that these Grads are unemployed.
It would help I guess if Universities actually employed lecturers with an understanding of modern life, rather than legacy lecturers that have been hiding behind their desks for their whole careers.
Speaking from the student side of this debate, I think one of the best points being made here is how poorly the pre-university education system prepares people (damn that's a lot of ps) for actual computing work. My school was billed as a "technology college" and yet the number of computing-specific courses was zero. The closest A-level equivalents were in Electronics, which did involve the basis of digital theory but little more, and Business. Now there's an appealing title, especially to geeks [/sarcasm]. Not only this, but the careers advisors were useless, I doubt any of the students who were enthusiastic about the paths they chose derived any benefit from the careers staff.
I am curious, however, about the numbers here, and wonder whether the OP is referring only to courses specifically labelled "Computer Science" or whether he has investigated other computing courses. I'm currently studying Forensic Computing at the University of Central Lancashire; in our first year we were required to do basic C++, PHP and a little SQL, and could choose either C# or further C++ for the second semester. In the second year we could choose between OO-based C++ (until that point it had been entirely procedural) or SQL (Oracle-based). Furthermore, the networking module has already covered well beyond the understanding of the difference between switches and routers, and this module is required for quite a few of the courses.
Granted, the majority of the students chose SQL because it was seen as the easy option, and not because they were more interested; but the general outline of skills learnt does not match up with the poor understanding displayed by the graduates he has spoken to.
On the other hand, the fact that I achieved the second-highest mark in the year for both the C++ and PHP/SQL modules in the first year, when I didn't even enjoy C++ and had never previously touched either at all, does seem to suggest that, whilst being taught more broadly than the examples that have been the despair of the OP, my coursemates may not be a shining example of all that is best and brightest in British universities.
Back in 1999 I was dealing with recent Comp Sci grads who couldn't copy files without hand holding, couldn't customise their desktops in Windows or Unix, etc et bloody cetera.
It's not a new thing at all.
On the other hand I know rather more recent grads that can, and do, walk the walk......
Algorithms have nothing whatsoever to do with programming or programming languages.
In the past ALL software was designed at the algorithm level by a Systems Analyst, and he or she handed the lowly Programmer the algorithmic rules - to be implemented in any language - it mattered not a jot. What DID matter was that the Programmer had no scope for creativity or deviation from the rigid and working algorithm(s) - usually worked out with a pencil and paper on lots of very large flowcharts, with a clear understanding that there IS NO QUESTION that cannot be reduced to a descending tree of YES/NO outcomes. If you CAN'T do it this way, you're simply asking the wrong set of questions. This was always a golden cornerstone of CompSci. Sadly, not any longer.
The result was software that still works forty years later on very large scales (think Air Traffic Control systems that 'Actually Work' instead of the awful useless shit churned out today).
The reason the quality of software has declined is because the role of Systems Analyst no longer exists. Programmers are left to devise the 'rules' themselves as they see fit. They are usually the people least suited to do so.
I've been both so I've no axe to grind.
I have been trying to get this across at work for ages, that there is a massive difference between a programmer, and for want of a better word, a developer... someone who 'creates' solutions :)
A programmer, from my point of view, is a code monkey who needs to employ little to no independent thought; they receive a spec/design that tells them what to write and they write it. Hence the increase in offshoring "minimal skill" roles to newly graduated Indians. Unfortunately, a lot more is needed.
It's the skills required to correctly design and structure things where it gets sketchy. Where I work, barely any coders are competent enough to do the technical structuring that I'd consider a basic skill for a programmer, never mind the higher level analysis you mention, which seems to now be a thing of the past. I don't care what language you know; if necessary, I'd expect you to be able to apply your skills to any language required, with a minimum amount of training, as the bulk of the skills should be language independent.
There will always be a place for people who know the intricacies of a language and its compilers, inside and out, to the lowest level possible, but it's a highly specialist role that is completely unnecessary for 99% of applications. And, from what I've seen in the past, it can even be counterproductive if something is done in such a 'clever' way that no-one else can easily amend it, if needed.
I think it's an unfortunate side effect of IT, things tend to go in cycles. The improvements in the tools, and the simplifying of the whole process seems to have taken us right back to the early 'cowboy' days of just sit down and cobble something together that mostly works! Except it's no longer just the elite doing it, it's any monkey with a typewriter!
Re" Even then, about a third of the newbie CS CVs I get contain tragic spelling and grammar errors. "
And we have an internet that sponsors and promotes bad grammar and spelling, i.e. "pwned" and all the rest of the pitiful examples of just how low and irrelevant a really good education is now days. Or so it seems. Looks to me the "dumbing down of America" has spread across the water.
No matter, let those who can think ... think, those who can learn ... learn and the rest will be digging ditches. We need ditch diggers and lug nut spinners. And from my years of teaching I can tell a lot of potential "diggers" are out there.
While I would have to agree that the majority of my fellow graduates of the class of 2010 could not write a simple program to count from 1 to 10 in even the easiest, albeit bloated, languages like Java, I was quite honestly insulted by the way this article was written, which made it out that not one student graduating in the last couple of years had an ounce of programming skill or geekiness in their body.
I realise I am speaking for myself here, but I believe I have an abundance of both. I graduated this year from a top 10 university and, while we were forced down the route of Java and spoon-fed the syllabus, that didn't stop me programming for fun and teaching myself around the subject. You seem to be taking the view that the majority rules and there are no diamonds in the rough, as it were. I have found no trouble in finding work and in fact had a few offers, ranging from funded PhDs to well above average paid jobs, in a weak economic climate. I decided to take the job as I wanted to get out of the stale learning environment in UK universities. I have since picked up Python in the 3 months I have been there, so much so that I am working on the front end of a highly secure online transaction gateway. I have had no problem with, and have enjoyed, the transition from university to the workplace.
I do believe I am in the minority in this, and realise that a lot of people leave university with no understanding of the algorithms and techniques that make a good programmer, but the assumption that because you see 100 bad graduates there are no good graduates is incredibly short-sighted.
As for lack of motivation or creativity to do our own projects as graduates, again you are assuming that there are none out there willing to do so. I disagree: a fellow graduate and I have been working on our own project alongside our jobs (yes, another graduate who walked into a job and who is more than competent at knocking a few lines of code out) that we believe to be of great worth. There is no question that we are in the minority, as I have not heard of any of my other coursemates following suit, but there is not an abundance of this in any society. There is not a Bill Gates, Mark Zuckerberg or Lawrence Page in every graduate year, but that doesn't mean there aren't people trying. One of the biggest hurdles for the UK graduate going for these heights of glory in the industry is that there are very few opportunities given to us to really push forward. The access to funding is just not the same as it is in the US, and quite simply, to be that successful in this industry takes luck. Bill Gates was ruthless in business, but was lucky that he had access to a computer at all; Mark Zuckerberg saw a very small but ever growing niche in a highly populated social networking industry; and the guys at Google, well, they just had a good idea for searching through some pages, didn't they. But there was a lot of luck that made them as successful as they are today.
All I am trying to say, in my long winded way, is that assuming there aren't any geeks who program for the fun of it and are doing quite well out in the real world after graduating in recent years is short-sighted and, at least in this admittedly self-stated case, wrong!
Most graduates need breaking in. Get over yourself.
Anyway, it strikes me that the real reason loads of Comp Sci graddies are unemployed can be seen if you stand outside the bus stop at any business/technology park in the South East of England as the intra-company transfer people arrive. If it can't be offshored, then UK plc ships in someone on a dodgy visa.
While I agree that there is a huge gap between "what western folks need to survive & pay off loans" and "what companies are willing to pay," I think you are a little off base. The gap leads to outsourcing, the outsourcing leads to high (local) unemployment. This is a completely separate issue from the one under debate.
The issue under debate is quite simply that you can't teach people to /think/. All the fancy degrees and training in the world don't make someone creative. They don't make someone capable of colouring outside the lines and being inventive. IT isn't - and in my opinion should never become - simply a matter of applying the same tired old solutions to the same tired old problems. The instant you start sysadminning by whitepaper or coding "by the book", I can replace your job with a properly coded shell script.
The problem is that computer science education as it stands today is /hugely/ dogmatic. It drives away creative types. Think outside the box? You’ll be ridiculed until you fall in line. This means that folks who might have had a lot to contribute – those people who learn by experimentation and making the bloody thing go blork a few times – are discouraged from providing the benefits of their eccentricity.
No matter how many greenhorn monkeys you employ, if none of them can think orthogonally they quite simply have no value. The “push the same button, receive the same result every time” set are the digital equivalent of assembly line workers. They are a half-step up from your local Zellers cashier…and even that’s debatable. At least the Zellers cashier has to be personable.
Credentials are nothing. Experience and the desire to experiment are everything. Well…at least in my world. It governs who I hire. Sadly however, the majority of the world is trained in the standardised HR propaganda machine which would agree with you rather than me. In that world, credentials are everything. Experience, creativity and inventiveness simply don’t fit into the standardised checkbox mentality. Odd however how companies that follow that rigid hiring procedure always seem to either be governments or on a slow sliding decline away from innovation and into the patent-lawyering abyss…
Seems strange when the Big Money goes to the Big Geeks (Bill Gates as the über-example, but then that Facebook guy and so forth) that students in that field wouldn't put together a reasonable collection of competences. But, then, maybe they think these things only need you to have a good idea and "put together a website" without thinking of what actually runs this site?
At school, I signed on to do CS. Ended up doing RSA IT (CLAIT? something like that) because the teacher wasn't up to CS. Hell, I spent half the lessons I bothered to attend fixing up the network and the computers, because with the lid off he had NO idea. And we went on outings... to look at cash machines and a big cooled computer in a Sainsbury's. All of this was powered by a database. No shit, is THAT all IT is? How's about we back up and explain what a database is, how it works, methods of sorting and indexing... oh, wait. That's all a bit CSy isn't it?
Perhaps the poor level of teaching is because the modern teachers are last year's unemployed grads, a cycle to be repeated until nobody gets much beyond looking at a computer and saying "woo, clever...".
But, hey, who listens to me? I'm a self-taught geek. A guy with a passion for binary and ARM code, who goes and learns his own thing of his own free will - so often portrayed as an anathema. Oh well.
I recently left Kent State University due to the fact that trying to take classes for C++ or even C# is nearly impossible, or frankly they're not offered at all. When I was in high school, about 10 years ago, Kent had a technology program offered at my high school. In that program I got my feet wet with some VB and C++; I later ran off to join the military, and when I got back I wanted to put that GI Bill to good use. I enrolled at Kent thinking "they seem to offer what I want"; I get there and they have NOTHING! Everything now seems to be Java or Python, with VB for the "over achievers" - this from a college that proudly claims to be a "leader in technology". Needless to say, while yes, some (if not most) Comp Sci and tech students are lazy here, it makes it impossible for those of us who actually want to be programmers. Ultimately, as long as most of the students want to "sell themselves short", the academic community here isn't going to offer these classes; it's all about the money, and if there's no interest in these classes they're not going to offer them.
I'm not studying for a PhD in CompSci. I am, however, studying a Foundation Degree in Computing and hoping to go on to a BSc (Hons) in Computing. So far I've got a hardware and networking lecture, a software dev lecture, an HCI lecture, Learning in the Workplace and Work-based Learning, and then _two_ (utterly soul-destroyingly boring) Project Management lectures. The two project management lectures are consecutive lectures, too.
I can see that I may well end up not using anything from HCI and LiW/WBL, even if Project Management does end up being useful, but I wouldn't mind doing these modules if the lecturers actually made the lectures more interesting.
Also, I'm probably one of four, maybe five, doing the course because it's actually something I'm interested in. Everyone else seems to be doing it because it's easy or there's potentially a lot of money in it. Suffice to say that school was only interested in getting me into sixth form to get A levels. (So much so they signed me up even though I was actually already at college)
As yet, I'd agree wholly with this article. (And I'm not a big fan of Java either)
It's the 'math' bit of the Math/CS degree that's worth the money....
A surprising amount of IT teaching tries to pretend that you can get by without math. The rot's spreading as well, since a lot of the teaching profession managed to avoid math (so they think nothing of inflicting essay questions on math students but can barely add up themselves).
I have to agree with James Simpson. The tone of this article may highlight some interesting facts but decrying pretty much all CS students is a tad over-zealous.
I'm a student myself and I'll be graduating in 2012 with a huge debt and a shrinking job market to look forward to. Fact is, most of the current students reading Reg aren't going to be the average IT student who doesn't learn a thing outside of the syllabus.
Most of us know that what we are taught in the first year will probably be deprecated by the third and totally useless by the time we get a professional job and react accordingly. Those that don't... well, those are the ones that remain unemployed.
...you belong to those who don't understand the concept of a "foundation of a science". Look at this:
* Hashtables
* Balanced trees
* Relational Theory (SQL being an expression of that)
* Procedural Programming (including that latest twist called Object-Oriented Programming)
* Functional Programming
* Boolean Logic
* Finite Automata
* Scanners
* Parsers
* Logic Programming Methods (Prolog being an example)
* Thinking in Data Structures, translating problems to Data Structures
* basic algorithms like Quicksort
* Math (integration, differentiation, statistics, graph theory, number theory)
* Electronics (Transistor, Diode, Capacitor, Ohmic Resistor, Coil)
* Basic Physics (Newton, simple quantum physics of semiconductors etc)
* Methods of Solving Equation Systems
* Vector Math for 3D
All of that was valid in the 90s, it is still valid and highly useful in my job as a C++ developer TODAY, and I guess it will be useful to me in 30 years' time, when I go into retirement.
If most of these concepts are alien to you, you don't study computer science, but you do Monkey Training.
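To make "basic algorithms like Quicksort" concrete, here is a minimal, unoptimised sketch in C++ (Lomuto partitioning, function and variable names invented for the example) - the kind of thing anyone who has met the list above should be able to reproduce from memory:

#include <iostream>
#include <vector>

// A minimal, textbook quicksort: pick a pivot, partition, recurse.
// Plain recursive version for clarity, not a tuned library sort.
void quicksort(std::vector<int>& v, int lo, int hi) {
    if (lo >= hi) return;
    int pivot = v[hi];              // last element as pivot (simple choice)
    int i = lo;
    for (int j = lo; j < hi; ++j) {
        if (v[j] < pivot) std::swap(v[i++], v[j]);
    }
    std::swap(v[i], v[hi]);         // pivot goes into its final position
    quicksort(v, lo, i - 1);
    quicksort(v, i + 1, hi);
}

int main() {
    std::vector<int> v{5, 3, 8, 1, 9, 2};
    quicksort(v, 0, static_cast<int>(v.size()) - 1);
    for (int x : v) std::cout << x << ' ';   // prints: 1 2 3 5 8 9
    std::cout << '\n';
}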
Firstly, fads come and fads go. Java has been quite an enduring fad. Second, employers have been *demanding* various fad skills from the universities for years (it used to be C++, then Microsoft development skills became important and then Java).
Now, worryingly, if there's any truth to this article, it looks like many universities are not teaching problem solving from the ground up anymore. By this, I mean starting from first principles, breaking problems down, learning about algorithms, applying them to Pascal and then building on these skills by solving problems using different languages/paradigms. I'm not saying things like C aren't important (C/C++ in particular are important because they teach the value of cleaning up after one's self and OO principles that apply equally well to Java, C Hash and so on).
Alongside all of this, students should be learning the principles of analysis, design, databases and so on.
Employers want the latest fad skills when what they really need is flexible people who can apply a set of more generalised skills to whatever problems the employer has in mind and quickly learn whatever tools and technologies are required.
As for little things like the ability to string words together coherently, I witnessed first hand that battle fought and lost in the early 90s just before I graduated. At the time, we still had a couple of lecturers who'd mark us down for not writing clearly or not having the common sense to use a spell checker. Talk about a couple of lonely voices in the wilderness!
There are a lot of people complaining here about the state of education, and yet every day I see people with 20+ years of experience who are no better. There are very few programmers of any age who can tell the difference between a lambda and a lamb kebab, simplify a boolean expression, or describe basic type theory and why it's important.
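To make that concrete, here is a minimal, hypothetical C++ sketch of both points (the names are invented for illustration): a lambda is just an anonymous function you can pass around, and simplifying a boolean expression is applying identities such as De Morgan's rather than piling up negations:

#include <algorithm>
#include <iostream>
#include <vector>

// A lambda: an unnamed function object, here used as a predicate.
bool anyAdults(const std::vector<int>& ages) {
    return std::any_of(ages.begin(), ages.end(),
                       [](int age) { return age >= 18; });
}

// Boolean simplification: !(a && !b) is equivalent to (!a || b) by De Morgan's laws.
bool messy(bool a, bool b)  { return !(a && !b); }
bool simple(bool a, bool b) { return !a || b; }

int main() {
    std::cout << anyAdults({12, 15, 19}) << '\n';                        // 1: someone is 18+
    std::cout << (messy(true, false) == simple(true, false)) << '\n';    // 1: the two forms agree
}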
Furthermore, it's not just about the languages. If you don't know how to do TDD, continuous integration, communicate with BAs, clients and your peers, plan, keep learning every day, etc. etc., then you're screwed.
To all of you, have a look in the mirror, brush your hair, and figure out how you're going to learn something today.
"The vast majority of kids look at the subject and quite rationally assume that it is both dull and useless."
Well... if you're doing a good job with your software design, it probably is fairly dull. People who try to make their code interesting make nightmares for the rest of us.
On the other hand, I accept and agree with the overall point of the article. I love my work, even the exciting parts.
How many hours of actual hands-on coding are done in the course of an average comp-sci paper?
Compare that to how many hours actual hands on coding are done by a programmer in a typical month. It should be obvious why graduates can't code - you can't learn to program well with so little experience.
All the best programmers I met at university learned to code themselves, outside of their coursework.
Don't you actually feel sorry for the young these days? I mean imagine trying to get interested in cars. No, you're not allowed to clutch start for fear of destroying precious electronics. Try and find your way to the oil filter in a modern engine. And taking the dashboard off? You're more likely to accidentally trigger the air bag!
Computers aren't the same as the 80s. Back then a microprocessor was digestible even by a child. One ALU. A few registers. A clock speed you could comprehend.
These days CPUs are unfathomably complex affairs. Protected mode, memory management units, hyper threading, MMX registers, pipelining; not even a hardware engineer would hope to leave university fully understanding a processor now - you need large companies where everybody does their thing then brings it together on a single die (or multiple).
As a child where do you start? A homebrew kit of Microchip or Zilog embedded processors? Certainly not your home PC. Quickbasic is gone.
I just feel sorry for today's youth. It must be so difficult to really get in the game now.
I'm a Comp Sci grad (2007), who managed to get a job at the Uni, developing software.
So... the full story.
A Level Computing: Binary, and how to write VBA for Excel and Access. Mind-numbing. For a laugh, a few friends and I did a 'security audit' - back in the good old days of Win 98 caching pwd files on the C drive - going around, collecting, home to Cain & Abel. We'd 'rent' an account out when someone got banned. Finding ways around the 1st generation web proxy software. Hell, I was a mod at a commercial adult chat at 16, and needed to be able to do my job from college. At the end of College, we handed the password list in and explained how we'd done it - these days, that would probably get us brought up on criminal charges.
So I play for a few years, teach myself VB6, try and get a job - and not a single company would take me on, due to my lack of a degree. Off to Uni we go.
At Uni, they're teaching Java. Year 1 consists of lots of lectures telling us "So, we have an Animal. An Animal eats, and sleeps. Here, we have a Lion. A Lion is a type of Animal. It Eats(), Sleeps() but also Roars()." I shit ye not, this was my main CS lecture in year 1. Despite being one of the more techie students (at least having dabbled with C, and knowing VB / VBA), I hadn't got a sodding clue what Java was about. We'd done no real programming halfway into the second year. One week's work might be "Make a program that adds two numbers", and we'd be given a program that subtracted two numbers as example code. People would hack, copy, paste, compile, and submit. No one had a clue what the {}'s were about, or what the imports were - it was like the lecturer was talking baby talk.
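For anyone who hasn't sat through that lecture, the whole example boils down to a few lines - sketched here in C++ rather than the Java we were actually given, with the usual stock class names:

#include <iostream>

// The stock "Animal" inheritance example: base class behaviour plus
// a subclass that adds its own.
class Animal {
public:
    virtual ~Animal() = default;
    void eat()   { std::cout << "eating\n"; }
    void sleep() { std::cout << "sleeping\n"; }
};

class Lion : public Animal {   // a Lion is a type of Animal
public:
    void roar()  { std::cout << "roar!\n"; }
};

int main() {
    Lion simba;
    simba.eat();    // inherited from Animal
    simba.sleep();  // inherited from Animal
    simba.roar();   // Lion-specific
}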
One night, during an all-nighter on coursework with mates, my mate handed me the database connection class he'd made. It contained crap-loads of program logic code, bless him - so we agreed to take it out and put it in a class of its own. Whoa... wait, what are we doing here? This.... this is refactoring! We learnt more in one night about Java than in the preceding year and a half.
I made it my goal during my placement to learn C#. Having had a look at the job market, I figured if there'd be 1,000 crap Java coders in the area, I'd have a crack at C#. I was playing with Managed DirectX (before MS sunk it) and writing my own WAV file spec classes by the end of my placement. Had a whale of a time.
Now... I work here. I see the students. Many are hopeless. Appalling. Lacking any interest in technology. They are not geeks - they're glorified IT office workers.
As for the lecturers... ha. We have our "open source" camp, who insist Java and Python are the only ways forward. Microsoft technology is not taught at all, as there's "no jobs in it". And besides, MS are "Evil". We have maybe 5 - maybe - good lecturers. The ones who are good, are amazing. Superb. Proper geeks, doing multi-dimensional geometry in whatever language you want. The ones who are crap just about know PHP and teach the entire web syllabus, for all students. They prepare materials one year, and can't be arsed to work on it for the next 5 years. They're lazy. They're not geeks. They are shockingly poor. But they brown-nose, and do the political bit. So they move up the ladder.
We teach IS concepts that are 10 years out of date, as no one can be bothered to learn a modern visual modelling program. We teach students SSADM, which Wiki tells me hasn't been updated since 1995(!). My bad, 15 years out of date. When I actually did my degree, I was given the entire module's notes for VB6, but the project was to be done in VB.Net. "Here is the WinSock control.... er... Professor... does VB.Net actually have a WinSock control?" Oh I see, you couldn't be arsed to update your lecture notes. The long summer isn't for preparing material, it's for a long holiday abroad - I got it.
We buy in Macs. No one knows Objective C, or Mac native development, but "Apple are big right now - and they'd look awesome in the lab for marketing shots".
The students, on the whole, are lazy, non-technical, rude, ignorant chavs. The lecturers, on the whole, are over-paid, under-worked prima donnas who can get away with anything. The less they do, the less they're expected to do. The few who know the subject inside out are expected to pick up the slack, until they burn out through being over-worked. The handful of students who are superb - and I mean truly gifted - learn whatever they can outside of lectures, and complete assignments just to ensure they get their piece of paper. They will succeed DESPITE our degree course, NOT because of it.
As an aside, I applied to do the MSc. I got to re-learn binary, 10 years after my A Levels, as the only Masters offered is a 'conversion' degree, to cater for the huge number of foreign students who can't speak English, much less operate a PC. As long as they pay £10,000, we'll say they know what an on-switch looks like. When I walked away from the MSc, the last piece of coursework I was asked to complete was "If you were operating a travel agency, what technology would you take to foreign countries, and why?". Yeah, let's play writing a story. I assume colouring-in gets bonus marks.
</RANT>
I love computers... I love the talented lecturers, who have helped me to grow. I love the gifted students, who have asked me questions I've had to admit I will go away and find out an answer to; bloody good question, I'm curious too. I love the talks I've had about the Amiga, the Z80, the positives of Android, whatever - the stuff that engaged me. But that's all I love. The rest is a sham.
...but upvoted as I agree with your post. At my school (a billion years ago when the 6502 was king (and the Z80 a runner-up, sorry!)) I wanted to do CS, ended up doing IT. "This is a cash machine" and rubbish like that. I did okay on my IT exam, I got a B. Would have been an A but I was missing a fair bit of coursework on account of not bothering to attend the lessons. Started teaching myself ARM code and basic programming principles (back in the '80s, many home computers only supported procedural programming by way of a mess of GOSUB/GOTO, so a lot of the home market books were not very helpful with the advanced features of BBC BASIC). I still write code for a hobby today, and while I could no doubt train myself up on the "modern stuff" and get a better paid job than I have right now, I'm wary of taking an activity I love and destroying it. There is something zen-like in killing the OS, setting up the interrupt table, and bringing the machine up from the ground. I plan, over the winter, to fiddle with my PVR (TMS320DM320 - an ARM-based jobbie, but I'm really pissed I had to snarf the datasheets from a Chinese site - Jesus, TI, what's with that? Don't you want people to support your chips?). Why? Because. Because in my screwed-up mind that sort of thing is actually enjoyable. I guess it would take a very special teacher to understand this. Pity, I've only ever met one. The rest? Like you say, couldn't be assed to update stuff written years ago. Couldn't be assed whether I was interested, or not even there. Makes me wonder why they bother.
It reminds me of various "we need IT ppl" adverts of years gone by.
I'm someone who did study comp.sci. at university but dropped out, after finding that the 'recruitment' agencies advertising the various entry positions did little more than get a list of tech buzzwords from the press/employers and try to match them to the people signed up with that agency (I gave up after seeing an ad for '2 years .NET programming experience' when .NET had only been out 6 months).
As for the Java-being-crap thing, it's a reasonable OO language and doesn't have the pesky manual memory management of C++, and, as posters have pointed out (and as was pointed out to me at uni), the language doesn't matter if you understand the core concepts being taught.
Still I ended up doing IT programming work of some sort, although I'd hardly call programming industrial robots exciting.. or challenging.
Give me a nice Z80 assembly program to debug, preferably one involving interrupts and timing problems.
or perhaps some C or C++, or even some nice serial port programs written in Java.
Boris dip.comp(open) Bsc(hons)(open)(failed)
A CS graduate ought to be able to learn a new language as necessary, and should have picked some up along the way--I mean, unless you only majored in this because you heard you'd get a nice job, rather than because you were actually interested in it.
CS is, however, a mathematical discipline. It is not the same thing as software engineering, even if some departments try to cover both. If you don't like automata theory or lambda calculus, then change your major to something that isn't CS, because that's the kind of thing you should be there for. The idea behind CS is not learning how to write code. That's something else. It's about learning, on a theoretical level, how computers work and how to figure things out about them. Now, maybe a lot of CS departments don't do that. Hell, maybe most don't do that, in which case most of them suck. But that is what they are for.
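As a small, hypothetical illustration of what the theory buys you (the names are invented, and this isn't from any particular syllabus): once you can see a problem as a finite automaton - states plus transitions - the code becomes almost mechanical. Here, a two-state DFA in C++ that accepts binary strings containing an even number of 1s:

#include <iostream>
#include <string>

// A two-state DFA over {0,1}.
// State 0 = even number of 1s seen so far (accepting), state 1 = odd.
bool acceptsEvenOnes(const std::string& input) {
    int state = 0;
    for (char c : input) {
        if (c == '1') state = 1 - state;   // a '1' flips the state
        // a '0' leaves the state unchanged
    }
    return state == 0;
}

int main() {
    std::cout << acceptsEvenOnes("1010") << '\n';  // prints 1 (two 1s: accepted)
    std::cout << acceptsEvenOnes("111")  << '\n';  // prints 0 (three 1s: rejected)
}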
Every single programmer knows the programming language itself is really not a show stopper. It can be picked up very easily. That being said, this article sounds a lot like MBA talk rather than something from an actual programmer. References like "all the Indians know Java and are hence taking over IT", "the CS syllabus is obsolete", etc. suggest the writer is not capable of seeing the big picture and is maybe racially motivated. Anyone can tell that Indian IT resources (outsourced or in-house) are cheaper and abundant. As for quality of work, you usually get what you pay for. I realise that most of the syllabus is actually intended for building foundations, and proper follow-up courses can actually help students understand both the theoretical and practical aspects of IT. I also agree that not all schools have good teachers, but again, at this day and age, people are really not restricted to their curriculum and shouldn't expect everything to be spoon fed.
I started working for a major defence contractor in the early Eighties. Being both dyslexic and dyscalculic, I was forced down the route of being an autodidact for all my education. I was recruited to the company from a supplier. I was producing working code to spec within 2 months. I watched graduates arrive with their shiny degrees and monumental but useless knowledge of Pascal, and struggle to produce anything usable. Not because they were intrinsically incapable, but because academics had shaped their education. At this time most of the computing industry was either using or looking to C for higher level programming, and assembler where necessary. Their entire outlook on programming was down the wrong end of the telescope.
In the early Noughties, I worked as an engineer on an information CD-ROM with a postgraduate designer studying multimedia. We could have been working in pastry production and deep sea diving for all the relevant skills or knowledge she brought to the project. Again, it wasn't lack of aptitude, but more the slow-to-change curriculum she was working with. The need for the course structure and content to pass through the Byzantine approval procedure almost guaranteed its obsolescence.
A theoretical knowledge of programming and computer engineering is as useful as a degree in comedy to a stand up.
The worst code I've had to deal with has come from graduates with all the buzz words and clever ideas, but no idea that writing clear, well commented, clean code is better for everyone.
Not only is it so 'clever' you can't fix it, but it's so clever it doesn't work.
The BEST programmers I have known have not had a degree - far less one in computing (sorry I have a degree, but not a computing one - so I guess I'm half way to being good :) )
The problem though is with the recruiters. You were lucky to find someone with a brain, very very very very rare in the recruiting, HR or middle management layers of the UK.
I did Computer Games Development at the University of Luton (and graduated from the newly formed University of Bedfordshire). Sounds like the most scoff-at-able course in the world. However, we only did Java in year 1 (and now it is all ActionScript for new students) and ended up using C++ from Year 2 onwards (with some C#).
Although mostly unemployed since graduating in 2009, at the moment I am helping people with a unit at the Uni that people commonly struggle with: Object Orientated Programming.
At the time, I didn't even care for the unit, which I felt was mind-numbingly unengaging. However, it is a core part of programming within the games industry (and probably most other programming jobs). Many people think that the unit is very "foreign" to them, as they learn C# and never use it in other units, but by year 3 I realised that most of these languages "do" the same things, just in a different way, and that learning the ideas behind this unit early on helps make a lot of other things just "make sense".
The problem isn't really that everyone knows Java, it is that they know *only* Java. I am now getting the students to basically program the same things that they are doing in another unit (using C++) but with C#, making use of the unique functionality it can provide. I think that on any CS course not targeted at a specific profession this is far more important, as I would imagine that transferable skills (and knowledge) are more important for a job with uncertain career prospects at the end.
Looking back over the last year-and-a-bit since graduation, all I really have to show for my time unemployed is a few contracts for different IT related work and a portfolio. My 2:1 doesn't even get me a reply from most places that I have applied to. So my advice to anyone chasing a job in the IT industry is - don't get an IT related degree. Think about what you want to really do and, if it involves programming, do some self study.
Programming is easy. A REAL degree is something like Applied maths or physics. I would imagine that a candidate with even a 2:2 in either of those and under hobbies "programming" would get an interview with more certainty than a candidate with a 2:1 in CS whose hobbies were "applied maths/physics". So if you do CS, make the most of what the course offers and find a way to set yourself apart from everyone else or you *will* end up unemployed.
So if you want to do CS, think about what you even want to do at the end of it, because programming is not the hardest thing in the world and, really, is not represented by a degree. That is something for your portfolio instead. The degree really should set you apart from other candidates. Hell, even a degree in ancient history would have set me apart (and it's what I originally applied to do), and I reckon I would have a better chance applying for jobs.
I had to chuckle when you mentioned archaeologists' job prospects. When I graduated (some time ago now!) with a BSc in Archaeology I had high hopes of becoming a post excavation bone specialist. But, the reality of a junior site archaeologists' pay shocked me so much I retrained (myself) and became an Access developer. Many years later and I'm a senior SQL DBA steeped in T-SQL and (now) .NET.
Despite my lack of an academic background in IT (though I started out programming 68000 assembler in my teens), my non CS background has benefited me with reasonable social and communication skills which I'm convinced has won me many a contract over my CS graduate competitors.
I graduated from a South African university (Rhodes) in 2006. In my 4 years, I learned VBA, Java, C++, SQL, C#, Haskell, HTML (who doesn't?), networking, computer architecture, operating systems, distributed and parallel processing, algorithm complexity, Turing machines and other state machines, object oriented design and programming, and some other stuff I can't recall right now.
Bottom line: hire South Africans!
I'm not the oldest around, but...
Seriously though, most of what I saw listed there would not excite me as a potential employer, even a hobby programmer will have used those techniques and most of the languages at different times. I want practical experience from people who have fallen over the reality of these things not the theory.
I wouldn't employ any CS grads these days, for all of the reasons given in the article. I would select good science or maths grads who show some imagination. I have been in this industry for almost 50 years and still work providing IT support for small business (and friends, family, neighbours, etc.), making a modest living. I frequently come across gross incompetence by so-called IT professionals who are the result of this dumbing down of IT education. I have provided IT support to schools and I know that from junior school onwards IT is not taught properly, only playing with browsing, Word, Excel, and PowerPoint. All very uninspiring, and most kids could find that out for themselves anyway. The syllabus needs a dramatic overhaul.
The IT industry isn't the only industry that suffers from this.
I find that colleges/universities often push students into these roles when they probably shouldn't be there.
I tried (in the USA) for two years to get into IT courses, only to be rejected every year. The reason was quite simple: because everyone was going for these courses, getting onto the starting courses was a lottery, with many courses getting filled before they even officially opened.
I've also seen some of these starter courses, and let me tell you, they make you sick..
You learn about things such as "The Windows Button", "The reason we have an X in the corner of a program window", "Using your mouse properly".
I did manage to get into a CAD course (hoping it would allow me to get into some of the other courses, but it didn't) and on my first day in I had completed all of the coursework for the term :(..
I gave up on it and went into an industrial course where roughly 97% of people fail... The joy of being one of 2 people who graduate from a course is quite high. The sad thing is that my industrial course taught me more about IT than any of my friends ever learned in college. Okay, one pal has managed to start (and burn) 3 multi-million dollar IT companies. But at least I understand assembly language for several systems.
Sadly, like the IT bods, I have horrid grammatical skills ;P..
My university education boiled down to a set of mostly uninspiring lecturers delivering material that was either too simple, out of date, irrelevant, badly explained or hideously over-complex. The one standout lecturer I can remember unfortunately taught a subject I was unable to handle mathematically (heat transfer calculations). I did get my first IT job thanks to IT interests I pursued aside from my degree, though, so it wasn't as if I lacked enthusiasm.
The major problem I see is that the majority of the jobs don't require a solid understanding of algorithms, and very few need more than a superficial understanding of architecture or OS/compiler design. The delights of capitalism and companies in general means that 'good enough' and fast delivery counts more than proper planning and decent design.
At one point I wondered if I was simply not interested in computing. A moment's reflection on the amount of tweaking I do outside work showed that to be false; it is the issue that there are no role models to push you to succeed, little incentive when you do and a whole boatload of tedious politics to manoeuvre at each point.
Then, there's the matter of pay. The areas that are worth pursuing seem to be poorly paid, and a healthy work-life balance is more important than a perfect job. It doesn't help that I have negative desire to migrate to the countryside-less urban sink that is London.
My plan? To avoid the creeping lifestyle upgrade-itis of the general populace, pay off the mortgage and generate my own independent income stream. I'm tired of the shit from working for other people, and fancy the hard work of setting things up on my own.
I'd like to believe there's a world changing, well paid job that doesn't consume all your waking hours, but personally I'm sceptical that it exists.
I got my CS Hons 20 years ago and although I got a lot out of the course, it's because I was interested in the subject. The bits that made me commercially viable were all self-taught during the period of the course. Many graduates were as useless then as they are now.
Software has been in my blood since I had a BBC, but for a lot of CS students it's a career path and there is no passion there. It makes all the difference.
As for Java, I was quite cynical until recently. I need C++ programmers, but found that when I targeted C++ programmers I got a lot of people who really can't code in OOP fashion, as many evolved from C. I then broadened the net and was interviewing Java coders. They wrote in OOP fashion because that's all they know, and they were able to transition to C++ in a heartbeat. Now I favour Java developers for C++ positions. Strange, but true.
So although I will continue to take the piss out of Java, as a learning language it's excellent. And the reason we don't code in Java is because there is a performance aspect to what we do, and Java stuff runs at half the speed or worse. Other than that, it seems quite nice.
I also believe that students should be taught about the internals; it helps. But again, this has always been taught to varying degrees. I remember to my dismay, 15 years ago, having to describe the difference between heap and stack whilst helping one of my development groups trying to identify the cause of a memory leak and looking completely in the wrong place.
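For the graduates reading along, the distinction that group was missing is roughly the following - a deliberately simplified C++ sketch (invented names, not anyone's production code):

#include <vector>

void stackExample() {
    int local = 42;               // plain local variable: lives on the stack
    std::vector<int> v(1000);     // the vector object is on the stack, its buffer on the heap;
                                  // both are released automatically when the function returns
    (void)local;                  // silence the unused-variable warning
}   // no leak possible here

void heapLeak() {
    int* buffer = new int[1000];  // explicitly allocated on the heap
    buffer[0] = 42;               // ... some work ...
    // missing: delete[] buffer;
}   // the pointer dies with the stack frame, the heap block does not - that's the leak

int main() {
    stackExample();
    heapLeak();
}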
Something occurs to me which a few people have mentioned in passing but I think needs bringing to prominence.
1) The size of the IT/CS field has changed, but I'm not sure expectations from companies & recruiters have quite tracked reality. Consider the job adverts mentioned, which:
* ask for literally impossible amounts of experience in certain technologies (or specific versions thereof)
* fail to understand the similarities between different technologies (if you're looking for somebody with experience of Oracle and the role isn't a DBA, is it such a big leap to consider somebody with experience of DB2 or SQL Server, or even a different version of Oracle?). Even bolder: if you want C# developers, why not consider experienced Java developers - not that big a learning curve.
* fail to understand the differences between technologies - hiring a C++ developer to work on your trading system doesn't mean you automatically get somebody who can admin your database and design your web UI as well. Which brings me onto my next thought...
2) If you consider other professional fields such as law and medicine, nobody questions the concept and value of specialisation. If I have a heart problem I see a cardiac specialist, not a dermatologist. If I have a company contract dispute, I don't see a divorce lawyer. Most people accept this as a valid approach. I would agree that keeping your skills up to date and possessing a curiosity about the field are valuable attitudes - but I do wonder if this willingness to turn our hands to any problem doesn't cost us some perceived respect, when non-technical people assume it must be easy since we could do it.
Which brings me to my final point
3) All this love of the field and continual updating of skills and training is fine. But how much of it is on our own time and expense, and not our employers'? How many lawyers do you suppose practise litigation in their spare time, or doctors perform surgery on friends and family members to try out new things, or even accountants play at bookkeeping for fun when they get home, just for their love of the field? I'd imagine they normally get this kind of training and development on their employer's time and dime. Google's idea of 20% personal time to experiment is great, but how many companies are brave enough to do that?
Here it is the same. Most young people know nothing about the principles. These days, too many "IT professionals" are really just boys/girls who can type a letter or send an email. The lack of understanding of basic stuff is unbelievable. E.g. an expensive software "engineer" does not understand how to write clearly structured code, and a senior tech cannot understand the simplest command line (I guess they have never used a CLI).
AC because I may offend someone I do not want to (yet).
Our so called "education" system has failed long ago.
It's the understanding and analytical skills behind you that are more important! I studied at the University of Manchester and over my four years gained a HUGE amount of basic skills!
Employers want to see broad skills - grads are NOT expected to be experts, but they should be able to pick up stuff quickly! And that's exactly what my Uni offered me!
My shallow skillset means I have some experience of the following:
Op-amps (as A->D converters), silicon transistor design (why the p region needs to be about 2x the n region), constructing gates from transistors, constructing adders from gates, and then 16-bit adders, microprocessor architecture (buses, ALU etc.), ARM assembly (interrupt programming, memory mapping, Thumb etc.), OS architecture (paging, cache...), networking/security (DHCP state machine, FHSS, DSSS, 802 packet structure....), C/C++, Java, VHDL.
I've forgotten most of it - but in reality it's still there in the back of my mind. As a grad I'm confident I can go into any job and be reasonably confident at it!
Unlike the supposed grads who just know Java and have done something silly like Business Technology crap.
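Since "constructing adders from gates" sounds abstract on paper, here is the whole idea expressed in software - a hypothetical C++ sketch (names invented) where each gate is a boolean operation, a full adder combines them, and sixteen chained together give a 16-bit ripple-carry adder:

#include <cstdint>
#include <iostream>

// A full adder built from "gates": sum = a XOR b XOR carry-in,
// carry-out = majority of the three inputs.
void fullAdder(bool a, bool b, bool cin, bool& sum, bool& cout_) {
    sum   = a ^ b ^ cin;
    cout_ = (a & b) | (a & cin) | (b & cin);
}

// Chain 16 full adders together to get a 16-bit ripple-carry adder.
std::uint16_t add16(std::uint16_t x, std::uint16_t y) {
    std::uint16_t result = 0;
    bool carry = false;
    for (int i = 0; i < 16; ++i) {
        bool sum;
        fullAdder((x >> i) & 1, (y >> i) & 1, carry, sum, carry);
        if (sum) result |= static_cast<std::uint16_t>(1u << i);
    }
    return result;
}

int main() {
    std::cout << add16(1234, 4321) << '\n';   // prints 5555
}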
Reading that hurt. I don't know if it's my instinctual reaction, as a CS student, to take a little offence to some of the allegations in the article, but I'll shrug it off!
I guess I must've chosen a good university in Cardiff, because all the things mentioned in the article that CS grads are lacking, are all taught here. I'm currently in my third and final year and we've had modules on Operating Systems, Algorithms and Data Structures, lots of database modules (object orientated stuff, SQL, web), several maths modules, and I've never had a module about computer ethics.
Sure, our main learning language is Java too; universities have to teach at least one language in order for students to be able to implement the techniques that they learn about. They don't teach in-depth knowledge of lots of languages, because programming skills are very transferable once you've got them down. We did however have modules that touched on C/C++, web languages, and MATLAB.
The way I thought it would be, for companies looking to employ graduates, is that they'd look for a graduate with a strong foundation of all things computing, and programming, and they'd train him in the particulars (and he'd pick it up fast if it was a new programming language, or something). They are after all just graduates, if you're looking for an experienced programmer who knows 15 languages, you're probably not going to find that in a graduate, I would have thought?
Obviously the author of this article knows more about employing CS grads than me, but there's no chance that anyone with a half-decent degree classification from my course at my uni would fit the image portrayed in the article.
I have other experience. When I told companies I graduated in hardcore math and hardcore algorithm theory and hardcore CS (including C/C++/asm/etc), no one was interested. All they asked about was whether I knew how to do web programming and applied stuff.
-Can you do a web page with some scripts?
-Ehmm.. No, but I can learn in an hour
-Thanx, but we need web programmers. Good bye.
That made me regret that I didn't do much more applied stuff. But now I am satisfied; I made the right choice. Dumb companies, not to appreciate my skills.
Option 1:
Go to Uni and do a 3-4-5 year course with no huge prospect of a job. Fund the Uni through taxes and levies and fees, employ postgrads and add a dash of income just in case there is a reader, research arrangement at said Uni.
Option 2: (sandwich course model)
Place the onus on industry with inducements (say, half the fees that the Uni gets?) and a bob or two for the sandwich-munching learner at places of employment.
Twiddle these places of employment so the learner is not stuck in one place for the duration, and couple that with (paid for?) visits to reach-out marketing initiatives put forth by the likes of Adobe, Apple, Microsoft, IBM (? do they still do that?), ..
In other words
(a) a wholly confined within Uni course
or
(b) a placement based training initiative with rounds of employers, rounds of conferences and rounds of visits to Uni to pick up the theory.
There may be other alternatives but the extremes above seem reasonable with one option being outstanding (but I would say that wouldn't I?)
The trouble is: it treats CompSci or IT-related courses almost as an NVQ in plumbing?
FYI, Java is the #1 language requested in job advertisements, so it's not surprising that it's the one taught in universities. See http://langpop.com/ or http://www.tiobe.com/index.php/content/paperinfo/tpci/index.html . I agree with the author's supply & demand rationale, but in practice the highest demand is for Java.
I'm a Java developer [surprised? :)] (graduate of 2003), employed, and whilst I disagree with the author on language choice, I agree that so many new graduates have no idea how basic elements of computers work.
To be a really good programmer I'd expect candidates to have *curiosity about how things work*.
Most coding is - to be honest - the blue-collar work of the 21st century; it can be easily taught to a 'good enough' standard and then carried out by the Indians/Chinese.
Always assuming 'what to code' has been adequately specified !
The sorts of things I look for in a compsci graduate are more business-skills based as well as the techie stuff. Sure, you might need someone who knows about real-time programming and interfacing to the real-world, but you also need people with decent skills in business/systems-analysis, project-management, risk-management. That's actually far more important a skill-set than just 'what language you code in'.
Being a secondary school teacher of ICT, I object! Most of the time our job is to teach students how to use computers - not Computer Science.
Computer Science is only covered at A Level. This is because of the National Curriculum (mostly - yes, some ICT teachers couldn't program if their life depended on it).
What use is an in-depth knowledge of computer science if a student can't do basic functions on a computer first? (They can't when they start at my school.) My school has a programming club and a computer animation club, yet this is not our main objective; our main objective is to teach students to communicate using IT.
I consider myself to have more than half a clue about computer science (having studied CS at Reading - again, same problem with lecturers). Don't blame the teachers that there aren't any good CS grads; we barely get to discuss programming with them before they're out the door!
Surely basic computer use should be a short course in basic computer use, not a full 2-year GCSE in Information and Communications Technology. You do have 3 years to bring them up to speed before they 'get stuck in'.
I did a GCSE in Information Systems, many years ago, and actually using a computer was pretty much an aside to the course material. Ok it was pretty simplistic, but it was at least vaguely geared towards IT concepts and theories, not just office skills.
It looks to me like the guy is looking for specific skill sets that he can easily slot into particular job opportunities. He's a sodding recruiter of course he does, that's where his money comes from.
I'm sick to death of employers complaining of skill shortages, especially among the newly qualified. If they are finding it difficult to get someone to do a job then they should either train somebody to do it, pay the going rate for someone who can do it, or STFU.
We at EduGeek have been saying this for quite some time. It is to our great shame that the UK has the best IT equipped schools in the world, yet the curriculum is completely useless in its realisation of the skills required in the real world. Secondary school children are taught what would have been called 'office skills' (Word, Excel, PP etc.) by a collection of former English, maths and science teachers. Very few schools are lucky enough to have 'techy' teachers who know anything beyond point-and-click. I left school at 16 in the mid-80's and had taken 'computer studies'. This course dealt with all aspects of computing including having to learn a programming language and produce projects using this language (BASIC), how processors and other components work, how computers operate in larger environments and a whole raft of areas that no longer get taught in the ICT curriculum now delivered in our schools.
Partly this is down to the lack of specialist talent teaching the subject in schools, and partly to the curriculum which doesn’t demand that talent needing to be present in the first place. I recall once being told that ICT in schools is about ‘teaching and learning’ and that the pupils learning of processes and problem solving were what mattered, not the fact that they were being taught a subject that overlooked 90% of what it should be doing.
Really long comment thread; and I have a lecture soon so sorry if I am repeating points that have already been made.
I did not do Computer Science (or anything like it) at GCSE or A-Level, my school didn't offer them; so it is kinda nice that we are being gently introduced to programming. I am so sorry to all of you who spent your childhoods programming in your bedroom but I didn't. When you start university you have people with vastly different backgrounds and educations who are all at different standards which is why the first year does not count towards your final grade so that everyone can be brought up to the same standard.
Secondly, I spent my gap year working in the IT department of a stockbroker, and most of the problems I faced required little knowledge of the inner workings of a computer. Mostly I required a knowledge of people and the software they were using. Furthermore, I picked up any knowledge I needed along the way; when I joined the company I had never used a server, or AD, or a BES, nor had I ever written any HTML. By the end I was managing our corporate website and all the company BlackBerry devices. I feel that there will always be an element of on-the-job training with IT, as there are simply too many different operating systems etc. to teach them all at university.
Moreover, one of the first things we were told when we rocked up to our first programming lecture was that it was almost impossible to teach programming; their aim was to provide an environment where we could teach ourselves. So far I have been happy with the teaching I am receiving; but I'm only a month in so we will have to see. Btw we are learning C atm.
The teaching at my school, on the other hand, was horrific. I did my A-level Physics coursework research essay on how a CPU works and was appalled by the fact that my teacher could not explain what a transistor was. The IT teaching at the school consisted of the ECDL (European Computer Driving Licence), which was a complete waste of time.
BUT most students going through the education system will only ever use Word, Excel, email and the like, and so have no need for an in-depth knowledge of computers. Have a go at CS degrees, CS A-level and GCSE, but ICT teaching in schools is mainly for the masses (as has just been mentioned, I believe).
Did have more to say but the aforementioned lecture is approaching so I better go;
The ECDL should be up before the Trade Descriptions Act. A licence is a legal or contractual permission to do something, performance without such licence being a breach of civil or criminal law. The ECDL is *not* this; it is a statement of (minimal) competency.
As an IT professional working in an academic environment, it is rather unfortunate that I have to agree with many of the points raised.
However, I can assure you this is not yet a lost cause. Here at The University of Sheffield we have 40+ computer science and software engineering students engaged in our Genesys Solutions module each year (plus a few additional students from other disciplines). The students are supported by epiGenesys, a University spin-out company, and deliver software solutions for real clients. They also program using Ruby on Rails, definitely not Java!
More info here: http://www.genesys.shef.ac.uk
Our youngest spawn is doing CompSci (and Genetics)* at a uni in southern New Zealand. They make them all do a paper in written expressive English (which has done wonders for the spawn, who is enjoying the effects) and they don't get a choice in terms of whether or not to learn C++ - they have to (spawn loves it to bits). Back home, people who don't learn how to program are not CompSci grads; they are doing IT, and Information Technology is for Java 'programmers'.
So, time to differentiate the real programmers from the rest, stop calling both by the same name. And teach them how to write and that spelling is important, it isn't hard and the English Dept will thank you for the work, most especially now the govt don't fund them any more.
*Bioinformatics IOW, LOTS of demand for proper bioinfo people, nice salaries too. It's hard work doing a double major mind. So not for the faint hearted. Mrs Muscleguy (Maths and Compsci) and I (Biology BSc and PhD) are proud of our mutant hybrid spawn.
"This is in spite of Math/CS being in the top band for lifetime earnings."
So what.
Do you know the saying: "A hacker is someone that will do for free what another won't do even for money"?
Money will not motivate people to be interested in technology. And if it does, then only to the extent of making enough money for the least amount of effort. Hence all the Javanese coffee drinkers. They see it as being enough to get a steady job and not having to torture themselves with the more complicated stuff. It is the most cost-efficient way: the least pain for a good enough gain.
Someone with an actual interest in technology will see earnings as secondary. In order to make the effort of understanding the bowels of the system or creating something good one must be driven by a genuine curiosity to find out how it works and how it can be improved. Thinking about money and earnings will hardly be of any help in this undertaking.
I feel this is a misrepresentation of the university I am at. I did learn Java in my first year, and as a newbie it was a good introduction. We also learn quite a few server-side and client-side web technologies, along with MIPS and quite a bit of theory.
I am currently doing C and operating systems (the current project is memory management in C++). I think I am taking Advanced C++ and Advanced Java next term, with a few other things.
Yes, funding could be better, but that is a problem for everyone, not just CS.
The comment about not being ready for the working world might be true, but we learn such a vast range of stuff that, on entering any company, we would always have to retrain for whatever that company is doing.
I just feel this article was a bad, one-sided vent without accurate facts.
I think that I must be getting old, but when I started in this business, there was no such thing as IT (it was known as data processing, which was exactly what it was), and you needed to work for a company with a few mill to drop on the kit. This mostly had a three-letter acronym on the side, was blue, and lived in a big, air-conditioned room. The company which produced this kit also published a big book, known as "Principles of Operation". If you could be bothered to read and understand that, the world was your oyster. Still is...
These days, I mostly write "C" (yeugh!), but it is always satisfying to tell my colleagues exactly what the machine is doing when they get the always-entertaining S0C4 following a crap pointer.
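For the youngsters: a S0C4 is the mainframe's protection exception, the moral equivalent of a segfault. A minimal sketch of the sort of crap pointer that earns you one (written here as C-flavoured C++ for illustration; on z/OS you'd get the S0C4, on Unix a SIGSEGV):

#include <cstdio>

int main() {
    int *p = nullptr;           // never given a valid address
    // Dereferencing it asks the machine to touch storage we don't own,
    // which is exactly what the hardware protection check catches.
    std::printf("%d\n", *p);
    return 0;
}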
But I digress...
In the modern world, there are toolchains and operating systems for free, with all of the source code. There has never been a better time to learn whatever you want about how computers work with very little outlay to yourself. If someone is interested enough, they can learn about anything they damn well please, from DSP to vector graphics to relational databases.
You can even have a mainframe on your laptop!
If you are getting people looking for jobs who can't be bothered to take a little interest in their subject beyond the spoon-fed stuff, then I suspect you might be searching in the wrong places.
Degree? I know lots of excellent people with none, very few with CS. Mine is Chemistry, so I had to educate myself as regards computers. Experience counts, so get some. Join an open-source project, contribute stuff. It won't kill you and you might learn something useful which can translate into $$'s down the line.
Somehow I don't see decent software going out of fashion real soon now!
I graduated this year only to find myself completely unprepared for employment. In many cases my lectures were simply an exercise in staying awake while the lecturer read Wikipedia articles out from a big screen, or had pasted them into slides with the links still in from years ago; some lecturers were not fluent in English, and at one point there were 800 students for a 30-computer lab. I found cheating in exams and coursework was rife, and I considered quitting university because of poor support and apathetic staff who closed ranks and penalised grades when concerns were raised. I even had an angry head of department yell at my parents (!) from China after I raised a concern about teaching continuity.
Most of the people who graduated with me had no clue about computers, and some were simply unable to read or write in English at all. I felt utterly cheated by a system I thought would teach me the skills I needed, but which took my money, frustrated me with useless projects like having six months to make a house in Second Life, and then printed me a useless piece of paper at the end.
Thankfully, when that rather expensive piece of paper got me an interview, I was able to show that I could do more than just drool and mash the keys with my face.
I think comp sci has gone too mainstream. My experience is that most uni students have not spent their younger years programming as a pastime (i.e. no deep interest in computers/dev). Rather, they may be academically smart and one day decide a comp sci degree is the way forward.
I laughed when a lecturer said this, but now I do agree: development/IT/design is a very creative task and really can't be taught (similar to creative writing or music writing). It takes hours upon hours of practice driven by a solid passion for the subject (something uni cannot teach and you cannot pay for).
Java bashing :/ Although there is a great correlation between Java developers and bad developers, just remember correlation does not imply causation. Java's a great language, just like C++ and, er, C... Java is just being abused/misunderstood :/
So I'm unemployed and I'm finding it very hard to get into a new job, and I'm not being picky either.
But I'm 'old' (48), so I guess I'm on the scrap heap already. Oh, and I have Asperger's.
How does that fit in with all this?
Oh, and in the last job I had, half the team were from India and they were truly awful, with no real idea of how computers and systems actually worked.
Anyone got a job going in London?
I can tell a lot of people posting are old-timers like myself. I've been a PAID computer programmer for over 20 years, and I played with computers for maybe a decade before that. I do remember my written language skills being sub-par at one time.
One of the early jobs I applied for had a very simple coding exercise. I think it took me three minutes to code, at most. I asked for the next piece, but was told that that was all. Apparently this tiny bit of code flummoxed all the other applicants. Now it's not that I was especially bright; it was that the wannabe coders were, well, unable to do anything real.
We are spoiled today. The old Apple ][ I played on took several minutes to load up a game, a game that would barely qualify as an applet now. We had floppy drives with disks whose capacity was much, much smaller than what you have now, and hard drives were all but non-existent.
So, we had to learn to code, code that was smaller and more efficient. You were encouraged to look for ways to save time and space. The Y2K "bug" was not a bug at all; it was the result of people continuing to use code that was not upgraded to deal with the current realities. It was never intended to be used as long as it was without replacement. Perhaps the "bug" was the lack of documentation, but really it's about planning. The code should have been properly replaced years before, instead of in a panic at the end.
The thing is, so many programmers are taught using suites that really hold their hands. I'm personally spoiled in that I can search the web and retrieve coding examples of so much of what I need to accomplish that I don't have to reinvent the wheel. The difference is that I can ultimately come up with a solution myself in the vast majority of cases, but since I'm using libraries of code for some fairly complex things, it makes sense to use something if it's already available. In other words, it's cheaper for my company for me to use something someone else made rather than waste my hours accomplishing the same thing. It's why we don't all write our own word processors.
I'm actually somewhat happy that so many newbies suck. Really, it makes my job that much more secure. There will always be a percentage of the population that loves to code; the lot that want to read a "programming in 24 hours" book and act like professionals SHOULD fail and leave the coding to the people who spend the time to think before they code.
I do agree that C++ is a pain in the butt. I happen to loathe pointers, and I do admire the people who create systems that allow me to fob that bit off on the compiler and let it handle the memory management and garbage collection for me.
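To be fair, newer C++ does let you fob a fair bit of that off on the compiler. A minimal sketch of the difference, assuming nothing beyond the standard library (the Widget type is made up for illustration):

#include <memory>
#include <string>

struct Widget {
    std::string name;
};

// The old way: the programmer owns the pointer and every exit path.
void old_school() {
    Widget *w = new Widget;
    w->name = "manual";
    // ...anything that throws or returns early between new and delete leaks
    delete w;
}

// The fob-it-off way: RAII via a smart pointer (C++14 for make_unique).
void let_the_compiler_do_it() {
    auto w = std::make_unique<Widget>();
    w->name = "automatic";
    // the destructor runs when w goes out of scope, on every path out
}

It isn't garbage collection in the Java sense, but for most day-to-day code it removes the manual bookkeeping being complained about here.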
Kinda reminds me of when I was 14, in the '80s, and went into a music store and started messing with the computers hooked up to MIDI interfaces. A sign on the C-64 said it was broken or something, but the screen looked fine, so I did that POKE command and the music software loaded up just fine. The store manager came rushing over just as my mother walked in and saw us talking.
Without listening, my mother instantly apologised for me messing with the stuff, but the manager thanked me for getting it up and running, and I gave him the command to start the program.
Also, our computer teacher was fired and I was put in charge of the Tandy Model 3/4 star network.
After leaving the music store my mother blushed, saying "GOOD JOB!"
Too bad we do not have the passion anymore in the USA.
I'm a self educated programmer. I chose that road because I suffer from severe ADD and a classroom environment makes it nearly impossible for me to learn anything. Despite the fact that IQ tests put me in the 'borderline genius' area I barely made it through high school because of it.
If what you're saying is true, my attention deficit, self educated ass is more qualified for most of the programming jobs I see than the typical CS grad.
On the other hand, maybe it will help the next time I go on a job hunt.
Anon out of sheer embarrassment at not having finished college, even if I do have a reasonable excuse.
I left uni (an old poly) eight years ago and I really enjoyed some of the stuff we did (C, C++, Java, Prolog, Lisp, VB, various weird maths languages), as even though a lot of it probably wouldn't feature in my career (Prolog being the most obvious) it was very interesting to see the different approaches to programming. I feel sorry for any CS/SE students who are stuck using just one language. That said, another student and I taught ourselves C# in the final year and were the first two to get a job (everyone else was only competent with Java, and the market seemed flooded even back then).
I have noticed a lot of "don't need a comp-sci degree" comments; I beg to differ in some respects.
In general I have noticed that those without the wide range of training from a compsci degree tend to ignore any area other than their own when performing analysis. I have had designs and programs given to me by some professionals that fail the basics of flexibility and modularity, and that can stomp all over everything else in the infrastructure.
Unless you get that breadth of exposure to the areas of computer science through formal training, you leave yourself open to making schoolboy errors.
This article is patently false... Have you researched all of the comp sci courses at British universities and averaged their content to see which programmes offer the most varied range of subjects?
I studied Computer Science at the University of Reading, graduating in 2005. During my time reading @ Reading (ba dum tish!) I studied everything you are whinging about: C++, C, Java, Pascal, C#, POP11, even CAML. We had a thorough grounding in formal database languages and SQL. We studied operating systems in C (including their algorithms), and formal algorithms in, well, pseudocode, amongst other interesting and completely varied subjects.
It's simply not true that we don't learn "proper" programming any more. Taking a sample of one university and bemoaning that all of us cannot spell is irresponsible journalism and scaremongering.
Boo-urns sir, boo-urns.
Not to come across all Daily Mail on the Reg, but one reason for this attitude towards CS (and related subjects) is the third of the class from China. With a few exceptions they could barely get by on any of the current course material, with the ones who could doing the coursework for the ones who couldn't.
This led to several "banker" exams, stuff so easy that most people who spent more than five minutes studying could ace it; it shouldn't be possible to get 100% in an exam worth 40% of a double module unless you were at Turing's level! However, without these banker exams most of the foreign students would fail, then probably not pay fees, or next year's intake would dwindle, and so on and so on.
Not that this is just a "bloody foreigners" situation: another third is made up of time-wasting Hugos and Rorys who are taking CS subjects because they wanted an excuse to party for three years and daddy said it had to be a useful subject. These would only get 2:2s were it not for the banker exams, and they do not want to study, they simply want to pass the exam, forgetting everything they learned the previous year by the time they roll up, let alone when they finish the course. If these types had to learn C++ they wouldn't bother turning up and the uni would lose its funding. The same kind of thing happens with the "video game design" course that everyone joins up to and then panics about when they have to do lessons in the maths of 3D movement.
This leaves the remaining third. These are people who actually want to be there and will learn the material they are given. But that's the point: the material they are given. Very few will look at other technologies, methods or areas, and the problem is that when they do they aren't exactly encouraged. This is because the marking guidelines are becoming more of a tick-box system: if you don't include X, Y and Z then your mark stays in the 2:1 range, no matter what else you've done.
Then of course you have the usual problems: image, pre-uni CS being crap or non-existent, lack of funding unless mummy and daddy pay your way, and to top it all off, if you finally do have a talent for it, to the point that you get a letter from the exam board with your dissertation results praising your excellent and exceptional dissertation and hoping you'll continue your studies, THERE'S NO FSCKING FUNDING! So unless mummy and daddy will pay for your masters, you're screwed.
A class of 2010 AC who, unlike most of my peers, did study and now has a job doing what I love, while they are using their degrees in the local call centre, daddy's accounting business or PC World.
I pretty much ignore the CV. It's either made-up, exaggerated, or the poor victim's been fooled into thinking they know something.
Anyone applying for a coding job gets sent a little exam paper, which they can complete at their leisure and bring the answers with them to interview. (Or, if there are lots of applicants, send them in for pre-screening.)
The point is, they can cheat. They can ask their mates. Or use a book, or, crikey, even the Internet. I'm not really interested in the answers - often there isn't a right/wrong answer anyway - but I am interested in their initiative and problem-solving skills. That's what they'll need if I employ them*.
So far, over a couple of decades, none of my successful ones have had a degree in anything remotely related to IT. (Though I do, and that's taught me that I don't know enough to do the coding myself!)
* although in one case I married the applicant instead :-)
I'm a lecturer in CS and I agree: the truth about a lot of UK degrees (in particular in ex-polys) is that they aren't worth the paper they are written on. Students are taken on courses even if they can't read, write or do even the simplest of sums. Lecturers are given "targets" for pass rates, so they make exams and coursework as simple as possible: the pass rates go up, who cares how, and everything is fine. In many cases they abolish exams completely, so you have 100% coursework modules, where everyone knows cheating and plagiarism are rife. But who cares? Pass rates are up, so that's ok. Final-year projects (when they are not bought off the net) often consist of three webpages put together in as many hours, but that's ok, we call it "web development" and give them a 2:1 because we have targets to meet.
Headhunter fails to understand what a degree in Computer Science actually is.
It's not a degree in programming. If you want a programmer, go find someone who studied physics, who would have programmed simulations in C. They won't know any of the underlying principles of computing though.
The real problem is that of getting someone who thinks - and we are seeing the results of schooling to pass tests rather than educating to think. Rote learning is a medieval system akin to monastic chant memorisation, yet it is the best route to getting pupils to pass rudimentary SAT exams every other year. Thinking critically now appears to be something that is learned later in life, if at all.
Someone who can think can adapt to any language. Indeed, languages are easy, and most courses should be teaching one C-style language (and yes, Java counts here), a functional language (ML, Haskell, F#, etc) and maybe some others, such as scripting languages. Most universities will teach the basics; learning the APIs and learning other languages is *expected* but not taught.
And 'ethics': sorry, a headhunter might not understand why this is present, but even Cambridge University teaches ethics, and has done for a very long time.
Seriously, how many people 'recruiting' just automatically 'require' a certain level of degree? I often - even after 25 years of experience - get asked "What level of degree did you get, a 2:1 or above? My client only wants people with a 2:1 or better." That conversation leads to an abrupt withdrawal of my interest. I don't want to work for someone who is interested in the paper I got over 25 years ago and not in what I've done with my life since!
What about looking for people who have gone through apprenticeships? ScrumIT, for instance, runs a very successful apprenticeship scheme on the internet, turning out programmers with a wide range of real programming experience and no 'ethics' training.
What about looking for people who have taken this up as a hobby, maybe completely dropping out of school with no qualification but a wonderfully honed programming ability?
While we are at it: just because I have spent 25 years doing embedded code doesn't mean I couldn't possibly understand and write your financial application. Just because I've project-managed the development of a telephone doesn't exclude me from project-managing a bunch of people writing a pension application (I might not find it thrilling...). The rules that apply to middle management and operatives are never applied to upper management: you can sink a high-street bank and then get headhunted to run a national chemist chain, you can make a hash of running a government and invade several countries then get employed as a peace envoy, you can be in charge of education one moment and the foreign office the next, or in charge of defence having never worn a uniform, or move from running a hospital to running a software company (Symbian in that case)... but you can't go from writing C++ for a phone to writing C++ for a pension fund, or from writing C++ to C# or Java... it's all wrong.
The problem is not the useless universities or the second-rate graduates; it's the stupidly narrow-minded recruiters.
Still looks pretty good:
http://studium.ba-bw.de/index.php?id=210&no_cache=1&tx_babwmod_pi1[sp_uid]=8&tx_babwmod_pi1[so_uid]=55
Use translate.google.com to translate the keywords. Here in Frankfurt it seems they train XML monkeys, but my Uni still has Compiler Construction. It seems I made an excellent decision.
Today's CS graduates are, generally speaking, little more than script kiddies. When CS syllabuses started including psychology, business, and all sorts of other 'soft' skills at the cost of learning OS fundamentals, hardware, multiple high-level languages or the underlying math behind various methods, it was clear that the end was nigh. When I left college in '90 there were already Post Graduate Diplomas in IT; those folks were actually fooled into believing that having their PGDIT qualification in addition to their Arts or Social Sciences degree made them computer scientists. The problem was, of course, that the 'skills' they learned on this PGDIT were entirely end-user. Today it would be the equivalent of a post-grad course that teaches Excel, Word and Access and a little bit of macro work, while telling the students that this is fundamental tech stuff and a good grounding in everything. HA!
The thing about all those 'useless' languages that I learned is that they taught me method, process, analysis and ground-up programming. Learning assembly language, or literally entering data directly into the console of a truly ancient PDP-11, taught me about the connection between what we see on a screen and what goes on inside the box. Remember that box? Building single-board systems taught me system fundamentals. Statistics, mathematics, numerical analysis: these all taught me basic tools which I could use to make new things. Teaching me a broad range of software engineering skills instead of a specific method, teaching me fundamental database skills like data analysis and normalization, teaching me how a compiler and an assembler work, and then making me build one... oh, I could go on. There were so many things we did on my ancient CS degree that I know for certain are no longer taught. Granted, technology has moved on, but underlying everything, under all the layers of abstraction and scripting, the same fundamental things happen, and have to happen. There is no other way.
How many kids coming out of college/uni with CS degrees today could program an efficient binary sort? How many even know what one is? How many could recognize the need for such an algorithm? How many could build an assembler or even know why it's necessary to have one? How many understand the concepts and implementation of virtual machines so necessary for Java?
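For what it's worth, and assuming the poster means a binary search over sorted data (the usual reading of "binary sort"), the whole thing fits in a dozen lines of C++; the hard part is knowing why it requires sorted input and why it runs in O(log n):

#include <vector>

// Classic iterative binary search: returns the index of target in the
// sorted vector v, or -1 if it isn't present. O(log n) comparisons.
int binary_search_index(const std::vector<int>& v, int target) {
    int lo = 0;
    int hi = static_cast<int>(v.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // avoids overflow of lo + hi
        if (v[mid] == target) return mid;
        if (v[mid] < target)  lo = mid + 1;
        else                  hi = mid - 1;
    }
    return -1;
}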
I find myself being one of the few tech folks where I work who even has a feel for the stuff we work with. I mean I can see an application or a system behavior and diagnose it with ease; I know what is going on under the wrapper, I know how the system is built and how it does what it does. I can interpret the behavior and determine the issue far more quickly because of my depth of understanding: not because I am trained in the specific system, but because I understand the principles. My younger colleagues, with degrees a decade or more newer, are annoyed and not really impressed that the old greybeard can do this, because they cannot. It's pure jealousy.
The worst thing, though, is that as time has gone on, my experience and understanding have become a liability. I'm no longer as near the hardware as I was when PCs and networks were younger. Now there's a whole department to manage the servers and networks, and security teams to dream up new and pointless regulations and restrictions on what we can and can't do. It's got to the point that, because so few people on staff have a real depth of understanding, even though I *know* a solution, or *know* how to get something done because I actually understand how the system operates and integrates, and even though I am in a position to literally hand the resolution over, I am not believed or trusted to have the solution. It's obviously impossible that I could have a solution to a problem I have never seen before. So we have to go to the vendor, with some 20-odd-year-old spotty-faced youth who looks things up in his knowledge base for a living. And because the spotty youth doesn't know, we have to wait for the second-level support tech, who scratches his marginally older head in doubt and refers it to the devs. Eventually the developers, or some greybeard in a tertiary support role, get wind of the issue and tell us what to do, which is exactly what I've been urging the server folks to do since day one. But they simply could not take my word for it and had to wait for the vendor to tell us.
Sometimes knowing too much is both a professional and a personal liability. The stress that comes from knowing a solution but being ignored in preference to a third-party vendor is simply immense. I put the blame for this squarely on the shoulders of fashion-conscious CS programs trying to attract more students. In my humble opinion they'd do better going for quality over quantity, but that's not how the funding formula works, so instead they go for the broad base and attract as many bums on seats as they can. Watering down the curriculum to match the average quality of the incoming students is a necessary evil in that system. In terms of producing good graduates with skills that will serve them all career long, the high-quality approach would serve the industry and the individual better. Let the arts and social sciences have the post-grad diplomas in IT and let them take all those junior roles that require some tech exposure, but let's go back to basics and start educating more computer scientists who can go on to become the core IT professionals and teachers of the future.
</rant>
Happened to me:
Me (wizened 40-year-old): Here's a draft example of how user access permissions should be set up throughout the directory tree of user and shared data.
Pimply-faced Youth: No, everybody will all log on with the same name. You're just the end-user, you don't know anything.
Argh!!!!!!
When I went to the U of Iowa (graduated 2000), I got a very good CS education *despite* the main stuff all being done in Java. We had algorithms classes (O(N^2) versus O(N) and all that) and an OS internals class (which did in fact use Java; I thought that was pretty messed up, but it did have a functioning scheduler and all that good stuff). The saving grace in terms of programming was the "Programming Language Foundations" class, where we learned 7 or 8 languages (of 3 or 4 radically different types, including SML), so when a new language came up we'd have experience with a similar one and could pick it up fast. In one class we designed a CPU from the ground up. I took a class on compiler design and one on parallel programming. And of course the usual stuff that a comp sci major should have.
The interesting gap was HARDWARE. When I was a senior (1999-2000 time frame), some people in my class still didn't own a computer and in fact didn't know how to actually turn one on. We had (academically) designed a CPU, discussed the bus, the FPU, even instruction reordering and pipelining. But not even in the hardware classes had anyone brought in a PC, popped open the case, and pointed out what was what. I think partially that was considered something you'd go to a community college to do, and partially it was just assumed people would take an interest and rip into a machine themselves. Of course, I am a dyed-in-the-wool geek, so I'd built my own PCs from components and this would have been review for me. But I think it would have helped some of them make the academic stuff real.
All you bright, shiny management types... you whined about geeky, nerdy CS/EE types not being able to communicate, having no social skills (10+ years ago). Here you go. You get WTF you asked for.
Basic, simple memory management too hard? Hey, use Java. Java really is the new COBOL.
But it's OK. Just hire those East and South Asian wizards: cheap, indentured servants, and super-qualified, right?
What? Resumes/CVs padded? Websites and blogs shamelessly plagiarized from the real creators/inventors? Can't communicate? No social skills? Wrong timezone?? Follow-the-sun not quite working?? (Oops? Nah.)
Not to worry.
Unis pandering to corporate money and dumbing down their curricula? No problem. Quantity, not quality, right?
But wait. CS/EE/SWeng/etc salaries down? "Career" lifetime under 10 years?? Professional SW engineers forced out at age 35 because they're "too old/obsolete"? The real best & brightest observe this and avoid even the best CS/EE/SWeng unis? Can't find qualified applicants? Innovation pipeline empty?
FU. Hire some more smarmy MBAs to blow smoke up your butts.
And enjoy. Wallow in it.