@Dan: It's time to give up
I've been programming since I was 5 (ok it was BASIC but we all start somewhere). I taught myself assembly language on the Z80 (a ZX Spectrum) when I was about 10 and a bit of 6502 when I was a teenager. Aside from some laptops, a 2nd hand IBM PS/2 and a SPARCstation, I've never owned a computer I didn't build myself.
I passed 4 A levels, went to uni, studied maths, taught myself C at uni, and picked up Linux, shell scripting & Perl in my spare time while working for £2.50/hr in a computer factory, where among other things I diagnosed broken motherboards using an oscilloscope so they could be sent back to the supplier for a refund. I lucked into a better techie job, and after a few years I left it to go contracting.
That was about 10 years ago. Since then I've worked as a contractor and have years of *commercial* experience with both Oracle & SQL Server as both DBA and developer, as well as Unix & Windows admin, networking (I've written Perl scripts to auto-configure Cisco routers for a big ISP), C development on Unix and about 5 years of Java EE (which I also taught myself), XML, XSL, Web Services and so on, as well as basic web design. I've done a bit of C# .NET, I once built a website using PHP, and I have run and maintained my own DNS, mail and web servers for several years (I picked this up working for ISPs).
Unfortunately, most of these skills have been developed and used in conjunction with one particular popular software suite that runs on said OS, uses said databases and integrates with other systems through C and Java. Which means they count for *shit all* when I apply for a job.
The primary skill shortage in this country is in management, who are generally so clueless as to how IT systems, and particularly people, actually work that they are forced to blindly tick a list of procedural boxes in the hope that it will be enough to cover their arse in the event of a shit-fan collision. Within most organisations, technical staff, however multi-talented, are pigeonholed into neat little boxes, not allowed to perform functions outside their silo and often not allowed to interact except through formal processes. An ability to learn and adapt without recourse to formal training goes largely unacknowledged, taking the initiative is frowned upon, and a willingness or desire to do things outside one's formally recognised sphere of expertise is considered dangerous.
In the past, while working for a FTSE 100 company, I specced a server, took delivery, fitted the extra processors and RAM, racked it, set up the network, and installed and configured the OS, database, application server and the app itself. I doubt there is a single listed company that would *allow* one member of staff to do all that these days.
The days of my old boss 'logging on and restarting the process' to fix a problem, simply because he'd got in before the rest of the team, are long gone. These days he'd be disciplined by senior management for having his own login.
This is how "IT Best Practice" gets implemented in practice when management lack the technical nous to recognise, and properly exploit, the talented.