Re: A question
I haven't yet met a Linux distro that doesn't include a command to automatically update EVERYTHING to the latest stable version. Even Slackware has slackpkg now (and that's the oldest surviving Linux distro, generally regarded as being only slightly behind Debian in terms of how up-to-date its software is). And when I say everything, I mean EVERYTHING, from Firefox to plugins to libraries to kernels to drivers. That's the beauty of aptitude and similar systems: it really is that easy, and if you want, they'll do it on a schedule for you. And it won't trash your OS or leave you unable to revert easily (Windows System Restore, you say? Good luck running that from an unbootable machine, as I've sometimes struggled to do, and even in the command-line environment of a rescue boot you still aren't guaranteed to get back where you were).
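On a Debian-derived system, for instance, the whole-machine update I'm describing is a couple of commands (a sketch for apt/aptitude specifically; slackpkg, dnf and the rest have their own equivalents):

```shell
# Refresh the package lists, then upgrade everything installed --
# applications, libraries, kernel and drivers alike -- pulling in
# any newly required dependencies along the way.
sudo apt update
sudo apt full-upgrade

# The aptitude front-end does the same job in one go:
sudo aptitude update && sudo aptitude full-upgrade
```

Both need root and a network connection, obviously, and both tell you exactly what they intend to change before touching anything.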
And I've not YET had a single stable Linux update break something I use, even with some horrendously complex configurations and interdependencies (I'm sure it's happened to someone, somewhere, but I've never seen it). Meanwhile, I've had Windows Update disabled on many machines because, left to apply everything it wanted, it would just blue-screen some percentage of the computers at random and force a rebuild.
And, as you point out, Windows doesn't update Firefox and all the other programs, and NOR does it even PROVIDE that functionality for anything else to use. If the OS has no package-management paradigm in it, then of course each app ends up bundling its own updater. But on Ubuntu, say, or Slackware, or Fedora, do you think Flash installs its own cron job to check for updates and nag you like mad the moment it's 0.0.1 versions out of date? No. Because the distro provides that functionality in a proper, centrally-configured way, and such junk wouldn't be allowed in.
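That "centrally-configured way" is literally a couple of lines in one file, not per-app nagware. On Debian/Ubuntu, for example, the standard setup looks like this (the file path and the unattended-upgrades package are Debian-specific; other distros have their own equivalents):

```
// /etc/apt/apt.conf.d/20auto-upgrades
// Check for new package lists daily, and let the
// unattended-upgrades package apply stable updates automatically.
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```

One setting, every package on the system covered, and nothing pesters you.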
Linux updating, and aptitude especially, is one of the things that Linux gets SO right that it's really hard to argue against it. Hell, I logged onto a four-year-old netbook today to install a program I'd written for demonstrating at an open day. The program needed SDL and about ten other libraries in order to run, and the netbook was running Karmic Koala (which is technically obsolete now). A couple of clicks in the package manager and it ran off, worked out the 100MB of dependencies and libraries it needed, downloaded them (with appropriate permission), installed everything in the right places, and it all "just worked", in about five minutes, on a machine that was basically a bare install.
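Under the hood, what that netbook did is a single command; the package names below are illustrative (the actual list depends on what the program links against):

```shell
# Install the libraries the program needs; apt resolves the full
# dependency tree and fetches everything in one transaction.
sudo apt-get install libsdl1.2debian libsdl-image1.2 libsdl-mixer1.2

# Or dry-run it first to see what would actually be pulled in:
apt-get -s install libsdl1.2debian
```

The `-s` (simulate) flag is the nice touch: you get the complete list of what would be downloaded and installed before committing to anything.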
Yet, on Windows, I still have games that take 20+ minutes to install .NET Framework, DirectX and similar libraries that ALREADY EXIST ON THAT MACHINE, in identical versions; it just takes that long for them to check and find out. It usually involves downloading a pseudo-installer that downloads the real installer, which runs an MSI, which manually checks dependencies by trawling through the filesystem, then downloads the missing parts, and THEN the whole dance starts over for the next bit of software. And, in the end, you still aren't guaranteed that hotfix X, the one needed to make it work properly, got installed (I just had a large, expensive piece of Windows MIS software that needed a particular Windows hotfix, a particular version of .NET Framework 1, and a particular version of .NET Framework 2, etc., and at no point gave any hint that those were what was missing or where to get them from!).