Software vs Hardware
Despite only being 28 years old, I remember two things worth factoring in:
1) Usborne books when I was a kid assuring me that "computers don't make human mistakes, that's why they're the ultimate calculating machines."
2) The early incarnations of the Intel Pentium processor had a bug in floating point division which made calculations (even in whatever version of Excel was around at the time) cock up under certain very rare circumstances. I mean seriously, the computer mags of the time (there was precious little internet then) would give you examples of how you could trip it up, but the accepted wisdom was that unless you were some sort of hardcore science boffin you'd never notice it.
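For the curious: the trip-up the mags printed was usually the now-famous FDIV test division. A minimal sketch of that check in Python (the test values are the widely circulated 1994 ones; the "flawed result" figure is from contemporary reports, not something you can reproduce on modern hardware):

```python
# The classic Pentium FDIV bug check (1994). On an affected chip,
# 4195835 / 3145727 came out wrong around the 5th significant digit:
# a correct FPU gives roughly 1.3338204..., while flawed Pentiums
# were reported to return roughly 1.3337390...
x = 4195835.0
y = 3145727.0

ratio = x / y
print(ratio)  # roughly 1.3338204 on any correct FPU

# The popular one-liner test: this remainder is (near) zero on a
# correct chip; flawed Pentiums famously gave 256.
remainder = x - (x / y) * y
print(remainder)
```

Run today, of course, both checks come out clean, which is rather the point: a hardware fault like this was baked into the silicon, whereas a software bug can be patched.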
So as a relatively young IT professional and also as a genuinely experienced and platform-agnostic fella (I'm talking Vista and Linux back to Pick) why am I the only bugger who thinks:
"Sod it, in a new and complex and recently released product this is essentially a teething problem and is pretty much to be expected. Even in a formally released product. So long as they fix it quick-ish in a service pack then all is good"
If OpenOffice did this, people would just write it off as a bug. Just because it's MickeySoft, much as I'm wary of them, "sod it", it's nothing more than an unfortunate oversight. Even the finest of developers are prone to a scenario they didn't test surfacing as a bug in their software. Big deal: at least you can patch it, rather than it being an inherent hardware fault like the floating point stuff in the earlier Pentiums.

I work for an investment bank where a lot of stuff among the traders is still done through Excel, and while there are plenty of unfixed bugs they work around, on the whole they're grateful for such a powerful package that can calculate stuff on the fly like Excel does. Much as I dislike Microsoft's OS, the Office suite is almost beyond reproach - it's excellent.