The First Law Of Computing
Any error blamed on a computer is the direct result of at least two human errors, one of which is blaming it on the computer in the first place.
No, I didn't get it wrong. I had a… er, computer glitch. There, everyone will believe that. The virtual pub quizmaster isn't looking so credulous, though. He has just reminded me who classical Rome's first emperor was. He does not look in the mood to entertain the idea that I really had typed in A-U-G-U-S-T-U-S but my computer …
"Any error blamed on a computer is the direct result of at least two human errors, one of which is blaming it on the computer in the first place."
Except for cosmic rays. They genuinely do generate non-human errors. I know of at least one occasion of a bit flip caused by a cosmic ray.
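A flipped bit really is all it takes to turn one stored character into another. Purely as a toy illustration in C (nothing to do with any particular incident):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint8_t stored  = 'A';            /* the byte you meant to store */
    uint8_t flipped = stored ^ 0x08;  /* the same byte with a single bit flipped */

    printf("before: %c (0x%02X)\n", stored,  (unsigned)stored);
    printf("after:  %c (0x%02X)\n", flipped, (unsigned)flipped);  /* 'A' (0x41) becomes 'I' (0x49) */
    return 0;
}

One bit out of eight, and your carefully typed answer starts looking rather different.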
"The computer didn't do what I told it to do", "that's not the button I pressed", "Why doesn't it work like it's supposed to?"
to which the answers are generally "Yes it did", "Yes you did" and "It does, why can't you?"
Now there are software flaws that make it through testing, or the odd random interaction with a lift motor's electromagnetic radiation, and there are certainly bad User Interface decisions implemented, but usually the problem arises because the wrong buttons were pressed, or they were pressed in the wrong order, because people are careless. It's why the technical aura works so frequently - I come and watch over your shoulder and you just take more care over what you are doing.
Icon because that's when the mistakes are less likely to happen!
I used to be spectacularly bad at User Acceptance Testing because I'd follow the actions in the test script to the letter. What I should have been doing was pressing all the wrong buttons, because that's what users do. I'm now on several pre-release beta programmes and do my best to cock everything up in the most bizarre, unlikely and foolish ways, thereby more closely matching real-world usage.
If you remember Windows pre-95, the icon for the button at the left of the title bar had an oblong on it (I suspect it was an image of a space bar, because I think some incantation involving the space bar could substitute for clicking it). It was said that a good tester could look at that oblong and see a minus sign.
That was the embuggerance of Mac OS X. You used to have this lovely two-step system: 1) Please restart the computer, and 2) Can you please hold down the Option and Command keys in order to rebuild the desktop? Never failed to solve 99% of problems.
Of course now we have "Please hold Command-S, and at this stage I would like to know if you have any experience of UNIX systems, as you will be asked to run fsck."
You must have done QA on Windows NT. On the machines we had at the time, it was possible to boot them, get the "Press CTRL-ALT-DEL to log in" prompt, press CTRL-ALT-DEL and enter your login and password... before the computer was ready to accept said login. And F***ING Poettering has re-invented this for Red Hat with systemd-logind...
I remember odd occasions when some fumbling on the keyboard would invoke a shift lock. I never found out which key combo turned it off. I've always assumed it was something in the keyboard controller, because unplugging the keyboard then plugging it back in would fix it. (A long while ago, as re-plugging a PS/2 keyboard was not recommended, and I don't recall doing it on a USB keyboard.)
You are a born tester.
When I worked for ICL in the late 70s, we had a very senior guy in quality control who got first crack at any new software releases. The first thing he did was slap his hands down on the keyboard a few times and press 'Enter'. That got rid of more than 50% of every release without even having to start working through a script.
"the wrong buttons were pressed"
These days it's difficult to distinguish the button from anything else on the page. Or the button is further down the page, below the several inches of white space that extend to the bottom of the screen. Or you can't work out which bit of text is a link and which isn't, because the User Experience Designer has gone to considerable lengths to ensure they're the same colour.
All because of style over function: style is a matter of fashion, and fashion dictates flat design.
Designing an interface takes knowledge, intelligence and thought. Anything cobbled together with crayons is still an experience, even if the adjective in front of it ought to be "bad", which is why a User Experience gets substituted for a User Interface.
Place the blame where it belongs.
>> style over function
Oh yes. And it amuses me, in a negative way, that brands spend so much cash on hiring SEO snake-oil gurus to persuade Google's algorithms to put them at the top of the SERP, when Google's algorithms are simultaneously pushing them downwards because of UX and loading-speed howlers. For the price of an SEO consultant, they could hire two web developers who'd fix the real problem *and* reduce bounce rate within days.
I feel another column coming on...
At the place I currently work, it was decided to introduce a new system for evaluations (which, in all fairness, seems to be working quite well).
A new in-house set of web forms was duly produced and clearly not given to average users to test.
The web form for recording these conversations (which is really what they are, rather than the 'PDR' crap) had a bright rectangular piece of text on a contrasting background that gave no indication it might actually do anything: it didn't highlight on mouse-over, the cursor didn't change, and there was no URL in the status bar. But when I decided to check things out and clicked on said apparently inert object (the first conversations were effectively the beta test), I was whisked to a completely different page.
I had a bit of a vent about that one.
Rule 1: If a button can do something, make that clear with a visual cue.
I'd go further: "If a button can do something, make it look like a button". There are plenty of faint grey text boxes on forms designed so that they already contain text asking a question. When the user clicks in the box, that text may or may not disappear to let them add their comment. If it doesn't, they have to delete the pre-existing text first.
But you have to know (telepathy?) that this will happen. Otherwise the user is left scratching their head, trying to work out where the answer is meant to go:
 ___________________________________
| What do you think of our company? |
|___________________________________|
Well if your computers are anything like the ones we had when I was an NHS wonk, then they originally came with the minimum of RAM for XP or 7, then weren't upgraded when 10 was rolled out.
(Give them their due, our rollout team had a list of models that they replaced rather than upgraded...)
Where I worked, if we had exhaustively (and exhaustingly) eliminated all possible other reasons for a one-off, irreproducible glitch, we wrote it off as a "neutrino hit on the computer processor". In other words, we had managed to do something that none of the well-thought-out and carefully designed experiments that have been running for many, many years had managed to do in detecting one of the little blighters! :)
Many years ago, I was working on an application that would be delivered to the users on a DVD. (Not too long ago: software delivery by DVD was already exceptional rather than the rule.)
First, DVDs were created and they didn't work.
I couldn't reproduce the issue with a fresh build.
I retrieved the application from the staging area (where we assembled the ISO): no problem.
I downloaded the ISO and extracted the application: still no problem.
I asked for a new DVD to be burned from the ISO: it worked.
To this day, I claim "cosmic rays" caused the error.
Well that sounds like a problem with the DVD itself.
I'm assuming the DVD you were creating wasn't a physically pressed DVD but one written onto writable (or rewritable) media. Writable optical media are notoriously tricky beasts. Media written with one drive may be perfectly readable on the same drive but unreadable on another. Because the media is written by basically burning holes in light-sensitive dye with a laser, they can be susceptible to corruption if exposed to a UV source. Scratches and dirt can render them unreadable, and in some cases so can bright sunlight.
There's a reason they're not often used any more: unlike a commercially pressed CD or DVD, which has physical indentations in a metal layer protected by a fairly robust layer of plastic, they are fragile beasties. They're certainly not suitable for any sort of long-term storage or backups (which is why we still use magnetic tape for this, some 130 years after the technique was invented).
I came across this wonderful quote from Douglas Adams yesterday, it seems amazingly relevant to this article...
“Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning.”
― Douglas Adams
Now that you're preggers with Mrs. Brown's baby, what will you be getting the little squirt for its first birthday? Will you get it a tablet so it can play KSP & become the next hero astronaut? Or perhaps it will play GTA so it can become the next cab driver? Or maybe you'll have to get it a really powerful one so it can play "SimIT Admin" & become the next technology tart! =-D
Why yes I *am* insane, why do you ask? TheVoicesInsideMyHead assure me that I'm perfectly fine.
*Giggletwitchcackle*
Oh look, it's time for my dried frog pills! Here froggy froggy froggy! HERE FROGGY!
*Cookie monster noises*
I've sworn through most of my life (verbally and metaphorically) that one day I will catch the compiler out. That I will be the one in the right.
The great news is that the moment finally came a year or so ago (and continues sporadically to this day) with Visual Studio. It's quite common now, when working in C#, to be able to make the compiler do it again because it got it wrong, and it actually does get it right the next time. Mostly it's because it sometimes can't find the files it's supposed to be generating (.gs), but on several occasions it has just been plain wrong.
The funniest was the time it told me off because object didn't have a default constructor. I almost danced with joy while pointing at the monitor and giggling maniacally. Well, okay, thanks to working from home that is, actually, what I did. After 30 years of putting up with sanctimonious shit from compilers I feel entitled :)
I was working with a bare-metal system using good old GCC with -Wall as the default setting, and, as happens with bare metal, many pointers get loaded with the addresses of the microcontroller's internal registers*.
So I was happily writing said code and at compile time there were loads of warnings about assigning a uint32_t to a pointer.
I went back through, meticulously prefixing them with (uint32_t *) to do the typecast and get rid of the noise and added comments:
/* typecast to silence GCC's verbal diarrhoea */
My manager looked and said 'You can't leave that in there!' (I did).
* Modern microcontrollers have so many damn peripherals internally that the init code is an eye glaze inducing list of pointer assignments into the various peripheral registers.
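For anyone who hasn't had the pleasure, here's a minimal sketch of the kind of thing described above. The register names and addresses are invented for illustration, not taken from any real part's datasheet:

#include <stdint.h>

/* Hypothetical peripheral register addresses -- made up for this example */
#define GPIOA_DIR_ADDR   0x40002000u
#define GPIOA_OUT_ADDR   0x40002004u

int main(void)
{
    /* Without the casts, GCC (with -Wall) warns about making a pointer
       from an integer without a cast, once per assignment. */
    volatile uint32_t *gpioa_dir = (volatile uint32_t *)GPIOA_DIR_ADDR; /* typecast to silence GCC's verbal diarrhoea */
    volatile uint32_t *gpioa_out = (volatile uint32_t *)GPIOA_OUT_ADDR;

    *gpioa_dir |= 1u << 5;   /* configure pin 5 as an output */
    *gpioa_out |= 1u << 5;   /* and drive it high */

    return 0;
}

(I've added volatile, which you'd want on real memory-mapped hardware; the principle of the cast is the same.)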
Ha. Around 1990, the uni where I worked upgraded their computer. When the next semester began, the programs I was writing for CSS xxx (no idea what year or class it was) started to cause the computer to no longer generate executable programs. It turned out that the new system was so much faster than the previous hardware that the compiler could compile relatively small subroutines so quickly that the auto-generated (internal) routine names, which were based on timestamps, would come out identical for two routines. That didn't bother the compiler for some reason, but the linker barfed every time. After opening a bug report and uploading a sample program, the first reply I got was "why would anyone do that", referring to my one-line subroutine, followed, I think, by modifying the compiler to wait long enough for the internal clock value to increment (1/1000 of a second IIRC).
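I can only guess at the internals, but the failure mode is easy to reproduce in miniature: derive names from a clock with millisecond resolution and anything that runs twice within the same millisecond gets the same name. A toy sketch in C (the naming scheme here is invented, not the actual compiler's):

#include <stdio.h>
#include <time.h>

/* Toy reconstruction: build a "routine name" from a clock truncated
   to millisecond resolution. */
static void make_name(char *buf, size_t len)
{
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);
    snprintf(buf, len, "SUB_%ld_%03ld",
             (long)ts.tv_sec, ts.tv_nsec / 1000000L);
}

int main(void)
{
    char first[32], second[32];

    make_name(first, sizeof first);
    make_name(second, sizeof second);   /* on fast hardware: same millisecond */

    /* Identical names: apparently fine for the compiler, fatal for the linker. */
    printf("%s\n%s\n", first, second);
    return 0;
}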
They've all got it in for me!
The immortal lines of Kenneth Williams as Julius Caesar in Carry On Cleo.
The scriptwriter was Talbot Rothwell, but those lines are actually by Frank Muir and Denis Norden, from their scripts for Take It From Here
size of the so-called 'Mini' continues
I have a friend who is an old-style Mini enthusiast... he has been known to spit on the ground if someone mentions the BMW so-called Mini.
With good reason. The original Mini was designed as a small, cheap, easy-to-maintain car. The BMW-reskinned 1 (AKA Mini) is none of those things.
"The virtual pub quizmaster isn't looking so credulous, though. He has just reminded me who classical Rome's first emperor was. He does not look in the mood to entertain the idea that I really had typed in A-U-G-U-S-T-U-S but my computer, on a whim, retyped it as J-U-L-I-U-S."
It could be argued your computer was right - "The Roman emperors were the rulers of the Roman Empire dating from the granting of the title of Augustus to Gaius Julius Caesar Octavianus by the Roman Senate in 27 BC, "
Julius Caesar, on the other hand, was an elected dictator, not an emperor.
(Seriously, no-one has gone full "Brian" with that picture?)
I worked on a passenger booking system for an in-house airline, and the guy in charge liked to travel on it and reset the passenger lists to seat himself next to young blondes. All well and good until a senior manager had his kid bumped for one of the blondes and the guy in charge said "must be a bug in the program". I heard about this and laid into him verbally, pointing out that a glib statement like that might be his "get out of jail free" card but it was my professional reputation he was trashing. Needless to say, when the system was migrated to another platform a few years later, neither I nor the guy in charge was chosen to migrate with it.
OK, try my second theory for size: that it's down to cosmic rays.
In the first phase, the support agent invents this excuse, or stumbles upon it while reading a nameless, off-beat IT publication. It's a miracle solution to that uncomfortable moment when the customer pointedly asks "why did it break?"
In the second phase, as the soul is consumed, the agent becomes irritable when asked this question, and the delivery becomes increasingly passive-aggressive.
Finally, when their humanity has been totally stripped, the response becomes part of a game between agents at adjacent desks, where each delivery is worth five points, and the loser pays the bar tab at the end of the evening.
I always blame sunspots; they seem plausible - at least that's what I assume that facial expression means.
My partner recently added 20 pages to a Word doc when trying to scroll. I'm sure we all have stories like this, but at least she didn't blame the tech. Maybe that's for more advanced users.
The answer to your Wednesday/bird/L dilemma is easy. Your computer is learning the ancient Etruscan skill of the haruspices and is warning you about some impending doom that is about to affect you/France/the world (delete as applicable).
Of course, as with all haruspicy, the interpretation is entirely in your gift. So - what's it to be? A small cup of fine doppio espresso is going to mysteriously fall off the table just as it reaches the perfect temperature and you are about to drink it? A small meteorite is about to strike and turn your house into something resembling Night of the Living Dead?
Or merely that it's Wednesday, a bird has flown past and you pressed that dodgy L key again...