Re: Been there, done that...
Any code "that could not be reached" will be.
Just another variant of Murphy's Law
Nothing sweary involving words (although some expletives were heard in my office), but I once got a weird error from a NAG (Numerical Algorithms Group) library routine for solving ordinary differential equations (ODEs). In this case I was using a routine with the perfectly sensible name of D02BAE (FORTRAN's six-character limit on identifiers at work). I was running several instances on the 32-processor Cray J932, and got an error message:
IMPOSSIBLE ERROR
Apparently, a negative value was found at some point in a loop where negative values should be impossible. The cause was fairly simple: FORTRAN doesn't deal with the notion of scope well, and data frequently needs to be shared through "common blocks". D02BAE used some other elements of the NAG library, and shared data with them in a common block. However, in this case a single named common block was being shared by 16 different instances of D02BAE and its helper routines, so different instances of the routine were overwriting each other's data. They had forgotten to compile the library on our Cray J932 (long since dead) with the --taskcommon switch, which makes private copies of common blocks. There was no workaround for that (AARGH), so I wrote my own ODE solver.
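For anyone who never met common blocks: the failure mode is easy to demonstrate in any language with threads. Here is a minimal Python sketch (all names are mine, and threading.local is only an analogy for what the taskcommon switch provides, not NAG's actual mechanism): a module-level dict plays the shared common block, and threading.local plays the task-private copy.

```python
import threading

# A module-level dict plays the role of a FORTRAN named common block:
# every solver instance reads and writes the same storage.
shared_block = {"h": None}

# threading.local() plays the role of a task-private common block
# (the effect of the taskcommon switch): each thread gets its own copy.
private_block = threading.local()

def solver_instance(step_size, index, results, barrier):
    shared_block["h"] = step_size   # clobbers every other instance's value
    private_block.h = step_size     # visible to this thread only
    barrier.wait()                  # all writes finish before any read
    results[index] = (shared_block["h"], private_block.h)

def run(n=4):
    barrier = threading.Barrier(n)
    results = [None] * n
    steps = [0.1 * (i + 1) for i in range(n)]
    threads = [threading.Thread(target=solver_instance,
                                args=(steps[i], i, results, barrier))
               for i in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return steps, results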
I did a lot of (Modular) Pascal programming in the past, including writing code for an image processing system for microscopy. I might want to check out how easy it would be to port this stuff (I still have all the code backed up) to the newer version of Pascal. Could be fun.
I put some choice words in code comments when I had (yet again) to create a workaround for some bug in MS Pascal. One example was that in a linked list with an even number of nodes, the statement
current := current^.next^.next;
(i.e. jump forward two nodes), and the code snippet
current := current^.next;
current := current^.next;
produced different results. The former caused random crashes, the latter worked flawlessly. I added some quite colourful comments at this juncture.
In a similar vein, the natural logarithm function was extremely inaccurate. I had to call the version in the MS C library to get correct results. Again, various swear words emanated from my office, and quite a few were included in the comments in the code. They didn't end up visible to users, but were available to any programmer bothering to read the documentation thoroughly.
I have had one weird issue with the power supply of a Leica fluorescence microscope in the 1990s. I was developing and supporting an image processing package that supported three different frame-grabber/image-processing boards: either of the Matrox PIP1024 (A/B) variants, which were fairly basic, or the more powerful Matrox MVP/AT-NP board, equipped with a neighbourhood processor (NP) that could perform GPU-like stuff way back in 1992 (video-rate 3x3 convolutions and the like). Both PIP1024 variants ran happily on the same software, but the MVP/AT-NP needed a different library linked to the executable. Three systems lived happily in the Department of Medical Microbiology, and another, with an MVP/AT-NP board, was installed at the Department of Dermatology. This caused no end of trouble. Code that ran happily at the microbiology department caused crashes on the same hardware at dermatology. I got seriously suspicious when the people at dermatology mentioned that whenever the UV lamp's power supply was switched on, the computer crashed. They developed a protocol of first switching on the power supply of the microscope, and then booting up the computer. Clearly, the power supply was causing spikes on the mains voltage when switched on. I then surmised that when my code ran on this fast processor, RFI from the power supply was interfering. Indeed, when the power supply was switched off, all my code ran sweetly. On a hunch, I linked the library for the MVP/AT (but not NP) board to the code for dermatology, and all was well. Bit of a bummer we could only use the expensive NP unit when the microscope was not being used.
No problems with coffee (or tea) fortunately.
I remember working at an Italian observatory in Switzerland on my BSc thesis project, and the dry air at 3200m altitude caused a lot of static electricity. I was given strict orders to earth myself by grabbing a metal handle on the desk, each time I wanted to type something. Now I had to issue commands to the scope control software every few minutes or so, due to the tracking system being a bit wonky on the 1.5 m aperture infrared scope. I had to keep a guide star centred in a little box I would have to draw on the CRT screen with a whiteboard marker pen before each observation, by issuing short commands to steer the scope up, down, left or right. Even sitting still for a minute or two, you would build up enough static electricity to get a nasty shock. I never blew up a keyboard (which previous astronomers had done) or worse, but it was far from pleasant.
Donkey's years back as a student, I was visiting a friend who complained that her cassette tapes seemed to degrade. One glance at her stereo set-up showed she was storing her collection on top of one of the speakers. Not the best place. When I suggested that the proximity to the speaker could be the cause of the trouble, she moved the tapes, and put them on top of her (classic, CRT-based) TV, which I suggested was also sub-optimal. She was certainly smart enough to understand my explanation involving magnetic fields, and found a safe spot for her cassette tape collection on a bookshelf.
Reminds me of a tour I once gave a group of international students in our HPC centre. One of the main machines was a Cray J932, which was quite an impressive box, with a huge rectangular green power LED and below it, well recessed, a reboot or power button (I forget which). One of the students asked what would happen if he pressed that button, whereupon I stated that a little metal claw would come out and snip the offending finger off. I added that if it didn't, I would get a pair of pliers and do it myself.
I related a tale of the mouse tail pointing the wrong way previously. I have also seen a user with a non-functioning mouse, caused by inserting the tail into the computer the wrong way round. This was an old RS-232 connected mouse, and the damage done to the pins on the mouse by inverted insertion into the PC was terminal. I was not amused. I have seen VGA plugs inserted the wrong way round before (that seemed to require less force), but getting an RS-232 plug in the wrong way round was new to me. The user wondered whether I could solder a new RS-232 plug onto the mouse. I stated I could, but that I would charge more than the price of a mouse for it. That ended the discussion effectively.
I have been following Majorana fermion claims for quite a while, and I have yet to see convincing evidence. As I understand it, the Majorana fermion is more properly considered a quantum state than a fundamental particle (though I may be wrong; it is years since I took my courses in quantum mechanics), and this adds to the confusion. As the Majorana fermion is considered to be neutral and its own anti-"particle", I am given to wonder whether it doesn't self-annihilate easily. Whether or not it exists at all, it is certainly elusive, and the jury is still very much out on this one. The evidence will have to be pretty solid before such a huge claim can be believed.
I will follow this saga as it unfolds.
Absolutely. I made sure no one else was within earshot (or even in the room). Just quietly pointing out the items were being used the wrong way round was sufficient to cause the blush. Note that she did first accuse me of borking the image processing system with a software update, which in her opinion had caused the errant behaviour of said rodent and camera. My protestations that the software update could not possibly be the cause were in vain, so I trundled over to the lab to see what was up.
She could laugh about the episode later.
The only time I ever made a user of the computer systems I maintained blush (a fetching shade of crimson) was by pointing out that the mouse cursor would move the right way if the (wired) mouse had its tail pointing away from you when you moved it around, and ten minutes later pointing out that the image captured from the microscope would be the right way up if you rotated the camera 180 degrees. I think she didn't dare report any issues for at least a month after that.
I remember seeing the launch of both Voyagers on TV as a teenager at school. I followed every planetary flyby in the National Geographic Magazine. Amazing that for the better part of a human lifetime these probes have been speeding through space, collecting loads of data.
I will (again) raise a glass to the team behind this amazing achievement.
When programming in INTERCAL, the frequent use of "PLEASE" is actually required: between one fifth and one third of statements in INTERCAL need to be polite. I have never used INTERCAL (though I do point my first-year students to it for a bit of fun), but must assume the compiler or interpreter will resort to sulking in basements if you aren't sufficiently polite.
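For the curious, the politesse rule is easy to sketch. Here is a hypothetical Python version (the function name and the statement fragment are mine, and the bounds are as I remember them from the INTERCAL manual, so treat this as illustration rather than gospel):

```python
# Hypothetical check of INTERCAL's politesse rule: at least one fifth
# and at most one third of statements should begin with PLEASE.
def politeness_ok(statements):
    polite = sum(1 for s in statements if s.lstrip().startswith("PLEASE"))
    ratio = polite / len(statements)
    return 1 / 5 <= ratio <= 1 / 3

# A made-up INTERCAL-flavoured fragment: 2 polite statements out of 7,
# a ratio of about 0.29, which falls inside the acceptable band.
program = [
    "DO ,1 <- #13",
    "PLEASE DO ,1 SUB #1 <- #238",
    "DO ,1 SUB #2 <- #108",
    "DO ,1 SUB #3 <- #112",
    "DO ,1 SUB #4 <- #0",
    "PLEASE READ OUT ,1",
    "DO GIVE UP",
]
```

Too few PLEASEs and the real compiler deems you insufficiently polite; too many and you are grovelling, which it likes no better.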
I once worked on a visualization package for CT and MRI scans, and we tested our iso-surface rendering tool on a public, anonymized data set of a CT scan of a girl's head. At the appropriate threshold level you could readily visualize the bone structure and observe a huge hole in the bone (we initially thought it was a bug in our code), where some bone-destroying bacterial infection had done its gruesome work. All well and good, but at another threshold setting you could visualize skin and hair, showing her face clearly. Now I did not know who this person was, but we decided not to use this data set for visualization labs, on the off chance that some students might see a familiar face.
We have seen this process here in our BSc Computer Science programme, where quite a few students do well in their homework exercises, but fail dismally at the mid-term and final exams, where they cannot get access to AI tools. We warn them of this, and some do listen, but too many don't. I feel that most, if not all, homework should really not contribute to a final grade, and just be offered as a formative test, allowing students to learn how well they are doing (and lecturers to learn where there may be problems). If they use AI tools, the only people they are really cheating are themselves.
This is not to say AI tools are only a bad thing. If you want to focus on the knotty parts of a problem and can use an AI tool to solve some simpler components, little harm is done, although you still have to check and understand the AI solution. That does involve reading and understanding code written by someone else. That can be a real pain, even when the someone else is your few years (or even months) younger self.
When I was doing my PhD research, I had a habit of doing weekly back-ups of my development (MS-DOS) machine in duplicate on a pile of 3.5" floppy disks (talk about a tedious chore). I would then restore one of these on a bigger "production" image processing machine, thus testing and verifying the back-up. After that, the whole shebang was copied to tape. I am not sure if the tape jockey verified anything, but for good measure I took the other back-up home, and restored that on my home machine. Paranoid? Perhaps, but I didn't lose any data during that time.
Switching off is an important skill. Some colleagues wonder how I find time to do my hobbies (such as salsa dancing) and my answer is that any time outside working hours is mine to use as I see fit.
Before holidays, I set myself a deadline beyond which I am on vacation, and a second deadline after which I am back at work. I order my remaining tasks by their respective priorities, work on them until the first deadline, and anything left on the list is postponed until after the second. As I have found out, somehow nobody ever complains.
Time on earth is your most precious commodity; don't let others waste it for you.
Reminds me of the tiny little PostScript file that could keep a LaserJet printer occupied for hours, and then output one beautiful A4, 300 DPI, colour Mandelbrot image. This made use of the fact that PostScript is Turing complete, so of course you can write a Mandelbrot program in it. And sure enough, someone did. To the delight of some, and the chagrin of others, I might add. In part this was motivated by the fact that the printer had more memory and a faster processor (RISC, as I recall) than even a PowerPC back in the day. In part it was motivated by the "let's try it" attitude that has led to many great things, many more wasted hours, and occasionally assorted disasters.
There is also a PostScript ray tracer, which has a similar effect.
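I haven't got the original file to hand, but the computation it hands the printer is the classic escape-time iteration. A minimal sketch in Python (function names, resolution and iteration count are mine, not those of the original PostScript file):

```python
# Escape-time Mandelbrot: iterate z -> z*z + c and count how many
# steps it takes |z| to exceed 2; points that never escape within
# max_iter iterations are treated as members of the set.
def escape_time(c, max_iter=50):
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

# Crude ASCII rendering of the region [-2, 1] x [-1.2, 1.2]:
# '#' marks points that survive max_iter iterations.
def ascii_mandelbrot(width=60, height=24, max_iter=50):
    rows = []
    for j in range(height):
        im = 1.2 - 2.4 * j / (height - 1)
        row = ""
        for i in range(width):
            re = -2.0 + 3.0 * i / (width - 1)
            inside = escape_time(complex(re, im), max_iter) == max_iter
            row += "#" if inside else "."
        rows.append(row)
    return "\n".join(rows)
```

The PostScript version does exactly this per pixel on the printer's own CPU, which is why the page takes hours to emerge.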
Reminds me of some ancient BOFH shenanigans where several people with names like Charles Omputer and Roger Amchip somehow drew pay for various obscure jobs.
Great mission which will keep some colleagues of mine busy for quite a while, as the next data releases appear. They are studying all sorts of streams of stars that are remnants of earlier mergers of smaller galaxies with our own.
I might well drag out the telescope (trusty old Celestron C8) to bid GAIA farewell.
I have set my language defaults to English for years, even before AI made matters worse, because I found the Dutch versions of menu names quite frankly a bit puzzling. Not because my Dutch isn't up to snuff (I grew up bilingual), but simply because the chosen words and phrases seemed awkward. The downside of setting the default to English is that various programs now offer endless translations of Dutch websites (and guess what: they are rather poor), whereas I would much rather read them in Dutch. Even though I have repeatedly told Google I don't want it to translate Dutch pages, it still pesters me.
And don't get me started on Acrobat Reader and others offering me AI tools to help me (with what, exactly? I just want to display and read this file). I really wish there were a general AI toggle switch (the "No thank you, I like to think for myself" option) that banished all copilots and other plastic pals who are fun to be with to the nethermost circle of hell.
End of rant
(for now)
Not sure it was a syntax problem, but our mail system developed a tendency to block everything from the IEEE Computer Society (of which I am a member, so it shouldn't count as spam), and, worse still, anything from NWO, which is the main Dutch funding body. Not only was I on one of their assessment committees, I was also preparing a very large funding proposal (to the tune of 1.5 million euro), which I did not want to go AWOL due to some email cock-up. It got so bad that NWO would call me to see if the emails they had sent me had arrived. Fortunately, I could put two rules in place, one preventing the spam filter from touching the IEEE emails, and one for NWO. Of course I gave these rules the highest priority (5), higher than the regular filter (3), and all was well.
For a time.
I then noticed things went missing again. I inspected the filter settings, and somehow the priorities had been reset to the default (3). I set the priorities back to 5. This happened twice. I then sent an angry email to the system administrators, saying that if it happened again and I missed a deadline for responding to reviews, causing me to miss out on the funding, I would send them a bill to the tune of 1.5 million euro.
That seemed to have sorted the issue.
I have on occasion found bugs in students' code within a few minutes, after they had struggled with them for days. I often just scrolled up, because I knew the error wasn't where the compiler or debugger flagged the problem, but a bit earlier. Students were often very impressed, but I always reminded them that the reason I knew where to look was simply that I had made that kind of mistake far more often than they had.