"the threats we'd made about the PBX guy's testicles"
so one may presume these were pre-Register days, therefore no handy and creative suggestions via Tales of the BOFH...
(nuke because we have no cattle-prod-induced electrocution icon)
Hello and welcome to a brand-new Register series taken from the front line of high tech and business, encompassing war stories from our readers and writers about clumsy contractors, slippery suppliers, and the machinations of management. It's tales of fail, "I told you so" moments, and of getting away with it. Kicking off is …
A friend was the tech honcho for a US TV station. They had built a brand-new transmitter building, all new kit, shared by several stations. There was a HUGE diesel generator outside and a fairly large UPS to carry the load while the diesel spooled up. Everything tested fine*.
First power failure comes, and everyone's off the air. The cables from the UPS to the generator had been improperly connected, so the generator never started. Heads rolled (luckily, not my friend's).
*That is, the UPS tested OK, and the generator fired up when the "test" button was pushed.
Where I work the UPS systems are routinely tested by breaking the incoming mains. The guys who do it are quite good now and even check we have enough diesel on site beforehand...
Odd, though, what has been found over the years when doing these tests. For example, the control system for the UPSes and the aircon didn't have its own UPS. When it got one, it was then found that its batteries weren't being charged properly, and on the next test it failed. Seems obvious after the event, but it shows that sometimes the only way to test something is to do it for real.
Exactly. We built a new remote office, increasing staff from 150 to 400+. Come the day of the move: server, network and PC types all running around doing last-minute things while the movers moved everything from the old office to the new except the old hardware. We even tested the backup gens by killing main power at the building input (the power company had to do that part).
Things looked good at lights-out Sunday night. Monday morning, all hell broke loose starting at zero-dark early. It seems someone had misconfigured a local switch to point any machine wanting to open a file on a (local) MS server to the IP of the mainframe at the main office. Took them two days to figure this out and five minutes to fix. Murphy is a mean son-of-a-b*****.
Witnessed a HP JetDirect box have its IP address set to that of the default gateway of the network used by all the users' PCs. Watching everyone lose their connection meant a very frantic hunt for the offending device and a big stick for the button-pressing monkey who typed in the wrong details.
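For anyone facing the same frantic hunt, here's a rough sketch of how you might smoke out a duplicate of the gateway's address from a Linux box: broadcast an ARP who-has for the gateway IP and see how many different MACs answer. It assumes Scapy is installed, you can send raw packets (root), and the gateway address used here (192.168.1.1) is made up.

```python
# Rough sketch: find out which MAC addresses claim to be the default gateway.
# Assumes Scapy is installed and you have permission to send raw packets.
from scapy.all import ARP, Ether, srp

GATEWAY_IP = "192.168.1.1"  # hypothetical gateway address

# Broadcast an ARP "who-has" for the gateway and collect every reply.
answered, _ = srp(
    Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=GATEWAY_IP),
    timeout=2, retry=2, verbose=False,
)

macs = {reply.hwsrc for _, reply in answered}
if not macs:
    print(f"No reply for {GATEWAY_IP}")
elif len(macs) > 1:
    print(f"Duplicate IP: {GATEWAY_IP} is claimed by {', '.join(sorted(macs))}")
else:
    print(f"{GATEWAY_IP} is at {macs.pop()}")
```

More than one MAC in the answer set means something other than the router - a rogue JetDirect, say - is squatting on the gateway's address.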
Circa 20 years ago I worked for a firm that didn't use DHCP on their Token Ring LAN. When people were issued with a PC (most users were connected to a mainframe via dumb terminals) they got a slip of paper with their personal IP address details. I lost count of the number of times the gateway and host IP addresses were transposed, knocking the whole area offline.
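A trivial pre-flight check would have caught most of those transposed slips before anyone plugged in. A minimal sketch follows; the addresses, prefix length, and the list of known router addresses are invented for illustration.

```python
# Minimal sketch of a pre-flight check for hand-typed static IP details.
# Addresses and the gateway list below are made up for illustration.
from ipaddress import ip_address, ip_interface

KNOWN_GATEWAYS = {"10.1.0.1", "10.2.0.1"}  # the routers on each ring

def check_slip(host: str, prefix: int, gateway: str) -> list[str]:
    """Return a list of complaints about one slip of paper's settings."""
    problems = []
    iface = ip_interface(f"{host}/{prefix}")
    gw = ip_address(gateway)
    if str(iface.ip) in KNOWN_GATEWAYS:
        problems.append("host address is a router address - slip transposed?")
    if gw not in iface.network:
        problems.append("gateway is not on the host's subnet")
    if gw == iface.ip:
        problems.append("gateway and host address are identical")
    return problems

# A transposed slip: the user typed the gateway where the host should go.
print(check_slip("10.1.0.1", 24, "10.1.0.57"))
# -> ['host address is a router address - slip transposed?']
```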
Reminds me of the UKTV programme where they bought a pre-designed, pre-furnished modular home in Deutschland and, after much trouble getting the concrete foundation laid, found that the only British part of the assembly process had sent the bloody crane to the wrong bloody town.
We had a lovely powerful standby generator that performed flawlessly during every routine test, powering the entire system with effortless ease. There was, however, one tiny problem: it was fed from a smallish header tank which was replenished by a pump system from the main supply. This filled the header and, rather than mess about with float valves and the like, once the header was full any excess went back to the main store. Well, that was the theory. The snag was a multi-way valve which controlled the various possible fuel flow patterns. Sadly, there was one position which stopped flow going anywhere. So the main pump overheated and died unnoticed, until, that is, we had a real power cut. The generator cut in and took over, only to splutter to a halt as the header tank was finally exhausted. The error in the valve setting was soon noticed and corrected. Getting a new fuel circulator pump took a lot longer...
I know of a similar failure, although its exact location escapes me: a system much like the one described above always behaved perfectly on test, but failed fairly quickly when there was a real mains failure. No-one had realised that the transfer pump between the external storage and the running tank was supplied from the incoming supply, not the "post-changeover" maintained supply. I'm not sure how long it ran for, but with 3 or 4 big engines generating somewhere around 1.5-2 MW I don't think it was very long.
Best example of Murphy's Law in action that I ever saw was a diesel genny.
Test run every week it was. Once a month there was a full load cutover on a Sunday to prove the UPS and that the genny kicked in when required. Every six months it was fully serviced and the diesel tank was drained and refilled with fresh.
So, when there was an actual power cut do you think the sodding thing would start...........?
If it's worth installing ONE diesel genny, it's worth installing TWO diesel gennies (and two fuel tanks, so you don't get stupidity like that described above).
Split tankage is a no-brainer for long-term fuel storage. Whoever signed off on a single tank should be introduced to Mr RedHot Poker.
> standby generator that performed flawlessly during every routine test
We had one likewise, except for the point of failure: the large diesel tank under the car park didn't have anything measuring how much diesel was in it. So when it sprang a leak and 20K litres of diesel drained away into the subsoil, no-one noticed. Then we had an actual power outage: the generator kicked in and quickly ran through the contents of the header tank.
After that, they put some sensors in the tank itself to measure how much diesel was left in it.
Your memory is almost fully functional, CDD! It was actually Network Week and I have proof of this as I have just dug out a copy of the 22nd October 1997 edition. The story I'd submitted for "This Damn War" was published on page 4 of that issue. Page 3 also had the BOFH column long before he'd taken up residence at el Reg.
In 1965 we had a generator to keep a 3-phase drum spinning during power outages. Every time the power went out it blew a fuse on one of the phases, so the drum kept spinning on a single phase. Yep, the phase rotation of the power company and the generator was different. It took me some time to convince the college electricians what phase rotation was and to get them to check it.
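For anyone else who has to explain it: phase rotation is simply the order in which the three phases peak (A-B-C versus A-C-B), and if the mains and the generator disagree, a three-phase motor is effectively being told to run backwards at changeover. A toy sketch of checking the sequence from phase angles, purely illustrative:

```python
# Toy sketch: determine phase rotation (ABC vs ACB) from phase angles.
# Angles are hypothetical examples, in electrical degrees.
from math import isclose

def phase_sequence(angle_a: float, angle_b: float, angle_c: float) -> str:
    """Return 'ABC' (positive rotation) or 'ACB' (negative rotation)."""
    # The A-B relationship is enough to tell the two apart:
    # in a positive (ABC) sequence, phase B lags phase A by 120 degrees.
    lag_ab = (angle_a - angle_b) % 360
    return "ABC" if isclose(lag_ab, 120, abs_tol=1.0) else "ACB"

print(phase_sequence(0, -120, -240))  # mains wired ABC -> 'ABC'
print(phase_sequence(0, -240, -120))  # generator wired the other way -> 'ACB'
```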