We have redundancy, we have batteries, what could possibly go wrong?

A Register reader finds the inevitable single point of failure after a call-out to the heart of darkness in this week's On Call. Our story, from a reader we will call "Philip", takes us back to the 1990s. It was a heady time of hulking on-premises servers, demanding customers, and an in-house call center. "What could possibly …

  1. Giles C Silver badge

    Almost

    A few years (and companies) ago we had to run the site on the generator because someone had stolen the busbars from the nearby substation. How they did that without getting fried I don’t know.

    While we were waiting for the power to come back on, it was realised that we were a bit short of fuel for the generator. I think I remember hearing it got down to 20 minutes of life. This next bit is vague as it was years ago, but I believe they drove the minibus to the nearby petrol station (which still had power), filled it up, came back, and transferred the fuel to the generator tank while waiting for the fuel truck to arrive.

    1. GlenP Silver badge

      Re: Almost

      It could have happened to us.

      Previous office location had had dodgy power in the early days so a fixed diesel generator was installed. It was never needed in anger from when I joined the company (the building works that caused the problems were long completed) but someone duly tested it every week or so, until it failed through lack of fuel.

      Nothing more was done until we were holding an important seminar when 40l of diesel were duly purchased.

      As far as I'm aware testing continued until we moved out, without any further fuel supplies, so I've no idea what the run time would have been if we'd ever actually needed it.

      1. Caver_Dave

        Re: Almost

        I was recounting a fuel empty generator story to my daughter last night. In our case it lasted about 2 minutes before it spluttered to a halt!

      2. sebacoustic

        Re: Almost

        Slowly draining the fuel tank aside, the other thing weekly testing does to a diesel is wear out the bearings etc. pretty badly, because you're running the machine from cold every time, when its lubrication system is designed to run nice and hot.

    2. Sleeper

      Re: Almost

      I was an anaesthetist at a hospital on the outskirts of London.

      A failure of the power supply was obviously a concern. Once a digger went through the mains cable and the backup generator kicked in. So far, so good for the emergency plans.

      Then the generator got updated and testing showed that all worked, but getting back on to mains supply didn't work well. As I understood it from the Works department it was to do with the alignment of the current phases. Many months later the problem was resolved by software and other tweaks.

      So a safe hospital one might think...?

      Then one night the local scrap metal dealers took away the thick electrical cable from the emergency generator to the hospital.

      1. gfx

        Re: Almost

        Yes, you need a synchronizer between emergency power and the mains if you want a seamless changeover. Otherwise switch off the generator first (it will go dark) and then switch on the mains.
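        The "alignment of the current phases" issue mentioned above is what a synchro-check guards against. A toy sketch of that gating logic - the thresholds here are made-up illustrative values, not anything a real transfer switch uses:

```python
def sync_ok(f_gen, f_mains, phase_gen_deg, phase_mains_deg,
            v_gen, v_mains,
            max_df=0.2, max_dphase=10.0, max_dv_pct=5.0):
    """Rudimentary synchro-check: permit a closed-transition transfer
    only when the generator's frequency, phase angle and voltage are
    all close enough to the incoming mains supply."""
    # Phase difference, wrapped into the range 0..180 degrees
    dphase = abs((phase_gen_deg - phase_mains_deg + 180) % 360 - 180)
    dv_pct = abs(v_gen - v_mains) / v_mains * 100
    return (abs(f_gen - f_mains) <= max_df
            and dphase <= max_dphase
            and dv_pct <= max_dv_pct)

# In step: 50.05 Hz vs 50.00 Hz, 3 degrees apart, voltages matched
print(sync_ok(50.05, 50.00, 3.0, 0.0, 230.0, 230.0))   # True
# 90 degrees out of phase: paralleling here would be violent
print(sync_ok(50.00, 50.00, 90.0, 0.0, 230.0, 230.0))  # False
```

        Real synchronizers don't just poll like this - they deliberately run the generator at a slight slip frequency so the phase difference drifts through zero, then close the breaker at the right moment.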

  2. Red Ted
    Facepalm

    Cinema Generator

    I know of a cinema in Reading about 20 years ago that had exactly the same problem.

    Eighteen months after the site had opened they get their first big power cut. The cinema goes dark, then the generators start up, the lights come back on and the feature film continues. Ten minutes later the lights go out again and stay out. Site services had been running the generator for a few minutes every week for those eighteen months, but never replaced the diesel they had been using up!

    1. Anonymous Coward
      Anonymous Coward

      Chip Fab Generator

      I know the Raytheon chip fab in Andover, Massachusetts had this happen in the early 2000s. The area had a blackout, and that's when they found out the generator was out of diesel: no one had been filling the tank, despite the generator being tested regularly.

  3. ColinPa Silver badge

    Aircon and networks

    I remember a site where a digger managed to fracture the main aircon pipe between the DP room and the aircon plant. Having dual power supplies etc. didn't matter once the kit got too hot to use.

    I also heard of a company in France that had two external networks. One went via the north of France, and one via the south. This worked fine until the telecoms company put a major backbone across the middle. So the northern network went north, then south to the backbone, across France, then north, then south to their office (and the mirror image for the southern network). It all worked well until there was an outage on the backbone, and both networks were impacted.

    1. Peter Gathercole Silver badge

      Re: Aircon and networks

      I've related this story before, but in the '80s, I was working at a campus site some distance from the nearest large town. The power was always a bit flaky there, so they had a decent UPS with diesel generator that would even keep the Amdahl mainframe running.

      So one afternoon, there was a violent thunderstorm, which lasted several hours. There were repeated brown-outs on the power, and each time, the UPS cut in and initiated the diesel generator start, but the power returned before the generator was stable enough to provide power.

      This does not seem like a problem, but one of the things that happened when the UPS cut in was that the air conditioning in the machine room stopped (it was not on the UPS by design), and then took several minutes to restart when the power returned.

      So each time there was a brown-out, two things happened. The available power in the UPS started dropping, and the machine room temperature started rising.

      Eventually, it was decided to shut the machine room down, because it was getting too hot, and would completely exhaust the UPS. And everybody except the core operations staff were told to go home, as they could not really do any work without the systems.

      All this could have been avoided if there had been a manual override on the generator, to start it and leave it running, but nobody had thought that would be required when the system was designed. It was added soon afterwards, though.

      1. Anonymous Coward
        Anonymous Coward

        Re: Aircon and networks

        "All this could have been avoided if there had been a manual override on the generator, to start it and leave it running"

        Or if someone was smart enough to shut off the main breaker feeding the utility side of the transfer switch. That would emulate a power failure. Leave off until the server room was cool and the UPSes were topped off, or until the storm passes.

        1. msobkow Silver badge

          Re: Aircon and networks

          Ding-ding-ding-ding! We have a winner for "simple but should have been obvious" solution of the week!

          1. Charles 9 Silver badge

            Re: Aircon and networks

            Bzzzzt! They'll forget the breaker was hit and scratch their heads wondering why they're not back on mains when the power comes back on steady.

            Murphy strikes again...

        2. Peter Gathercole Silver badge

          Re: Aircon and networks

          I can't remember - it was 30+ years ago and I was not on the operations side of the support staff - but I'll bet that the main breaker was a "must only be operated by qualified staff" item, and that they didn't have resident electricians on hand to do that.

          Mainframes back then drew quite significant amounts of power, and even then, some facilities were outsourced. Big UPSs were quite novel, and probably weren't as sophisticated as they are now.

  4. Pete 2 Silver badge

    One step too few

    > Backups were performed and rotated and once a week

    Necessary, but not sufficient.

    Did they ever check that the backups could restore their systems?

    In practice this is a much more difficult task than many would imagine, as you need a complete, identical set of servers (plus other infrastructure) to restore to.

    1. Anonymous Coward
      Anonymous Coward

      Wait. Backups need to be restorable?

      I remember one customer who was decommissioning their mainframe about twenty years ago, and they very carefully took backups of everything to tape.

      Tapes for which the only available drives in a 3,000 mile radius were the very ones being packed off to the landfill the week afterwards!

      They were doing this, I may add, after their support contract & software licenses had expired, so strictly speaking, whereas they owned the hardware they shouldn't have been using the operating system. Their fix for this little legalistic problem was to bar all employees of the mainframe provider from the site until they were finished.

      1. Doctor Syntax Silver badge

        Re: Wait. Backups need to be restorable?

        Presumably there was a separate migration to a new system so the backup would be belt-and-braces for an extreme situation. If there were suitable drives almost anywhere on the planet this could be acceptable.

    2. GlenP Silver badge

      Re: One step too few

      Way back in the mists of time working on DEC Vax we had a total loss fire. Fortunately we had backups so a temporary replacement system was procured and located in a site hut near enough to the comms room, which had survived, for the hardware people to run the necessary cabling.

      Unfortunately the replacement was specced (not by me) with compatible drives, not genuine RA81s,* which turned out, when formatted, to be ever so slightly smaller, meaning most of the restores failed. The supplier ended up putting in larger compatible drives to resolve the issue after the 21-year-old, wet-behind-the-ears IT guy (me) refused to sign off their six-figure invoice.

      *For the youngsters, 456MB, 19" units, 3 to a cabinet IIRC.

      1. 42656e4d203239 Bronze badge
        Pint

        Re: One step too few

        >>*For the youngsters, 456MB, 19" units, 3 to a cabinet IIRC.

        IIRC you could get 4 in an 11/750-sized cab - ours was an RA81 higher than the MicroVAX cabs (which also had RA81s). (The MicroVAX II was shorter again.)

        I recall having to extract the frisbees from the Alu drive shells when arriving on site to hear the dulcet tones of drive heads grinding their way through platters... on more than one occasion; we had a generator but no power smoothing - so that went about as well as you might expect!

        Those were the days - real servers in real server rooms... with real terminals, LN06 printers, HP pen plotters (no, I definitely didn't write a plotter-optimised Mandelbrot set generation/plotting program in FORTRAN. Nope. Definitely not.) and the odd DECwriter thrown in for the 11/750.

        /me gets his pint of nostalgia out....

        1. SImon Hobson Silver badge

          Re: One step too few

          Ah, a mate at uni had a PDP-8 in his bedroom!

          A lecturer in the physics labs had refused to let it go because it was so ideal for teaching the fundamentals of "how stuff works" - toggle in stuff on the switches, step it, see the results on the lights. While he was away on vacation, they wheeled it out onto a landing and put a notice on it: "free to good home"!

          So my mate had chad all over the floor, and whenever I went round, we'd be rewinding punched tapes by hand while we talked - I guess it was the male techie equivalent of those women who can knit while they chat. But this was no ordinary PDP-8 - it had hard disks, two of them, 32k each (don't know if that was bits or words; between them they took a few feet of rack space), a high-speed punch, and a high-speed reader - in addition to the floor-standing keyboard/printer/punch/reader terminal. Three full-height 19" racks - good job he had large rooms.

          1. Gene Cash Silver badge

            Re: One step too few

            One of my neighbors at uni had a PDP 11/34A. Never had to pay a heating bill!

            1. kmceject

              Re: One step too few

              My first job had a PDP-8E as I recall. It wasn't in use but was in three racks in the back of the room. My boss told me he started the business in his house in Chicago with that machine in his kitchen and never paid a heating bill while it was running. He said when it broke down the DEC repair man would wash the boards in the sink with dishwashing liquid to get the grease off! Several years later when the place was absorbed by another company I checked out the machine which was being sold for gold scrap (hundreds of big, gold-plated pins!) I found that it was still powered up and there was definitely grease inside it. I kept one card and a fixed head disk platter that I used for years as a mirror.

        2. Ian Johnston Silver badge

          Re: One step too few

          HP pen plotters (no I definitely didn't write a plotter optimised Mandelbrot set generation/plotting program in FORTRAN

          One of my projects as a postgrad who used the department's HP plotter a lot was a utility which took an HPGL file and modified it to do all the plotting for each pen in order - i.e. all the black stuff, then all the red stuff, then all the green stuff and so on. It hugely sped up the plotting process, but was a little awkward because the files often had a mixture of absolute and relative positioning commands, so the first job was to make everything absolute.
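          That pen-sorting utility is easy to sketch. This toy version handles only a five-command HPGL subset (SP, PA, PR, PU, PD) on made-up input - real HPGL has far more commands and quirks - but it shows the two steps described above: make relative moves absolute, then replay the stream grouped by pen:

```python
import re

def sort_hpgl_by_pen(hpgl):
    """Re-order a (very simplified) HPGL stream so all drawing for each
    pen happens in one pass: first convert relative moves (PR) to
    absolute (PA), then group commands by the pen selected with SP<n>."""
    x = y = pen = 0
    by_pen = {}
    for cmd in filter(None, (c.strip() for c in hpgl.split(';'))):
        m = re.match(r'(SP|PA|PR|PU|PD)\s*([-\d,\s]*)$', cmd)
        if not m:
            continue  # ignore anything outside our toy subset
        op = m.group(1)
        args = [int(a) for a in re.findall(r'-?\d+', m.group(2))]
        if op == 'SP':
            pen = args[0] if args else 0
            continue
        if op == 'PR':                    # relative -> absolute
            abs_args = []
            for i in range(0, len(args), 2):
                x += args[i]; y += args[i + 1]
                abs_args += [x, y]
            op, args = 'PA', abs_args
        elif args:                        # PA/PU/PD with coordinates
            x, y = args[-2], args[-1]
        by_pen.setdefault(pen, []).append(op + ','.join(map(str, args)))
    out = []
    for p in sorted(by_pen):              # one pass per pen, in order
        out.append('SP%d' % p)
        out.extend(by_pen[p])
    return ';'.join(out) + ';'

print(sort_hpgl_by_pen("SP1;PA0,0;PD;PR10,0;PU;SP2;PA5,5;PD;SP1;PR2,2;"))
# SP1;PA0,0;PD;PA10,0;PU;PA7,7;SP2;PA5,5;PD;
```

          The absolute-first conversion matters for exactly the reason given: once a PR is turned into a PA, the command no longer depends on where the previous pen's moves left the carriage.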

          1. Missing Semicolon Silver badge
            Happy

            Re: One step too few

            HP pen plotters. I got fed up at Uni of graph paper that had the wrong number of divisions (or decades for log axes) so wrote a Fortran program that would plot custom-axes graph paper! All my graphs filled the sheet.

            1. John Brown (no body) Silver badge

              Re: One step too few

              Similar here, but in my case, a custom-designed year planner! None of the ones the company supplied, or the freebies from suppliers, had what I wanted, or space for the info I needed to put on it. And no, doing all that on a computer back in those days wasn't a convenient option. A "hard copy" year planner you can take in at a glance was (and might still be, but I don't need one in $current_job) far, far superior to what could be seen on an 80x25 text screen :-)

            2. Doctor Syntax Silver badge

              Re: One step too few

              Normal distribution blobs to stick on the C14 age/depth section at the end of hand-drawn pollen diagrams. But with a 1804 & Calcomp plotter.

              1. Doctor Syntax Silver badge

                Re: One step too few

                Oops. 1904

      2. SImon Hobson Silver badge

        Re: One step too few

        Ah yes, the "supposedly the same size but not quite identical" replacement drives. I've been bitten by that a few times when sent a replacement for a RAID member which is a few blocks smaller and so can't be put into the array. More fun when you are "discussing" the problem with a supposedly technical person who can't understand that his replacement "9G" drive is smaller than the original "9G" drive - but they are both "9G", he says ...

        So for a long time, I've tended to make my arrays a few blocks smaller than the drives to give a bit of "wiggle room".
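        The wiggle-room trick above is just arithmetic: size the array member below the smallest capacity you ever expect a "same size" drive to come in at. A sketch - the 100 MiB margin is an arbitrary illustrative choice, not a standard:

```python
def array_member_sectors(drive_bytes, margin_mib=100, sector=512):
    """Size a RAID member a little smaller than the physical drive, so a
    replacement '9G' drive that is a few blocks short still fits."""
    return (drive_bytes - margin_mib * 1024 * 1024) // sector

# Two drives both sold as '9G', but not actually identical:
a = array_member_sectors(9_100_000_000)
b = array_member_sectors(9_060_000_000)
# Build the array at min(a, b); either drive can then stand in for the other.
print(min(a, b) * 512 < 9_060_000_000)  # True: fits on the smaller drive
```

        mdadm exposes the same idea directly via its --size option, which caps how much of each member device the array actually uses.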

        1. druck Silver badge

          Re: One step too few

          You get the same problem with SD cards for the Raspberry Pi. A 16GB card can be anything from 14.6GiB to 14.9GiB, so if you want to be able to clone the card easily, don't let the last partition go past 14.5GiB.
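          The 14.6-14.9GiB spread is mostly the decimal-vs-binary unit gap (card makers sell "16GB" as 16x10^9 bytes; partitioning tools report GiB, i.e. 2^30 bytes), plus a little per-card variation. A quick check:

```python
card_bytes = 16 * 10**9       # '16GB' as the card maker counts it
print(card_bytes / 2**30)     # ~14.90 GiB, as a partition tool reports it
```

          So even a card with zero overhead already reads almost a gibibyte under the sticker figure, and capping the last partition at 14.5GiB leaves headroom below the stingiest card.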

          1. Charles 9 Silver badge

            Re: One step too few

            As long as you don't use that much of the card, yes. Thankfully, there's a Linux program (PiShrink) able to shrink a Pi installation dd'd from the SD card. When restored to a new card, it automatically expands again to fill the remaining space.

          2. A.P. Veening Silver badge

            Re: One step too few

            You get the same problem with SD cards for the Raspberry Pi. A 16GB card can be anything from 14.6GiB to 14.9GiB, so if you want to be able to clone the card easily, don't let the last partition go past 14.5GiB.

            Just make a backup with dd first and run that through PiShrink; on the first boot the clone will expand to the available space.

        2. gnasher729 Silver badge

          Re: One step too few

          Probably over 30 years ago, Apple sold Macs with various 40 Megabyte hard drives. The actual capacities were between 39.5MB and 41MB, but they all came formatted as 39.5MB, so if one broke down, you could replace it and the replacement would work.

          Until some people got smart and formatted their 41MB drives as 41MB, and when those broke down they couldn't be replaced with most 40MB drives...

          1. John Brown (no body) Silver badge

            Re: One step too few

            Only really an issue if you are doing an image copy of the drive. If you are re-installing the OS and restoring files, it's not an issue unless the failed bigger HDD was completely full.

            Backups are not usually 1:1 images of the source data, because backing up all that empty space takes time, even if it does compress (which it likely won't, since much of the "empty" space is almost certainly not zeros, i.e. never-used sectors). That might be less of an issue on modern hardware with on-the-fly compression, but back in the days of 40MB HDDs it could significantly add to the time. I'd still not like to dd an 8TB HDD, even with compression, when it's 50, 60, 70% "empty". That's still a lot of wasted reads.

            1. SImon Hobson Silver badge

              Re: One step too few

              I vaguely recall that back then there were few backup options for Macs, and several of them did simply image the drive. PITA to restore from as well.

    3. Doctor Syntax Silver badge

      Re: One step too few

      If you use the services of a DR company, the greatest need is to check the restore on their hardware. The experience of your first test - or rehearsal if you prefer - can be quite informative. The first time I tried it we ran out of time just about at the point where we had a file system we could log on to, as /etc had been backed up after a lot of stuff that would have been less urgent to restore. Things were changed round so that next time we were in a position to start restoring the database fairly quickly.

      1. Peter Gathercole Silver badge

        Re: One step too few

        I was involved in creating a DR plan where we were told up front that the DR company would provide systems of the right architecture and of suitable size, but were also told that it would not be a like for like replacement (this was fair, not many DR companies would keep IBM SP/2s in their inventory).

        But it did make the design of the plan a lot more complex, especially as while they could provide compatible tape drives, they could not provide a suitable tape library.

        So, we invented a way (this was before official methods were published by IBM) to write mksysb images of the TSM server to non-bootable tape through the control workstation (no directly connected drives in the SP/2 nodes!) which would be installed using a copy of the install CDs, to provide any missing drivers for the new hardware. It was really strange during the development to take an image from a Wide SP/2 node, and install it on a little model 7011-220 desktop system with almost no memory and much less disk space!

        During the DR test mandated by the 1998 electricity de-regulation process, because there was no automated library, we ended up putting all 200+ tapes from the offsite DR set out on a table with everyone and their IT director taking turns to find the tapes to feed into the tape drives on request from the TSM server.

        And it worked! (apart from a slight wobble regarding the TSM default admin password used during the DB recovery). I don't know how true it was, but the aforementioned IT director boasted that we were the only electricity company who passed the regulator monitored DR test first time.

        One thing this taught me was that you really want the NIM server and your TSM server on separate hardware, so you can get these up first to bootstrap the rest of the AIX environment, however complex it is; having them together is a mistake I've seen in too many environments.

        We used the same process some time later to construct a duplicate environment when the supply and distribution arms of the company split, and we had to move one of them to a new location.

        Happy times.

    4. Anonymous Coward
      Anonymous Coward

      Re: One step too few

      Do you fancy trying out whether the National Grid's backups work? Taking things offline to test at full scale is rather problematic!

      Isolated components are of course tested periodically. But as for an all-up test, the first opportunity for that will be a full-blown black start. Not even the 1987 hurricane was enough to need all that.

      1. Anonymous Coward
        Anonymous Coward

        Re: One step too few

        if you ask Putin nicely, he will provide some specialists able to perform a full shutdown of the grid by disabling strategically located substations...

        1. Anonymous Coward
          Anonymous Coward

          Re: One step too few

          Literally counting the days. Scary times IMO.

  5. WanderingHaggis

    I'm not slow -- you just have to explain for a long time.

    I remember our big outage - phone alert - rushing to the site, knowing most of the battery life would be gone by the time I got there. I started shutting everything down cleanly until I suddenly realised that the lights were on again. (It was during the day, so there was some excuse.)

  6. jake Silver badge

    Single points of failure always do.

    I landed a contract to install two big, garage sized, Memorex tape backup robots at a large number-crunching outfit once. Before I bid on the job, the VP of operations gave me the grand tour. He was proud of all his redundancy. He had two power lines coming in to two separate rooms, with a motor-generator, a large battery consisting of dozens of telco-style lead-acid batteries, a generator, and monitoring systems for each room-full of gear. The 48 Volts was switched by a box at the corner where the two rooms met, brought into the main building via a 5" conduit, where it was switched to two separate computer rooms. Even the links between outlying offices were redundant T-1 and T-3 lines. There was a third "data center" that was dark, to be used for spares "just in case". It was designed to provide non-stop operations, and it did a pretty good job of it. Even the Halon had built-in redundancy.

    Until a semi-truck carrying some of my Memorex kit, backing into the receiving dock, went off course and cut the 5" conduit. The security cameras caught the sparks quite nicely.

    Two weeks after getting the tape robots installed and signed off, I had a proposal for a more geographically diverse version of the same thing on the VP's desk. I didn't land that contract, alas.

    1. JimboSmith Silver badge

      Re: Single points of failure always do.

      I had a routine visit to the small racks room at one employer's - I say small because there was a much larger one elsewhere that supported most of the building; the little one just looked after one floor, the one with the most users and computers. On this visit there was a non-tech manager in there with a couple of guests. I asked whether anyone had touched anything and they all said no. I picked up the phone to call building maintenance, and the manager said wasn’t it a lovely temperature in there today? I said no, it’s bloody warm, and then asked very politely for someone from maintenance to get up there and fix the AC.

      The manager had noticed something else on the power supply cabinet on the wall. The cabinet had a main and a reserve supply going into it, and lights to match. What he pointed out was that only three of the four lights on the cabinet were lit. I said that at most three lamps should ever be lit: the left-hand lamp in each pair indicated there was power from that supply, and the right-hand one indicated whether that supply was being used. It should switch over automatically if the main supply fails. You don’t want to see both supplies on at the same time, I patiently explained. He looked very puzzled and went back to his guests.

  7. Admiral Grace Hopper

    Dipstick

    Even the best-considered systems can fail due to the improper use (or absence of use) of the simplest piece of equipment.

    1. John Brown (no body) Silver badge

      Re: Dipstick

      Yep. Given the number of times we see this same story, either as a Who, Me? or an On Call, followed by the numerous Me Too comments, it's clear that anyone installing a backup generator MUST include checking the fuel level as part of the testing procedure. It's obviously a relatively common issue - enough that the suppliers and/or installers should be passing on this information to customers as a "just in case you didn't think of this as a potential problem" sort of note. Maybe a big sign on the generator's on/off test switch too. Likewise, another problem mentioned here is the diesel fuel going off: 6-12 months is the expected shelf life, IIRC. If that cost seems like too much to bear as "waste" (note: beancounters will question the "waste"), then use the genny and save on the leccy bill on the day the tanker is due to refill it.

      1. A Nother Handle
        Joke

        Re: Dipstick

        You're just asking for the fuel delivery to be delayed, followed by a power cut.

        1. Anonymous Coward
          Anonymous Coward

          Re: Dipstick

          We had double backup generators and plenty of diesel on site and in tanks.

          Still got hit when temperatures reached -30 (in Texas). The lights went out for a few days and our diesel reserves froze; additives couldn't be delivered by the energy companies: everyone's diesel had frozen.

          We had two backup sites in pleasant climes, but what got us in the end was a lack of connected wetware to execute the failover procedures.

          The procedures are automated, but operators and customers need to press the buttons (for good reasons).

          Ultimately you have to balance redundancy costs with the cost of lost business in an outage.

          You can't win em all.

      2. Corp-Rat

        Re: Dipstick

        All of the generators at the sites I look after (except the smallest sheds) are specced with a day tank (enough fuel for one day) and a bulk tank (enough fuel for 5 days at full load). Part of the routine checks is "how much fuel is in the tank" and "how much water is in the fuel". Most of our diesel is much older than 12 months and is perfectly fine.

  8. Doctor Syntax Silver badge

    I was told this story yesterday.

    Once upon a time we had a county council*. They used to use snow ploughs to clear rural roads* in winter. Because some of the roads are steep, and snow ploughs are less likely to get stuck coming downhill, they decided to park one at a convenient spot at the top of a hill. When the snow came, some poor council worker fought his way through the drifts to the top of the hill. And found the snow plough's diesel had been stolen.

    * This now seems like a fairy story, hence my use of the traditional opening.

    1. Admiral Grace Hopper

      Rural folk don't steal from their neighbours - after all, we're all in the same boat - but council gear was paid for by us all via the rates*, so it's fair game.

      * or poll tax or council tax, depending on how far back we're going.

      1. jake Silver badge

        "Rural folk don't steal from their neighbours"

        Transplanted city folks sometimes do. Once.

        1. Admiral Grace Hopper

          In my head there's a banjo playing. For the UK readers I live not far from Royston Vasey.

          1. Korev Silver badge
            Gimp

            Do you have lots of Precious Things from The Shop?

            Icon to reflect "The Medusa" in the third series...

    2. jake Silver badge

      I'm sure it happened somewhere.

      I had someone steal the fuel out of a riding lawnmower once. I had topped the thing up (three wimpy US gallons), then headed back to civilization, intending to return the following weekend to mow the verges of our access road. I got back two weeks later and discovered the gas cap removed and the tank empty ... but everything else was exactly where I had left it. This was down a 5 mile driveway, behind three locked gates.

      Judging by the tire marks, it was three kids on dirtbikes.

      1. Doctor Syntax Silver badge

        "three kids on dirtbikes"

        Does your hunting licence cover these?

        1. jake Silver badge

          No. They don't offer stamps for kids on dirtbikes.

          On the other hand, it's nearly always tourist season ...

      2. Peter Gathercole Silver badge

        Stealing more than fuel

        Nowadays in the UK, you would be really lucky to just lose the fuel. There have been a spate of farm robberies over the last few years where the thieves turn up with a large powerful pickup, and chain anything they can find together, especially quad bikes and trailers, in a sort of road train, and drag it off the farm to a convenient quiet location to disperse more conventionally.

        They have even been known to take small tractors this way. If it's not chained down, they'll try to take it.

        1. Doctor Syntax Silver badge

          Re: Stealing more than fuel

          Yes, the neighbours had multiple CCTV fitted quite a while ago.

          1. John Brown (no body) Silver badge

            Re: Stealing more than fuel

            Has it been stolen already?

          2. jake Silver badge

            Re: Stealing more than fuel

            I have cameras (and two-way voice, if I need/want it) out there now. Back then such things were cost prohibitive.

        2. John Brown (no body) Silver badge

          Re: Stealing more than fuel

          A local haulage company was in the news just yesterday. £24,000 worth of fuel stolen overnight through what the security guard said "looked like a garden hose". Mind you, if they'd stolen it last week, it would only have been worth about £20 grand! Fuel is like bitcoin at the moment. Vast fluctuation in value and hard to trace :-)

          1. KBeee Silver badge

            Re: Stealing more than fuel

            A now defunct company I used to work for had a compressor stolen - one of those you used to see being towed behind a lorry to power pneumatic drills.

            About 8 months later a colleague was on holiday in Malta, and there was the compressor being used to dig a hole in the road. Still with the company logo on, they never even bothered to repaint it. The photo he took made it into the company newspaper.

        3. uccsoundman

          Re: Stealing more than fuel

          > The thieves turn up with a large powerful pickup...

          They have large powerful pickups in the UK? I'm actually astonished. From what I've seen on TV, the biggest pickup trucks there appear to be about the size of a tuk-tuk, and would easily fit in the bed of the average-size pickup truck here in the US.

      3. Sam not the Viking Silver badge

        Enterprising thieves

        We had an installation out in the middle of nowhere, and the contractor needed to lay a new supply cable from the transformer, which was on the edge of a far-off field. About 1 km of 3-phase SWA, 50mm². An appropriate trench was dug across the fields to the installation, the cable placed, and all prepared for back-fill.

        That night the whole length was stolen.

        After a delay a new drum was obtained and after laying, the trench was immediately back-filled. Two days later, thieves were disturbed and they fled but not before they had removed the cable and cut it into handy 3-4 m lengths for loading onto a truck.

        We felt that their expertise in handling "bloody-awkward" cable could have been gainfully employed almost certainly at a better rate than the scrap value of the cable.

        1. Terry 6 Silver badge

          Re: Enterprising thieves

          This has been a common thought for a long time. Most thieves tend to be small time, small brain criminals. There are, however, some brighter ones who are clearly using considerable business acumen, management skill etc. And who'd almost certainly be capable of earning a damned sight more, with far less risk of ending up behind bars, or worse - dead - by having gone into a straight business.*

          *There are of course the public school boys with friends in high places, who may not be too bright- but get the best of both worlds.

        2. Norman Nescio Silver badge

          Re: Enterprising thieves

          Oh yes, I've seen this before.

          The technique is to bring a tractor with a lot of torque and firmly attach the cable to it, go down the path of the cable to a suitable access point (manhole) and cut it, or if none is available, dig your own access shaft and cut the cable, then simply pull the cable out. It works on fresh infill as the earth hasn't settled fully yet (even easier if the cable is laid in ducting). Cutting to manageable lengths is easy with an angle grinder or gas-axe.

          A company I worked for lost a lot of cable this way until suitable precautions were taken (which I won't go into).

          NN

          1. Sam not the Viking Silver badge
            Pint

            Re: Enterprising thieves

            As a pure bystander, I was beginning to think you had more expertise in this activity than was truly necessary....... until I read the last line. I think.....

          2. Anonymous South African Coward Silver badge

            Re: Enterprising thieves

            They have perfected the (f)art of cable theft here in South Africa.

        3. TDog

          Re: Enterprising thieves

          I was expecting "Someone stole the trench".

      4. Gene Cash Silver badge

        I had to borrow fuel once from a shed. It was DEEP in the west of bumfuck and I didn't have much choice. I did leave a $20 for probably $1.50 worth of fuel at the time, and a "sorry. thank you" note. I took the minimum I needed.

        1. John Brown (no body) Silver badge

          Must've been a long time ago. Dunno about current US prices, but with the current exchange rate, $1.50 might get you a few miles in a car these days; in the UK it'd get you a pint or two at most.

          1. jake Silver badge

            "Dunno about current US prices"

            Too high. I have seen $7.499/US gallon.

            Fortunately I had my bulk tanks filled about three weeks ago. Hopefully we'll have enough to last the current bit of stupidity.

        2. jake Silver badge

          I have done similar ... Bust a hole in the fuel tank of my truck after kicking up a branch in the wilds between Mono Lake and Yosemite. Climbed a hill to get my bearings (and hopefully a cell signal), and discovered a ranch off in the distance. Hiked about 8 miles with no trail to speak of, only to discover it was an unoccupied off-grid line-camp. Poked around a bit, and discovered a 5 gallon can of gas about half full ... I left a short note and the hundred dollar bill I always carry for emergencies, and carried it back to my rig. Not fun, but after strapping it under the hood, with a length of fuel line running to the pump, I made it back to civilization.

          The following weekend, having located the line-camp on a map, I returned with a now full gas can, and a second as a "thank you". Nobody had collected the hundred as of yet ... so I added to the note, and left both gas cans and the money.

    3. Anonymous South African Coward Silver badge

      Haha, the business park my work resides in also had the diesel theft issue - some ne'er-do-wells (employees from other companies) would come in over a weekend (or after hours) to liberate some diesel belonging to other companies from their trucks...

      ...they got caught red-handed after it went on for quite a while.

      Things are quiet for now, no more diesel theft.

  9. Jaspa

    Diesel is all well and good ....

    .... until your genny explodes mid use.

    Back in the day we experienced a power outage while on a cig break. The clue was in the lights going out and the genny exhausts (about 3ft behind us) bellowing out fumes.

    By the time we walked back in, all was well. Servers running, Management suitably impressed.

    Panic over, latte/cig and smug grin time, then BANG!

    What appeared to be a large cap had blown, taking off the genny door and the passenger window of the car parked opposite.

    1. MiguelC Silver badge

      Re: Diesel is all well and good ....

      I remember in the late 90's a DR test that was supposed to go Mains -> Battery -> Generator

      But instead went Mains -> Battery -> Extremely loud bang in Generator -> fire department called and full building evacuation (and of some adjacent buildings, being in an old part of town) -> 2 days to go back online

      Fun times

  10. Red Sceptic
    Pint

    Practise [sic] makes …

    Usually see “practice” used as a verb (accepted use across the pond, of course).

    Don’t ever recall seeing “practise” used as a noun. Another Register first?

    Have one of these ->

    1. the Jim bloke Silver badge
      Headmaster

      Re: Practise [sic] makes …

      A Practice is the term for a doctor's business, or turf/territory/whatever.

      Presumably if they keep doing it long enough, they'll get good at it...

  11. UCAP Silver badge
    FAIL

    I was told this story some while ago, it may be an urban myth but there are elements that sadly ring true.

    A major data centre in the US of A went all out for redundancy. Dual incoming power supplies, with diesel generators for backup, and a UPS capable of handling the full load for about 20 minutes just in case the generators were slow to spool up. The generators were tested each week and (shock, horror) the fuel tanks checked and kept topped up. The one problem was that the fuel tanks had to be located about 100 yards away for fire safety reasons, but this was solved by installing fuel pumps that kept the generators very well fed.

    Of course the inevitable happened and one day both power supplies failed (the story discusses the rank incompetence of the power company at great length using many pointed words, but I digress). The UPS initially kicked in as planned, then the generators started up and took the full load. About 30 seconds later they spluttered and died, with the rest of the data centre following once the UPS batteries had drained. It turned out that someone, in a fit of unbelievable genius, had wired the fuel pumps so that they could only get power from the mains supply.

    I'll leave the resulting management reaction to your imaginations.

    1. SImon Hobson Silver badge

      Not the first, and won't be the last time.

      The gennys should really have had a local fuel tank to give them a short autonomous running period - with the bulk tanks and pumps only used to top up the local tank. There was a case not that long ago when (from memory) a hurricane devastated part of the USA. A hosting outfit there kept online thanks to their own genny - but they had the same problem: the bulk tanks were in the basement and the lift pump needed the mains. They battled on for some time carrying fuel up the stairs by hand, but soon had to admit defeat.

      My late father also used to tell a tale of how we had a nationwide blackout in the 40s. I don't know what triggered it, but the whole country went dark with just about all the power plants (mostly coal in those days) having shut down. They then found that they couldn't start up most of the plants because they had been designed on the assumption that "there'll always be the grid supply" - so no power to run all the stuff needed to bring the plant up to operating state. It needed some carefully sequenced operations to bring the grid up in stages to get power to all these plants to allow them to get started up. They (I think this was in the days of the National Grid) then had a programme to retrofit small gas turbines to many of the plants to provide a black-start capability, with a knock-on benefit of them being good for peak lopping.

      1. Ian Johnston Silver badge

        "My late father also used to tell a tale of how we had a nationwide blackout in the 40s. I don't know what triggered it, but the whole country went dark with just about all the power plants (mostly coal in those days) having shut down."

        A colleague of mine who worked for the CEGB at Leatherhead told me that the National Grid has never gone down since it was started in the 1930s, and that a good deal of thought and planning has gone into working out how to restart it if the need ever arises.

        1. MJB7

          Black start

          I believe the plan is to use Dinorwig and various hydroelectric systems to start the grid, and then bring up other generators in sequence.

          1. Anonymous Coward
            Anonymous Coward

            Re: Black start

            The black start plan is a bit too complex to explain in full... But broadly it goes something like this:

            1: Start up substation backup generators (diesels, batteries) and restore communications to HQ

            2: Fire up coal or gas generators to create a 50 Hz base, with very limited demand kept online to ensure stability. Form small power islands.

            3: Phased demand restoration in conjunction with the DNOs, firing up additional generators as demand rises

            4: Synchronisation of power islands, and begin meshing back into one grid

            5: Re-establish the nukes, and drip-feed in the windmills while phasing back gas demand

            The real plan would have to vary in response to causes of course, and that's where the operators' skills and years of training really come in.

            Dino is a useful tool, but in a black start it's something of a one shot wonder, because once used you've got nothing.

            A full blown black start hasn't been done in the UK, ever. But there have occasionally been isolated legs.

            Some elements of black start plans need checking periodically. We caught out Torness, whose plan was to form a power island with the generators at Peterhead - even after the latter were decommissioned... It was with much delight we dropped that message on the EDF board meeting. (They now keep tankers full of diesel on standby for that purpose).
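
The phased restoration in the numbered steps above can be illustrated with a toy dispatch loop - a deliberately simplified model, with all island names, capacities and the stability margin invented for illustration rather than taken from any real grid:

```python
# Toy sketch of phased demand restoration (forming islands, then adding
# load blocks only where there is spare headroom). All names, capacities
# and the stability margin are illustrative, not real grid parameters.

def restore(islands, blocks, margin=0.2):
    """Assign demand blocks to power islands, keeping spare headroom."""
    restored = []
    for load in blocks:
        # Pick the island with the most headroom.
        island = max(islands, key=lambda i: i["capacity"] - i["load"])
        headroom = island["capacity"] - island["load"]
        if load <= headroom * (1 - margin):
            island["load"] += load
            restored.append(load)
        # Otherwise the block waits for more generation to come online.
    return restored

islands = [{"name": "A", "capacity": 500.0, "load": 0.0},
           {"name": "B", "capacity": 300.0, "load": 0.0}]
demand_blocks = [100.0, 150.0, 200.0, 250.0]
print(restore(islands, demand_blocks))   # -> [100.0, 150.0, 200.0]
```

The margin stands in for the real constraint, frequency stability: each restored block must stay well inside the headroom of the island picking it up, or the island collapses and you start again.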

            1. Justthefacts Silver badge

              Re: Black start

              Does the black start procedure depend quite fundamentally on having still at least a large minority of fossil fuel generation? As I understand you, the nukes need a lot of power to start, and the windmills can’t provide a stable 50Hz? So the renewables transition…..

              1. Anonymous Coward
                Anonymous Coward

                Re: Black start

                Yes, it does. Which is why Ratcliffe and others are contracted for Black Start services.

                The only way to go fully renewable on black start would be a massive expansion of storage capability.

          2. redpawn

            Re: Black start

            On Oahu in Hawaii in the eighties we had an island-wide outage which resulted when the backup lines for the largest power plant were down for maintenance and a sugarcane fire caused the mains to arc, shutting down the highest-capacity power lines. The resulting loss of power caused a cascade of smaller power plant failures, as they were not able to isolate themselves from the load and shut down shortly after. It took over a day to get the power up, and Hawaiian Electric refused a navy offer to kick-start the grid with power from a nuclear submarine. A similar cascading outage happened a couple of years later when a rat chewed up cables under a street in downtown Honolulu.

            1. John Brown (no body) Silver badge

              Re: Black start

              "It took over a day to get the power up and Hawaiian Electric refused a navy offer to kick start the grid with power from a nuclear submarine."

              Is that a viable solution without special considerations? I've heard it suggested before and I'm fairly sure I remember someone posting that it's "not the right type of electric" or something.

              1. Anonymous Coward
                Anonymous Coward

                Re: Black start

                Posting anon because ...

                It depends where you are. If you are in a 50Hz country then no, it's the wrong sort of lecky.

                But if you're in a 60Hz country - and the Americas are - then it would be OK. You'd need the right cables and connectors which are not carried on board (so would have to be shipped out), and somewhere to connect into with the right transformer voltage (and ability to backfeed without tripping breakers). But in principle, yes you could take power from a submarine, or another warship, or a civvy vessel and use that to get the lights on.

                How much power is available after allowing for their own internal use would depend on the vessel. I have an idea of what that might be for a submarine, but I can't say. It would be enough for that purpose.

                But taking the requirements (see above) into account, it could be more trouble than it's worth, and take too long, to be a useful intervention.

        2. Sam not the Viking Silver badge

          My experience with major power stations in the UK is that they invariably had 'Black Start' generators to 'get things going'. Watching a 70 MW gas turbine start-up and go onto full-load in under 20 seconds was impressively under-whelming! The only way you could tell it was running was by the exhaust.

          1. SImon Hobson Silver badge

            Yes, they did after the 40s blackout, and I would imagine most new stations would have included it in their design (unless they were perhaps twinned with an existing station that had a black start capacity).

      2. Andy A Bronze badge
        Flame

        <The gennys should really have had a local fuel tank to give them a short autonomous running period - with the bulk tanks and pumps only used to top up the local tank.>

        I worked with a chap who had previously worked at a mainframe site with such a setup.

        One weekend (it was not 24/7) the level switch in the header tank failed. The pump carefully emptied the hundreds of gallons of reserves into the 5-gallon tank.

        In the morning the actual floor of the computer room (carefully waterproofed) was flooded with diesel. The smoke sensors under the floor tiles actually floated!

        Luckily diesel is not very flammable, otherwise see icon.

        1. SImon Hobson Silver badge

          That was a poor design then.

          If it was so reliant on a level switch then there should have been some redundancy. But a simpler method (depending on physical arrangement) is to arrange for the overflow to go back to the main tank - that way, you just end up circulating the fuel around.

      3. Timo

        cold start generators

        Took a tour once of the hydroelectric dam (Bagnell Dam) and power station at the Lake of the Ozarks.

        The hydro plant had a set of smaller turbines that could self-generate enough electricity to power the plant and bring it online. They could also provide synchronization for the rest of the larger generators until the plant could be connected back to the grid. I had just finished a power engineering class in college and it blew my mind that I actually understood what was going on.

        1. Imhotep Silver badge

          Re: cold start generators

          The same utility built a dam atop the highest point in Missouri. During the night, grid power ran the turbines at the bottom to pump water from the river at the mountain's base up a shaft through the mountain and into the dam.

          During the day, water was allowed to flow back through the shaft to spin the turbines and generate electricity.

          Basically, a giant wet cell battery.
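
The "wet cell battery" analogy can be made quantitative with the standard potential-energy formula, E = ρVgh. A back-of-envelope sketch - the reservoir volume, head and round-trip efficiency below are illustrative assumptions, not the actual figures for the Missouri plant:

```python
# Back-of-envelope energy stored in a pumped-storage reservoir:
#   E = rho * V * g * h   (water mass x gravity x head height)
# The volume, head and efficiency are illustrative assumptions only.

RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2

def stored_energy_mwh(volume_m3, head_m, efficiency=0.75):
    """Recoverable energy in MWh for a reservoir of given volume and head."""
    joules = RHO_WATER * volume_m3 * G * head_m * efficiency
    return joules / 3.6e9   # 1 MWh = 3.6e9 J

# e.g. a 5 million m^3 upper reservoir with 240 m of head:
print(round(stored_energy_mwh(5e6, 240.0), 1))   # -> 2452.5 (MWh)
```

Even with those modest assumed numbers, that is a few hours of output from a mid-sized power station - which is why pumped storage is the go-to "battery" for grids.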

          1. Precordial thump

            Re: cold start generators

            Giant? Australia's Snowy Hydro 2 (same principle, under construction) will have 2GW capacity.

            1. Antron Argaiv Silver badge
              Thumb Up

              Re: cold start generators

              As a kiddo in the 60s, I went on a fairly complete tour of the Snowy scheme, then under construction. Including underground. No hard hats, hivis or glasses, but Mum was with me. I think I was about 9. I do remember being in one of the giant water tunnels and hoping the valves were shut.

              BTW, one of the biggest pumped storage facilities in the US is just up the road from where I went to uni:

              https://www.wbur.org/news/2016/12/02/northfield-mountain-hydroelectric-station

            2. julian.smith
              FAIL

              Snowy 2

              Will it be operational before the submarines arrive in 2040+?

              It has the same project manager - Scotty from Hillsong

            3. Anonymous Coward
              Anonymous Coward

              Re: cold start generators

              "Giant? Australia's Snowy Hydro 2 (same principle, under construction) will have 2GW capacity."

              "You call that a knife? THIS is a knife!" Aussies....

              1. MCPicoli

                Re: cold start generators

                Itaipu UHE in Brazil just entered the chat. (~11 GW)

                (trying not to draw attention from 3 Gorges Dam people around...)

      4. Anonymous Coward
        Anonymous Coward

        RE: black-start site

        The kids' 5th and 6th grade classes had a tour of the new local peak power plant. My kid talked me into going as a chaperone.

        It was designed to also be a black start site, so it had a room of car-sized diesel generators to get the control room and bigger turbines going if there was no outside power. The plant normally ran off natural gas, as there were two large gas pipelines running nearby. But it also had several large tanks of diesel to hold it over if the gas pipelines were incapacitated.

        I was impressed. The kids were more impressed with the live cat they found in one of the equipment crates when they were building the place.

      5. stiine Silver badge
        Black Helicopters

        If it was the 40s, it's possible that the Luftwaffe had something to do with the outages.

        Obvs, not helicopters but you get the idea.

  12. Barking House
    FAIL

    UPS Failure ....

    Worked for a managed service provider that had a suitably impressive DC with lots of redundancy, all top notch, including a diesel generator that was managed by the local power grid provider (who used it to smooth out power supplies when demand was high). All great stuff.

    However, UPS batteries, as you will be aware, do need some tender care and attention to keep them in great condition. This was done on a regular basis, but unknown to everyone we had a defective battery (that passed inspection), which during peak daytime running suddenly failed and produced an impressive amount of smoke. This triggered the fire system, which duly shut down the DC. We had a bit of fun trying to stop the local fire brigade from axing their way through the doors; we managed to show them the doors were unlocked (the fire system released all locked doors when activated).

    The damage was limited to the UPS battery and some associated kit, nothing dramatic, but then the fun started with explaining why the UPS/generator et al did not keep the DC up. It would seem obvious that in the event of a fire the DC power would be shut down, but this seemed to be a surprise to senior leadership and a number of the customers .......

  13. Roger Greenwood

    Only one generator?

    Amateurs...

    The other common failure is a flat battery - I've seen/heard about this so many times from those who don't test them regularly (you know, the not very important places like hospitals).

    1. firu toddo

      Re: Only one generator?

      LOL, we had two UPS's and servers with twin PSUs

      And several racks powered exclusively from one UPS.

      And that UPS had a defective battery. Can you say recover SQL cluster?

      1. Norman Nescio Silver badge

        Re: Only one generator?

        At least powering several racks from a single UPS was intentional, if unwise.

        At a client site there were the usual two UPS's and racks had dual PDUs so you could power equipment with more than one power-supply from different protected circuits. All fine and dandy. Until one day some maintenance was required on the main panel, which meant powering down one of the feeds. This should not cause a problem, everything was redundantly powered, right?

        Electrician trips the breaker and a row of racks powers down.

        It turned out that at the data-centre build someone had accidentally wired both the A and B feeds into a single UPS feed, rather than A to one and B to the other, just for this row of racks.

        The client was understandably unhappy with the electrical contractor responsible for the initial build.

        1. John Brown (no body) Silver badge

          Re: Only one generator?

          "The client was understandably unhappy with the electrical contractor responsible for the initial build."

          Yes, but probably after bawling out the electrician who turned it off in the first place. After all, he pulled the switch, and surely he should have known - that's his job, innit? </sarc> Hopefully they didn't fire him before the investigation found the root cause.

        2. David Hicklin

          Re: Only one generator?

          "At a client site there were the usual two UPS's and racks had dual PDUs so you could power equipment with more than one power-supply from different protected circuits. "

          Or you find out that when powered from just one feed it exceeds the supply capacity and trips out

    2. Anonymous Coward
      Anonymous Coward

      Re: Only one generator?

      A long time ago I had a client with a genset onsite, which had been there for even longer. Much of the site had been updated, and various modifications had been made to the genset control panel. But everything was fine, as the genset was tested on a regular basis. It was even tested online occasionally: by pressing the start button, then pressing the main changeover switch.

      It doesn't take a genius to understand that the control system for the genset needs to run from the genset's 24V DC battery, and that the battery needs to be constantly float-charged. So why some electrician had, at some previous point, added a couple of extra 240V AC relays into the panel was a mystery. Relays that obviously had power when the genset was tested, and all was fine. Relays that, when the site mains failed, just sat there refusing to budge and were unable to start the genset......

      1. The Oncoming Scorn Silver badge
        Black Helicopters

        Re: Only one generator?

        This may be two separate stories related to me... but merged into one thanks to grey-cell degradation & the years since they were told to me.

        UPS battery feeding an Air Traffic Control tower (icon) in the military somewhere. Top brass arrives, inspects & demands a battle-readiness test with a "full outage" to test the UPS, with the fate of the free world held in the balance. Said battery backup fails instantly.

        Angry top brass demands a full report into said failure within 1 day, which was delivered by independent personnel & which revealed that the Master Sergeant in charge of battery refilling - signed off as part of his daily duties - had merely signed the sheet & skipped the whole arduous eyeball & top-up, & the whole thing was covered in pastry flakes & traces of tea\coffee, as it was very nice for slow (re)heating of cold pasties & pies & keeping beverages warm.

        Said sergeant, one fast court martial later, found himself rapidly busted down to corporal.

    3. Antron Argaiv Silver badge
      Thumb Up

      Re: Only one generator?

      At home, I put a label on my UPS with the install date for the batteries. Right up front where it's clearly visible.

      I find I get 4 or 5 years out of a set of batteries.
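
The label habit is easy to back up with a reminder script - a minimal sketch, assuming a five-year service life (a rule of thumb, not a vendor spec) and hypothetical dates:

```python
# Minimal battery-age check to go with the install-date label.
# The five-year threshold is a rule of thumb, not a vendor spec.
from datetime import date

def battery_due(installed: date, today: date, max_years: int = 5) -> bool:
    """True once the battery set has passed its assumed service life."""
    return (today - installed).days >= max_years * 365

print(battery_due(date(2017, 3, 1), date(2022, 6, 1)))   # -> True
```

Wire it into a cron job or monitoring check and the label becomes a nag rather than a thing nobody reads.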

  14. Anonymous Coward
    Anonymous Coward

    Single point of multiple failures.

    I was a member of the ops team in a large multi site organisation. The primary site had redundant data links to the other sites. Two shiny new high capacity links provided by a large telco famous for many, many other things too....

    I was called in one day to check the cross site links. They were all down. Famous telco swore blind their network was fine, just not getting data from us.

    So it must be a problem at our end ;-( We're up, the links are up, just not getting a response from them on either line.

    The two independent links left our site, by two different routes, and disappeared into the famous telco's network. Both went into the same famous telco's building, in the same room....

    Turned out it wasn't a problem at our end ;-)

    1. SImon Hobson Silver badge

      Re: Single point of multiple failures.

      Getting truly diverse routing is very hard, and it's been getting progressively harder. Once it might have been sufficient to have your lines routed to two different exchanges, but these days, you find that most exchanges are simply remote line cards to one central exchange. And you can't even use different providers since they mostly rent stuff from each other.

      Even the people who plan and manage 999 call centres have fallen for this in the past.

      1. Arthur the cat Silver badge

        Re: Single point of multiple failures.

        These days you probably want two (ideally but probably not) independent fibres, a mobile link and a satellite link for backups and a pigeon loft for RFC 1149 backup backup.

      2. Allan George Dyer Silver badge
        Facepalm

        Re: Single point of multiple failures.

        @SImon Hobson - Something like this?

        1. teknopaul Silver badge

          Re: Single point of multiple failures.

          AWS is getting close to that.

      3. midgepad

        Requires

        actual walking round, going and seeing - a management function.

        Not always successfully done in the NHS some years ago.

  15. Lazlo Woodbine Silver badge

    Many years ago I worked in a large department store that had a frozen food concession on the ground floor with a huge walk-in freezer in the basement.

    The shopping centre gave us advance notice that they were cutting the power in a week for essential work, so I was sent up to the roof to check the big Cummins generator.

    It fired up OK, but upon dipping the tank I found it was all but empty, so we ordered a barrel of diesel, which was duly delivered the next day and connected to the pipe in the loading bay to be pumped 4 floors up to the roof - except it had been that long since we'd used the pump that it had seized up.

    We put the barrel in the goods lift, took it up to the top floor, then had to carefully drag the very heavy barrel (about 200 kilos) up the last flight of stairs to the roof, where yours truly had the wonderful task of siphoning the fuel from barrel to tank.

    Genny tested again, all good.

    A week later, the shopping centre gave us the good news that they could carry out the work without cutting our power...

    1. Martin-73 Silver badge
      Pint

      Upvote for the last sentence.... typical But at least it's up there for next time ;)

      Icon coz i expect you needed a few

  16. Mark #255

    Risks of testing your systems

    I was told this by a risk assessment specialist (on a project that was building railway stations).

    Fire alarm systems get tested periodically, but if there's an integrated sprinkler system, you obviously don't want to actually cover the building in water. So, there's a diverter valve (or several), which can be set to channel the water to a test outlet, enabling a test of the active parts of the alarm-plus-sprinklers.

    And the interesting risk here is that there's a non-zero chance that after the test, someone forgets to re-divert the output back to the actual sprinklers!
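
One mitigation is a diverter controller that reverts to the live position by itself once a test window expires, so a forgotten valve fails safe. A hedged sketch - the class and its interface are hypothetical, not any real fire-panel API:

```python
# Sketch of a sprinkler diverter that reverts to the live position once
# a test window expires, so a forgotten valve fails safe.
# The class and its interface are hypothetical, not a real fire-panel API.
import time

class DiverterValve:
    def __init__(self, test_window_s=3600.0):
        self.test_window_s = test_window_s
        self._test_started = None          # None means live position

    def start_test(self, now=None):
        self._test_started = time.monotonic() if now is None else now

    def position(self, now=None):
        """Return 'test' only while the window is open, else 'live'."""
        if self._test_started is None:
            return "live"
        now = time.monotonic() if now is None else now
        if now - self._test_started >= self.test_window_s:
            self._test_started = None      # auto-reset to the safe state
        return "live" if self._test_started is None else "test"

valve = DiverterValve(test_window_s=10.0)
valve.start_test(now=0.0)
print(valve.position(now=5.0))    # -> test
print(valve.position(now=15.0))   # -> live (auto-reverted)
```

The point of the design is that the dangerous state (water diverted away from the sprinklers) cannot persist without someone actively renewing it.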

    1. lostsomehwere

      Re: Risks of testing your systems

      I have one of those in one of my premises, the testing is controlled by a panel that automatically resets to a safe state after use.

      1. Antron Argaiv Silver badge
        Unhappy

        Re: Risks of testing your systems

        ...MOST of the time.

        :-)

        Murphy cannot be fooled.

  17. big_D Silver badge

    Blissful ignorance

    I worked at a site, where the IT on the desks (computer and monitor), plus the telephone system were supplied on a dedicated circuit with UPS and generator in the basement. Lighting and the normal electrical devices were powered on a normal circuit and not backed up with the generator. The whole thing was supposed to power the server room and desk-side IT for around 30 minutes, time for everyone to save their work and cleanly power everything down.

    That was the theory...

    We (3 consultants) were sitting in the office when suddenly all our PCs lost power. Cursing and swearing ensued and we tried the lights: nothing. Then we went into the corridor: nothing, silence...

    We found the finance manager wandering the hall and asked, "wasn't the generator supposed to kick in? Our PCs just switched off without warning!"

    "Oh, the power has been off for over 30 minutes. Didn't you notice the lights going out?"

    It was the middle of summer, so we didn't have any lights turned on; we were sitting in a sun-filled office with the windows open, so we just didn't notice that the power had gone out, and the IT power delivery had no warning system to alert users that they were running on backup power...

    1. SImon Hobson Silver badge

      Re: Blissful ignorance

      We were on the receiving end of that with a large (very large, international, tier 1) comms provider.

      There we were, sat at our desks one morning, and our internet went off. Did the usual checks, confirmed it wasn't our end, called the vendor's support desk. By the end of the call, the guy said (something like) "there must be something serious kicking off, I've never seen the call queue hit 40 before". Shortly afterwards, our internet came back on, and then we could see that a major customer (science park, some people paid more per hour than we get a week or even month) was offline. At this point we couldn't get to the support desk as their lines were all engaged. Eventually we did get through, to be told there was a major outage in a London network centre and it was going to be some time. It was the early hours of the following morning when the site came back online.

      To be fair, the provider was quite open about the failings. Their network centre had all the dual supplies, battery backup, etc, etc. One TRU (transformer rectifier unit) had tripped its AC breaker - but they had no monitoring to tell them. Eventually the other TRU failed - but they had no monitoring to tell them. They were now running on battery power, but their monitoring didn't tell them the batteries were getting low. Then the batteries ran out and the site went dark. They had a genny on site - but it didn't fire up as both AC supplies were showing as OK.

      Techie dispatched, switched on the tripped breaker, the OK TRU powered up, and the equipment came back on - that's when our internet came back. But, one of the routers (the "rack sized, gazzillion fibre ports, each carrying many virtual circuits" type) didn't come back up. Multiple hardware failures, plus they found they didn't have a usable config backup so admins had to sit and manually reprovision all the circuits by hand - so their customers were coming back online one by one as this happened.

      Their report on the incident stated that "monitoring was to be improved".
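      For the curious, the check that was missing amounts to very little code. A minimal sketch (my own illustration, not the provider's actual system) of classifying a UPS's state from two standard UPS-MIB values - upsOutputSource (.1.3.6.1.2.1.33.1.4.1) and upsEstimatedMinutesRemaining (.1.3.6.1.2.1.33.1.2.3) - and deciding whether to page someone:

      ```python
      # upsOutputSource values from RFC 1628: 3 = normal, 5 = battery
      OUTPUT_SOURCE_NORMAL = 3
      OUTPUT_SOURCE_BATTERY = 5

      def ups_alert(output_source, minutes_remaining, low_water_mark=15):
          """Return an alert string, or None if all is well."""
          if output_source == OUTPUT_SOURCE_BATTERY:
              if minutes_remaining <= low_water_mark:
                  return f"CRITICAL: on battery, ~{minutes_remaining} min left"
              return "WARNING: running on battery"
          if output_source != OUTPUT_SOURCE_NORMAL:
              return f"WARNING: unexpected output source {output_source}"
          return None
      ```

      Poll those two OIDs every minute or so with whatever SNMP tooling you like, feed the values through something like this, and the "batteries getting low" stage stops being a surprise.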

      1. John Brown (no body) Silver badge

        Re: Blissful ignorance

        "but they had no monitoring to tell them, eventually, the other TRU failed but they had no monitoring to tell them."

        On a much smaller scale, I was sent off to a data centre to be "a pair of hands" to replace a failed HDD in a RAID for a worldwide online payments processor. While chatting with their desk-based support op, who was triggering the HDD LED to make sure I pulled the correct drive, I mentioned that only one LED was lit of the two for the redundant PSUs. He was a bit surprised, as it was supposed to be monitored, but clearly was not. I went back a day later to meet the overnighted PSU and swap that out too. Apparently they were quite impressed that I spotted the issue and reported it. I guess they had used companies with less experienced people to be "a pair of hands" previously.

        1. A Nother Handle

          Pair of hands

          Touch, but don't look.

  18. Sequin

    I worked at a site that was the maintenance centre for emergency services comms equipment. We were installing a stock control system running on a Prime mini and they insisted that they could do all of the cabling themselves, being electrical and electronics engineers. They partitioned off a corner of the main office and installed the power supplies and the UPS system that we had specified, which was sufficient to run for about 30 minutes to allow a controlled shutdown.

    One day contractors working on the road outside put a digger through the power cables. Everything on site went down like the Titanic - including everything in the server room.

    It turns out that they had installed circuit breakers ("for safety") that sensed the incoming supply - but they installed them between the UPS and the server, meaning that when the supply went down there was no connection between the UPS and the server!

  19. Terry 6 Silver badge

    This is well beyond my knowledge and experience but....

    ..the common theme isn't.

    Planning, and that includes disaster planning, is actually a form of modelling.

    Which means that you don't just plan for a specific point or problem.

    You systematically work through every link in the chain, including the known risks and remediations, then examine each one for process, points of failure and consequences.

    What could happen? What could stop it happening? What needs to happen to avoid it? What would the results of it happening be? And that includes the consequences of the precaution. Even little things like, say, a fire in a waste bin. It's all very well making sure that the alarms etc. will be working, doing the fire marshal thing, evacuating, calling the fire brigade, using the right fire extinguisher if safe to do so, but also remember to get the extinguishers replaced afterwards if you do use them.

    In my case it would not just have been [vacate school] but also, once we've done this: where we put the kids, how we contact the parents, how we keep in contact with them until there is a solution, and what that solution could be.

    And yes, more than once I saw the failure to do those extra steps in the process, e.g. what do we do if a school is out of action for several weeks. Yes, I did have the experience of trying to get the local authority to put plans in place for these elements. And I've seen the failures of process because these details were ignored.

    For example, a school vacated for a gas leak, so the normal fire planning worked beautifully, leading to kids getting out of the building really quickly and sensibly, but then being kept in the playground (highly dangerous if the thing had gone boom!) while the school tried to find a safe place for them, because there was no access to the parents' phone numbers and no plan in place for emergency evacuation beyond the fire drill.

    Or two other times when a school has had to be closed at short notice and there was just nowhere for the kids to go for the several weeks required because no one had planned beyond the immediate crisis. Yes, it's a once in a decade event. But stuff happens. And the model needs to fit.

    1. Anonymous Coward
      Anonymous Coward

      Re: This is well beyond my knowledge and experience but....

      Ah yes, the whole business continuity thing.

      <cough> Decades ago, I got casually told by the FD to "write disaster recovery plans for the IT" - because it had been flagged up during an audit (insurance, auditors, parent company - I lost track of who was telling us to do what). So I was thinking along the lines of "well, I've got some ideas" and asked on a mailing list for some pointers. The timing was fortuitous: someone on the list had a course coming up that was short of people to make some of the gaming elements work well, so I could come at cost as long as I didn't tell anyone how much we weren't paying ;-)

      It was ... very informative.

      When I got back, I was able to ask the director what the BC plan said our recovery objectives were - and then we could plan to meet them. BTW - did I mention this was a "zero budget" project ? "Don't be awkward, just get on with it" was all the help we got. So we did what we could in terms of documenting our existing processes etc.

      A short time later we had a fire alarm and everyone was stood outside in the rain. I'd grabbed my jacket off the back of my chair as we left, so had my car keys - and the several brollies I had (amazing how you collect this sort of junk) were immediately put to use. It was a false alarm, but as the light was fading I casually sidled up to another director and asked along the lines of "if this was a real fire, what would you do with all these people stood outside getting cold in the rain?" before mentioning that the BC plan would probably have involved the village hall half a mile away - who were the keyholders for it? How would they get people home, how would they get into their houses without their keys, how would they tell them when to come back to work...? "Humph" was the response I got to that. No, we had no plan, no emergency contact list, nothing at all, because "it's an IT problem" apparently.

      Well, I tried !

      1. Terry 6 Silver badge

        Re: This is well beyond my knowledge and experience but....

        It's amazing and quite saddening how many people think that if you cover the hole it goes away.

        Until someone steps on it.....

      2. Korev Silver badge
        Flame

        Re: This is well beyond my knowledge and experience but....

        I used to work a place where one of the labs went more exothermic than desired... Some people had to change into special clothing to perform their role.

        Everyone got out quickly and safely, and then over ten fire engines showed up and the fire was soon out. People were waiting in the canteen, but they weren't allowed back in until it had been proven safe (which ended up taking days). Those people in the specialist gear didn't have their keys with them, which meant they couldn't get home. In the end I think the site head ended up going to the bank and giving people money.

        One other thing that wasn't planned for was that all the senior people who were handling the mess were making a huge number of calls on their mobile phones which soon ran low on battery. Round about this time the managers went from Nokia bricks that lasted over a week to smart phones; no one had thought to add phone chargers to the room used to coordinate emergency responses.

      3. Binraider Silver badge

        Re: This is well beyond my knowledge and experience but....

        Thankfully, in my experience writing BC plans in my last two employers did go slightly beyond the check box compliance that you've been subjected to.

        It is not exactly difficult to understand concepts of insurance, risk and reward. Directors that don't get that and regard it as zero budget/not their problem don't understand their own job (at least no further than getting bonuses as long as things are working).

        Facilities for DR; comms, backup computers etc need budget; procedures for transferring personnel or bringing in other personnel needed. Data transfer / restoration from backups. But I am preaching to the converted here.

        Who grades directors on performance and on what criteria?

    2. Down not across Silver badge

      Re: This is well beyond my knowledge and experience but....

      You mentioned a funny little thing that so often doesn't get thought about.

      And that includes the consequences of the precaution.

    3. Andrew Beardsley

      Re: This is well beyond my knowledge and experience but....

      My contribution to this was to point out that the DR/BC plans were all stored on a network share which would probably be unavailable if needed. Solution was that the BC team were told to periodically replicate onto their laptops. No idea if they really did this as I was not involved in the testing.

      One of the last remaining clerical admins had the job of printing out 6 sets of the most vital BC information quarterly and distributing them to the team tasked with BC management. Pretty sure that nobody took on that task when she retired. Left that job many years ago, so not my problem.

      1. Terry 6 Silver badge

        Re: This is well beyond my knowledge and experience but....

        Ah yes. I used to have a BC plan on two sheets of A4. With copies, some on various electronic media, including ones I could access remotely.

        But the council decided to insist on a central BC database. So f****ing complicated that it was impossible to navigate to set up, let alone get anything useful out of. And of course the only useful bit was the contents of my two A4 pages. A system bought in at great expense, clearly designed to be operated by an administrative department - not a bunch of front-line professionals with 2 hours' training who already have a full-time job. The key point about it was that all the bits of information were fragmented. Think Christmas tree with different bits of information on each branch: some branches having no role, some having a role that didn't seem to mean anything, and some requiring information that was non-essential. None seeming to relate or connect to anything adjacent. All navigated by a raft of non-standard icons - some of which may or may not have done something important, but God knows what.

  20. Diodelogic

    Flashlight

    One day at work, someone digging in a nearby field cut the main power cable to our building. Everything went dark, then the emergency lights came on. The two enormous generators in the back of the building failed to start for some reason and our facility was dead. We were told to just relax and take it easy while the power was being restored. Whilst waiting, I dug out my flashlight from my briefcase and took it with me to the men's room, as I remembered there were no emergency lights there. I was right, but I had no trouble, ah, taking care of business. Turned out the men's room was pretty well occupied at the time, including a lot of guys who used to wonder why on earth I had a flashlight in my briefcase. When I tried to leave there were yells of "HEY! WHERE ARE YOU GOING?!" from the stalls.

    Never got questioned about the flashlight again...

    1. SImon Hobson Silver badge

      Re: Flashlight

      At a previous job, the lights in there were on an occupancy sensor. I suspect most of you can fill in the rest! Think light traffic, taking your time reading the paper, lights go out because you aren't in the sensor's field of view. I wasn't the paper-reading type, but apparently it came down to a choice between waiting for someone to come in and trigger the lights, or opening the door, waving your arm about and hoping you're still alone.

      1. Andrew Beardsley
        Facepalm

        Re: Flashlight

        I had a similar experience when doing some cabling in a server room. Crouched down behind a rack with several floor tiles lifted and was out of sight of the sensor. After a fairly short delay, all the room lights go out and I am stuck with trying to get to a point to trigger the sensor without falling down any of the holes from the lifted tiles.

        After I complained about that, an override switch was added to the lights.

        1. jake Silver badge

          Re: Flashlight

          I always carry an AAA-powered single-cell Mag-lite. Little bits of good light are more useful more often than you might think.

          Yes, I know, I should swap it for a more modern rechargeable light with an LED bulb ... trouble is I can't find one that is as easy to carry as what I have now AND seems to be built for the long haul like the 30+ year old unit in my pocket.

          I'm open to suggestions.

          1. teknopaul Silver badge

            Re: Flashlight

            Use a phone.

            It ain't built for the long haul, but you have many reasons to replace it regularly, keep the battery charged up and have it on you, and it's a better indicator of how much time you have left.

            Phone torches are efficient and batteries are top notch.

            It's surprising how much more convenient flat sides are compared to a barrel shape, if you want to prop the light up somewhere, e.g top of a rack.

            1. jake Silver badge

              Re: Flashlight

              0) The phone's light isn't focusable.

              1) The phone's light points in the wrong direction.

              2) The phone doesn't fit in my mouth.

              3) I do not now, never have, and never will, carry a so-called "smart phone".

              If you're worried about it rolling, any number of typical office accoutrements can stop it ... a rubber band, a postit, a bit of tape, a paperclip, a twist-tie, a penny ... In the last thirty years I've never found this to be a problem, though. See "mouth", above.

              1. Charles 9 Silver badge

                Re: Flashlight

                If you really need your hands free while using a light, try a head lamp. There are focusable varieties out there, and most have flat surfaces.

                And pretty soon, it's gonna be smart phone or bust. When that happens, are you gonna go, "Stop the world! I wanna get off!"?

    2. Richard Pennington 1

      Re: Flashlight

      Many years ago (early 1990s), I worked in a small firm in Cambridge (small enough that generators were not an option). One evening, all the lights and power went off, and stayed off.

      I had done a PhD in astronomy, one of the benefits of which is that I know how to move around safely in the dark. When I got to the front door I observed that the street lights were off - there was a power cut to a significant portion of the city. I made my way back to the working area and let the team know that the problem was outside, not inside.

      1. Andy A Bronze badge
        Thumb Up

        Re: Flashlight

        Some time ago I was at a railway depot, clearing off some overdue tickets which the local staff had trouble with, when the room suddenly went dark.

        Enquiry resulted in information that someone with a digger had severed the Big Cable, so there was no chance of clearing the jobs that afternoon.

        I shut down the local server before its little UPS died, grabbed my things and headed over to the carriage shed to explain that I was departing.

        I commiserated with them, knowing that they would be unable to brew up.

        "No problem," they said. "This buffet car gets its power from the overheads. As much hot water as we like!"

    3. Korev Silver badge
      Thumb Up

      Re: Flashlight

      including a lot of guys who used to wonder why on earth I had a flashlight in my briefcase.

      This is why I keep a torch (and my glasses) on my bedside table.

      1. msobkow Silver badge

        Re: Flashlight

        I hate to admit it, but I find the flashlight mode of my cell phone very handy when I'm on the balcony and need a light in the evening... it isn't a _great_ flashlight, but it gets the job done for finding things you dropped on the table or floor.

  21. Anonymous Coward
    Anonymous Coward

    Private discussion... nudge nudge

    > "we had a couple of people who'd decided the lift was a good place to hold 'a private discussion'," explained Philip

    Fnarr!

  22. Boris the Cockroach Silver badge
    Happy

    Pretty

    much guessed 'no fuel in generator' or 'UPS battery being a smoking piece of wreckage' before reaching the end ...

    Along with various C-level twonks yelling "Its been tested every week.. blame IT "

    <<looks at the ever growing pile of dead laptops... and praises his backup policy........<panicky feeling start kicking in> <twitch><twitch>

  23. Norman Nescio Silver badge

    The barrel of diesel and the Alfasud

    Time for me to link to a previous war story. Probably relevant.

    https://forums.theregister.com/forum/all/2018/04/05/this_damn_war_power_out/#c_3477552

  24. frabbledeklatter

    Plenty of Oil, But ...

    Huge generator along one side of the building had not been tested in quite a while. Someone rightly decided that it needed testing, which they initiated late one morning. The engine had been sitting so long that it started with an immense belch of black smoke, behind which the four-story building disappeared for a minute or so. What little breeze there was wafted the smoke into the building's air handling equipment. Fire alarms went off throughout the building. Everyone got out to the parking lot safely, to wait for the Fire Department to show up, ventilate, reset the alarms, and declare the building safe three hours later.

    It's one thing to show up at home in the evening with a faint smell of perfume and a bit of lipstick on the collar. Coming home reeking of diesel fuel and a bit sooty is quite another.

  25. Tom 7 Silver badge

    and it all failed because some oik forgot to buy a few pounds' worth of red diesel

    which will have been down to the fact that someone somewhere hadn't added it to the list of sign-offable checks. I am fairly certain the oik was not really at fault.

  26. Will Godfrey Silver badge
    Happy

    Just a small example

    I was working for a smallish electronics company, in an out-of-town setting. Everything was on one floor in a pretty much open-plan design. There was a power failure, and without thinking I just reached into my toolbox and got my torch, so did the other guys. The manager's door opened and observing the scattered pools of light an amused voice came out with "You can see who are the engineers".

  27. Shez

    Does Diesel not go off?

    I know petrol does - well, not necessarily goes off, but it degrades over time. I would have expected diesel to as well, necessitating a periodic complete replacement of generator fuel supplies (ideally before it goes off, so it can be resold while still good).

    Edit... just googled it and yes, it does - apparently it goes gummy. However, with the right additives and storage it can be kept for around 12 months.

    1. Andy A Bronze badge

      Re: Does Diesel not go off?

      Current diesel comes with a compulsory bio-component.

      This encourages the growth of certain algae, which gum things up much faster.

      1. Anonymous Coward
        Anonymous Coward

        Re: Does Diesel not go off?

        Yes, this is a bit of a bugger when you have diesels that you only want to fire up once a year to prove they work... And in the meantime the fuel rots away.

        Of course, being red diesel you can't just ferry off the fuel to use it on something else very easily.

        Seriously looking at H2 fuel cells, even Hydrogen turbines. Minor concern that one needs to source Green hydrogen or buy / pool an Electrolyser.

        Red diesel tax breaks are of course being reduced in the near future too, so the number of reasons for looking at alternatives is rising rapidly.

        A/C, but for those interested our outfit are issuing a call for innovation later today via the ENA on the subject.

        1. jake Silver badge

          Re: Does Diesel not go off?

          So run it on natural gas, with propane as a worst-case scenario backup. Propane stores virtually indefinitely, and most generators make switching between the two fairly painless. (Same for most household appliances, BTW ... switch gas source and jetting and Bob's yer Auntie.)

          No space in your yard for a large propane tank? Look into getting it buried.

          1. Anonymous Coward
            Anonymous Coward

            Re: Does Diesel not go off?

            Not compatible with net zero objective. Doing stuff to achieve that earns money. Ergo Net Zero hardware preferred.

  28. Zakspade

    Seeing so many Comments on the end of this 'On Call' piece makes me realise that there is probably something wrong at my end. It wasn't linked from the On Call page as usual - so given you guys Commented, it must just be something not refreshing at my end.

    Pretty dumb not checking the fuel level though...

  29. Bogbody

    UPS ...

    Ah yes, UPS - worked well ...... telesales :-)

    Except ......

    Make sure it's connected to the same phase as the mainframe - when power fails, all 3 phases don't always go down at the same time.

    The PCs "upstairs" stayed on, but not the database server on the ground floor.

    Whoops .....

  30. yetanotheraoc Silver badge

    A fine story but ...

    Whatever happened to the people on the lift? (The reason Philip was there in the first place.) Or will that be featured in a future Who, Me?

  31. Anonymous South African Coward Silver badge

    Just got back from a site visit.

    New UPS was installed, HyperV DR site was chugging along smoothly.

    I was there to test the UPS after setting up monitoring via SNMP.

    The wise decision in such matters was to shut down all the running VMs on HyperV, but leave the hosts running, and manually kill power to the UPS. I was definitely not in the mood to have a corrupt SQL database laughing at me should the UPS fail for some reason.

    Everything worked. UPS sang its song of its people, servers etc kept running, and things looked good.

    Flicked the circuit breaker back to on to restore juice to the UPS... which beeped after a minute.

    Noticed the CB had tripped, so I reset it, restoring power to the UPS...

    ...which started to beep again after a minute. Yup, CB tripped again.

    I then took a look at the CB rating.

    6A

    Somehow the CB managed the load of the UPS plus servers plus cabinet without tripping, but when the batteries jumped onboard to be charged, the extra current was just enough to trip the CB.

    Jury-rigged a connection to a 20A CB for the time being, until the booboo could be fixed properly.

    Luckily the UPS and batteries were big enough to keep things ticking over for long enough for us to get the jury-rigged electricity supply in.

    Manglement was not amused.
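    The arithmetic that bit us is worth spelling out. A back-of-envelope check (illustrative figures, not our actual load) of why a breaker that holds the steady load can still trip once the UPS starts recharging its batteries:

    ```python
    def breaker_ok(load_watts, recharge_watts,
                   mains_volts=230.0, breaker_amps=6.0, margin=0.8):
        """True if total current stays within `margin` of the breaker rating."""
        total_amps = (load_watts + recharge_watts) / mains_volts
        return total_amps <= breaker_amps * margin

    # Steady load alone: ~1 kW through a 6 A breaker at 230 V is ~4.3 A -- fine.
    # Add ~400 W of battery recharge and you're at ~6.1 A -- trip territory.
    ```

    Which is exactly why the breaker only let go once the flat batteries started drawing charge on top of the cabinet load.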

    1. Will Godfrey Silver badge
      Facepalm

      A small point:

      A CB might be rated at 6A when new, but it won't hold 6A 10 years later when it's been sweating most of that time.

      1. J. Cook Silver badge

        That, and a lot of breakers really don't like being run right at their rated load...

  32. Anonymous Coward
    Anonymous Coward

    What could possibly go wrong?

    My story is along slightly different lines. Being in a new building, we are fairly well protected for a power failure, in so far as our server room and remote comms rooms will stay up for about two hours according to the UPS display panel. Considering we have several servers, virtual and physical, network switches, SANs and a large number of PoE devices including CCTV, wireless access points and telephones, two hours is pretty good.

    But there is a 'new-building need' to perform a black start. During the meeting to discuss everything that would be getting tested, I asked if there would be a specialist from the UPS installers on site in the event of anything unexpected. "Oh yes," they lied. We had decided that we wouldn't allow the batteries to run down during the building tests, just in case there was a mains failure during the restart and the batteries didn't have enough reserve for us to shut everything down again. Come the day, we shut down all our servers and SANs, leaving just the switches and PoE stuff to die when the UPS dropped the load (we had offline backups of the configs). The kit was to be off for a couple of hours so we knew that was a risk, but it turned out it was off for two hours longer than planned.

    After the tests were complete, they couldn't get the UPS to restart. I pondered whether it perhaps did not like restarting under load, but none of the contractor's people seemed to pay any attention. In the end, guess what? No - the UPS does NOT like starting up under load. They had to visit every location where there was kit protected by the UPS, open up the power distribution panels and turn it off at the main breaker. Then they could restart the UPS.

    We lost some kit. Nothing vital but it did mean some rejigging of virtual servers.

    1. Norman Nescio Silver badge

      Re: What could possibly go wrong?

      After the tests were complete, they couldn't get the UPS to restart. I pondered whether it perhaps did not like restarting under load, but none of the contractor's people seemed to pay any attention. In the end, guess what? No - the UPS does NOT like starting up under load. They had to visit every location where there was kit protected by the UPS, open up the power distribution panels and turn it off at the main breaker. Then they could restart the UPS.

      Diesel generators don't like starting under load, either. You bring them up to speed, then apply the load, ideally slowly (like a clutch on a manual gearbox in a car).

      I attended a DR test where the UPS did its job, the generator started up, and the time came to switch in the generator. The generator promptly stalled.

      Not long after that, a much larger shiny new generator was installed. This one could power the aircon*.

      *It turns out that while people are very careful to size a UPS so that the precious IT equipment gets all the power it needs, there is a tendency to forget what happens to the waste heat. Nobody sane runs aircon off a UPS, not least because most places have sufficient 'thermal inertia' so that a brief interlude is OK while the IT equipment is running off UPS in the wait for the generator to start up. However, if someone has sized the generator solely for the UPS, your datacentre will overheat while running on the generator as there is no power for the aircon. If the aircon and UPS are fed from the same 'dirty power' feed from the generator you will have a problem.
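      A rough sizing sketch (illustrative figures, not a design rule) makes the point: a generator sized only for the UPS covers the IT load, but the aircon needed to remove that load's heat draws extra power - roughly the IT watts divided by the cooling plant's coefficient of performance (COP).

      ```python
      def generator_kw_needed(it_kw, cop=3.0, headroom=1.25):
          """Generator capacity covering IT load plus cooling, with step-load headroom."""
          cooling_kw = it_kw / cop          # electrical power the aircon draws
          return (it_kw + cooling_kw) * headroom

      # 100 kW of IT load at COP 3 needs ~33 kW of cooling power,
      # so ~167 kW of generator capacity, not 100.
      ```

      The headroom factor is there for exactly the stalling problem above: a DG set needs spare capacity to pick up a stepped load.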

      1. Denarius Silver badge

        Re: What could possibly go wrong?

        Back in the day, Detroit Diesel two-strokes with Roots blowers could go from cold to full load in 0.3 seconds. Designed for emergency power systems and good at it. Downsides were an insatiable thirst, and lots of oil leaks allowing oil to be blown into radiators, making overheating a problem if located in deserts, as these were. Also very fussy about the type of oil: it had to be very low ash, which meant only one supplier on the planet, or they wore out even faster.

        Lastly, a short life, requiring new rings every 3 years or so. Some were used in fishing boats, where they were popular for a while.

      2. David Hicklin

        Re: What could possibly go wrong?

        "Diesel generators don't like starting under load, either. You bring them up to speed, then apply the load, ideally slowly (like a clutch on a manual gearbox in a car)."

        Yup, one of the major design considerations of a DG set is how big a stepped load it can cope with, which often needs the DG to be oversized.

  33. kmceject

    The Generator needs to start???

    We had a nice system on the roof that fed two different UPSes (one for comms and the other for compute) when there was a massive outage on the east coast. The lights in my office and my computer of course went dead. As practised, I grabbed my flashlight and headed to the data center, expecting the scattered generator-powered lights to come back in a few seconds as the generator powered up. The first shock was that the stairway was totally black! All the emergency light batteries had decided to crap out. Second, the generator didn't seem to kick in. I reached the room at the same time as the building supervisor/maintenance chief. He and I checked, and the UPSes were working.

    We then ran to the basement to check the transfer switch and it was in the tripped position so we should have been up.

    I went back to my office and grabbed a handful of lightsticks (the crack-and-glow things) while he headed to the roof six flights up. I ran to the fifth floor cracking and pitching the lightsticks into the corner on each landing. They didn't provide much light, but enough to hopefully prevent falls. I heard the generator start as I reached the roof, and we trooped back down again to verify all was well. Sure enough we were up and running, with an elapsed time of about 7 minutes out of our 20 minutes on battery.

    It turned out that when he had gotten to the roof he had checked all the basics - fuel, battery for the starter, etc - before manually starting the generator. Then he checked the wiring from the start switch: a high-tech, self-installed telephone quad that had been run up from the basement. The 28-gauge copper wire had corroded and snapped where it was attached!

    While we were explaining this to the CEO in the operations center, in walked one of the developers with a handful of lightsticks. He explained he had found them in the stairs and wondered who owned them! We had him go put them back, as the generator still wasn't charging the emergency lights.

    Suffice it to say a better wire was run and the new quarterly procedure was to pull the mains from the street power to allow a full failover test.

    (Another less fun day was the day we found out the UPS maintenance contract didn't have them checking the batteries or letting us know they were two years beyond recommended replacement!)

  34. kmceject

    UPS go BOOM!

    A side business my firm ran for a while was as a disaster recovery contractor: if a site went down, we provided a data center and a certain number of terminal spaces for a few small firms. I guess it was lucrative on paper, but then came the day when we got the call: one of our customers had a major disaster and needed our site. We dispatched a couple of people to assist with the pickup of the hard drives - this was in the days of superminicomputers, and they were using Argus drives, about 7U and 185 lbs each!

    At the site I worked with other staff to clear space and prep work. Soon the team returned pushing a pair of office chairs with these drives precariously balanced on them. The drives had been moved that way about ten blocks in lower Manhattan on a crowded summer day! Relatively quickly we had them mounted and wired to two computers from which we had offloaded our own clients (thank g_d for load balancers!)

    As we worked I asked what catastrophe had occurred. "The UPS exploded!" I was told. They couldn't really provide much detail beyond that at the time but the next day I found out the batteries had literally exploded in the data center. This had created a hazmat situation and it took a week for the cleanup before they could be moved back.

    In the meantime their users were shoehorned into every square inch we had on folding tables, including in the data center itself! Our site had expanded past the normal delineation of the 'computer room' and some of our servers were arranged in a 'U' shape at one end of the communications office. Someone had put a folding table into the 'U' with two people on either side! This produced another minor disaster when one of them managed to lean on the power switch for the live data feed machine. Where I sat, twenty feet away, we had a speaker wired to a simple circuit that triggered if the asynch feed stopped for more than 20 seconds. I immediately checked, found the main system unresponsive, and flipped to the backup, for an outage time of about 45 seconds. Still, some 2,000 customers were rather unhappy about that!

    We found the switch that had been flipped: a power switch on a disk drive, of all things, mounted low enough that it had never been a problem before. But the person sitting in front of it liked their chair all the way down, and the seat lined up exactly with the switch. For the day that user was relocated, and the next day custom metal covers were installed. Fun Times!

    They were with us a week and we never stopped sweating! I'm sure that experience was the reason why the next year the firm bought a building and built two data centers, one for us and one for the DR site!

  35. gnasher729 Silver badge

    A short one...

    Three software developers sitting in an office, happily typing away. Me one of them. In comes an electrician saying "I just turned the power off". Three developers: "We know".

  36. ricardian

    Back in the 1960s a large factory installed a fire-suppression system which consisted mainly of sprinklers and incorporated an innovation at the switchboard (a PMBX1A): if a fire was detected, it automatically called the local fire station with a voice message on a loop, "There is a fire at factory xxx. There is a fire at factory xxx" - the 999 system couldn't be used, as the technology to interact with the operator didn't exist.

    One night the large factory caught fire and the fire-suppression system did its best, but the factory burned to the ground before the fire brigade turned up. At the post mortem it transpired that the factory's system had worked as designed and telephoned the fire brigade with the repeating voice message. The fire brigade's telephone system responded with its own message: "The telephone number for this fire station has been changed to 1234578, please replace your receiver and dial the new number."

  37. Anonymous Coward
    Anonymous Coward

    Carl sums it up.

    https://www.youtube.com/watch?v=G-tCIRJH9p0

  38. Lost in Cyberspace

    Recent power cut - was smug at first

    We recently had a power cut. The computers stayed on, the router and Wi-Fi stayed on... but the internet still failed.

    There was no power to the nearby FTTC cabinet nor the 4G mast.

  39. Anonymous Coward
    Anonymous Coward

    Suspect generators aren't much use.

    One place where I worked had the UPSs and a generator, but there were apparently concerns about the possible quality of power from the generator and the fumes it generated. The generator was therefore never tested, and the plan was to conduct an orderly shutdown while on UPS power. IIRC - this was a while ago - the UPSs didn't last long enough, because more servers had been added since they were installed, and a less-than-orderly shutdown occurred during a power outage instead.

    1. Keith Oborn

      Re: Suspect generators aren't much use.

      Working in Cairo many years ago, the UPS proved to be about 100x less reliable than local mains - MTBF about 30 minutes.

      There was a genny, four floors down in a basement. Trouble was, it had no frequency control, and we had big disk drives with synchronous motors. The UPS did at least have a frequency meter. Cue lots of shouting down a crackly phone to the guy on the throttle: "a bit faster. more. no, slower----". Meanwhile the disk drives were moaning like out-of-work air raid sirens--.

  40. Sparkus

    A CIO

    you know the one....

    Who refused to invest in UPS protections because (his words) "electricity doesn't have an IP address"........

  41. Anonymous Coward
    Anonymous Coward

    A friend related this story to me:

    Customer service department of an "always on, multiple power supplies, inputs, and cords on all kit" computer maker received a phone call after an earthquake in a far-off place...

    CSRep: "Let's go over the troubleshooting steps to see what we can do to fix your server."

    Customer: "No thanks, I need someone to come out and put it back upright. It fell over, but is otherwise running just fine."

  42. Keith Oborn

    I had almost the exact situation--

    At a Virgin Media data centre -err- "near Reading".

    Gas people severed the two geographically-diverse power feeds to the business park. As is always the case, "geographically diverse" tends to fade as you look closer. In this case there was a single road into the site, going over a river bridge. And of course, one cable duct.

    Major Service Outage reports began, updating every 15 minutes:

    "UPS has assumed load. All equipment, including coffee machines but not aircon, is running" - if you think about it, this makes sense. The UPS would only do 15 minutes, so lack of aircon was not a big deal. Lack of coffee for stressed staff, however--.

    "Generator has started. All loads being supported"

    "We don't know how full the tank is"

    "We checked, it's full" (the key piece of test gear was a long stick and a rag).

    1. Anonymous Coward
      Anonymous Coward

      Re: I had almost the exact situation--

      Apologies for the Necropost. National Grid House had similar a few years ago when WPD sliced through the supplies to the HQ. Backups worked as advertised.

      And now NG has bought WPD... Strange times ahead.

  43. Alan(UK)

    Hospital Generator

    My uncle told me about how a generator was installed in the basement of a London hospital after the war. This generator was used to provide emergency power to the blowers on the hospital heating boilers. It was a difficult operation involving hiring a special crane. After it was done, my uncle went and found the stoker and asked him what he did previously when the power failed. The stoker replied that he just opened the furnace doors and changed over to natural draught!

    Bonus comment - I worked in a school which had a complicated lighting system installed after a major refurbishment. One feature was that it turned on the emergency lighting in the event of a power failure; the problem was that when the power was restored (which might be a few seconds later) the emergency lights were extinguished but the main lights did not come back on, because the system had no memory of which ones had originally been switched on. When we complained that being plunged into darkness without warning was dangerous, HR came up with a solution: we were each issued a keyring with a little LED 'torch' on it - the sort of thing that might have come out of a Christmas cracker!

  44. SamJ

    The backup generator worked fine when it was needed - until it overheated due to ....

    I worked for a national trucking company in the US. We had the largest network of System 38s (both on- and off-site) that IBM had sold to date. We had three System 38s in their own room; the rest of us used 8088s or dumb terminals connected to the 38s. One fine summer day it was VERY hot. All the air conditioners in the area were running full tilt - until the power went out throughout the office park. The generators powered on and the switchover was made seamlessly. We sat with emergency lighting, waiting for the power to kick back on. Unfortunately, after about 30 minutes the almost brand-new generator died. Why? The generator (on the roof) had been installed with its air intake drawing straight from the exhaust of TWA's emergency generator. Oh well... Can't think of everything. :{
