Finally, a wafer-thin server... Only a tiny little thin one. Oh all right. Just the one...

Monday is here once more, bringing with it the promise of a clean slate, a fresh week and a mailbox full of problems. Put all that to one side and take solace in another Reg reader's misadventures courtesy of Who, Me? Our tale takes place in 2001 and comes from "Jeff", for that is not and was not his name. Jeff was working …

  1. Anonymous Coward
    Anonymous Coward

    "I also survived not being provided with enough UPSes to cover the load,"

    Did he survive not calculating the required load first, before destroying a UPS by overloading it? Surely the first thing you do when asked to move a large number of systems is talk to the facilities group to make sure that adequate power & cooling is available. Before plugging things in.

    1. tip pc Silver badge

      “ Surely the first thing you do when asked to move a large number of systems is talk to the facilities group to make sure that adequate power & cooling is available. Before plugging things in.”

      We called that a site survey. Normally any company doing work for another will look for any opportunity to up the cost of the job, i.e. is there enough room in the rack, are there enough Ethernet ports and fibre, "this server series has a max draw of 500 W at 5 amps, you need a bigger UPS" - despite the server ordered having a max 200 W power supply and nothing inside it that could consume all 200 W. Seen it many times, been overruled many times, and watched projects (usually government departments) burn money they didn't need to and add chunks of time unnecessarily, often keeping the PM contractors on for months of extra time - but there is no link there.
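      A back-of-the-envelope power budget is the sort of thing a decent site survey should produce. A minimal sketch (all figures invented, not taken from any real quote):

          # Illustrative only: nameplate vs realistic draw against an assumed UPS rating.
          servers = [
              {"name": "srv-01", "nameplate_w": 500, "measured_w": 140},
              {"name": "srv-02", "nameplate_w": 500, "measured_w": 180},
              {"name": "srv-03", "nameplate_w": 200, "measured_w": 90},
          ]
          ups_va = 1000         # assumed UPS rating
          power_factor = 0.9    # assumed; usable watts are less than the VA figure

          nameplate_w = sum(s["nameplate_w"] for s in servers)
          measured_w = sum(s["measured_w"] for s in servers)
          usable_w = ups_va * power_factor

          print(f"Nameplate total: {nameplate_w} W, measured total: {measured_w} W")
          print(f"UPS usable capacity: {usable_w:.0f} W")
          if nameplate_w > usable_w and measured_w < usable_w:
              print("Quote says 'bigger UPS'; the measured figures say otherwise.")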

      1. logicalextreme

        Maybe to get your point across you could invite them to place a wad of cash on top of a UPS so that they can watch it literally burn when the camel's-back server is booted.

      2. Blackjack Silver badge

        Or you could be literal-minded and just move them and nothing else.

        When they complain that they're not plugged in, you answer that you're not qualified and they should hire a professional.

        Do stuff like this enough and you get either fired or kicked upstairs.

    2. Stoneshop
      Flame

      Surely the first thing you do when asked to move a large number of systems is talk to the facilities group to make sure that adequate power & cooling is available. Before plugging things in.

      At one of my contracts I happened to get 'floor manager' as one of my duties, as that position had been empty for a good while despite being rather urgently needed: a reorg would see about 80 racks worth of systems move into our computer room.

      That was more than just a bit ambitious, as even after Much Reshuffling of the kit present, and redoing several access aisles, there was room for a mere 55 racks. But with respect to power and cooling, all was supposed to be fine even after the last box had been moved in and powered up, as both power and cooling were at about 15% load initially and were calculated to go up to about 35% of maximum capacity. Apparently, someone had opted for Massive Overkill on the design specs and managed to get them signed off.

      And indeed, those post-move figures were close to correct, and even showed we had been somewhat zealous about rounding up. So, just when things had settled again after the move, there was a power failure and the no-break dutifully kicked in.

      Still no problem. Really.

      Until, ten or so minutes later, sensors in the UPS shed started calling out severely elevated temperatures. The day was saved by Site Facilities through the use of all the floor-standing fans they could find, pointed at the opened doors of the UPS shed.

      Lessons Learned: yes, the UPS shed's cooling system should also be running off the no-break power.

      1. Peter Gathercole Silver badge

        Even enterprise grade UPSs can have problems

        Late 1980s. Large telecoms development company. Mainframe data centre just outside a small Wiltshire town supplied by overhead power cables. Enterprise grade UPS with diesel backup generators.

        What we learned was that multiple power brownouts during a significant thunderstorm were sufficient to defeat this setup.

        The problem was that each time the power grid browned out, the UPS would kick in, switching to battery and temporarily turning off the air conditioning, which would have been switched back on once the generators started. But the power kept resuming before the generators started, so the UPS switched back to mains and, shortly afterwards, the aircon came back on.

        Until the next brown out a few minutes later, and the next, and the next. Over the course of about 2 hours, the batteries became depleted, as there was no time to recharge them after each brown out, and the temperature in the machine rooms began to rise as the air-con was off so much of the time.

        Eventually, even though the UPS should have been able to keep the whole DC running, it was decided to turn off the mainframe and the development and test environments, and halt work for the rest of the day.

        I'm not sure why, but a manual switch to generator had been overlooked in the design of this setup; one would have been able to keep the data centre running. What this taught me was that even professionally designed, very expensive UPSs are not a guarantee of continual operation.

      2. Mike007 Bronze badge

        I used to work in an underground secure control room. Basically a load of people in a room with all of the required servers and other equipment to keep the company running.

        The entire bunker was protected by UPSes and backup generators that got tested properly. Tests consisted of a supervisor pulling the main breaker and running on UPS for 10-15 minutes before switching to generators for an hour or so. This test was conducted every single week, with extended tests to empty the tanks every 6 months. The sort of testing regime most people here would love to be able to do.

        Then there was an actual power failure. The UPS and Generators worked perfectly and all systems continued running... Except for the air conditioning units, which not only regulated the temperature but also supplied the meat bags with their oxygen. Apparently they lasted about 10 minutes before the shift supervisor told everyone to activate their respective DR procedures and evacuate to the car park.

    3. Stevie

      Bah!

      "Surely the first thing you do when asked to move a large number of systems is talk to the facilities group to make sure that adequate power & cooling is available. Before plugging things in."

      Well where's the fun in *that*?

      Half the point of any move is everyone standing in the machine room with arms thrown in the air à la Calvin and Hobbes, shouting blue blazes about blame allocation.

  2. Paul 195
    Mushroom

    Shouldn't an overloaded power supply shut down safely, or at worst blow a fuse, rather than bursting into flames?

    1. davcefai

      Yes, but UPSs have always had their own rules.

      1. Jens Goerke

        Like shutting off when the remaining minutes countdown reaches 29.

        1. Hans Neeson-Bumpsadese Silver badge

          Like shutting off when the remaining minutes countdown reaches 29.

          In my experience, once it has beeped to indicate that it's now active a UPS generally shuts down in the time it takes to utter the first four words of the sentence "I'm glad I've got UPS"

          1. Steve Davies 3 Silver badge
            Coat

            I always wondered why...

            UPS vans were brown...

            Now I know it. Brown hides all the [redacted] in your pants when things go TITSUP.

            Mine's the one with a roll of soft tissue in the pocket, just in case things get a bit short.

          2. logicalextreme

            I've got stuff plugged into the surge-protection-only side of my home UPS because it arbitrarily shuts down the UPS side far, far more frequently than my power cuts out and it had become actively dangerous to my NAS.

            1. Claptrap314 Silver badge

              My UPS lasted more than a decade before the battery went out. I went to the same manufacturer for a full UPS. After two years, the "uninterruptible" side is dropping monthly. Anyone know a reputable manufacturer for home UPS?

              1. DS999 Silver badge

                I've always had good luck with APC. I've used the same 1500 watt APC UPS for about 15 years now. Had to replace the battery pack about 5 years ago when it started only having 5-10 minutes of runtime, that brought it back up to well over an hour.

                1. logicalextreme

                  Mine's an APC too actually, about ten years old now. I suspect from researching that the root cause of mine is that the battery needs replacing, but it's a very odd way for it to manifest (it does run for a good few minutes on a test). I'd be tempted to go with them again but I'll be asking them for reassurances that it won't do what this one did before I buy another one.

                  I mostly have pretty good luck with hardware though so it's fair for at least one thing to be borked on me.

                  1. TRT Silver badge

                    I used to use APC, but I've changed to Eaton because the APCs used to chew through expensive battery packs with frightening regularity.

                    1. logicalextreme

                      Cheers, I'll check them out when I get round to deciding I want one again.

                  2. DS999 Silver badge

                    Part of it may have something to do with how much you load it. I got a 1500 watt UPS to get more runtime and so I could hook up the monitor and actually do stuff during a power outage (outages are luckily pretty rare where I live, but it's the midwest US so huge thunderstorms come through all the time about 8 months of the year).

                    The total load of my PC+monitor+wireless+switch+modem is generally under 120 watts. People who have a gaming PC with a power hungry GPU and overclocked CPU probably put more stress on the UPS.

                    1. logicalextreme

                      From reading through a few forums, I think it may just be a duff unit or battery, but by the time I'd pinned it on the UPS it was out of warranty. It's 750W, and was only powering an old 4-bay Netgear NAS and the router (it was also powering a low-powered PC for a while but it seems to go clunk whatever's plugged into it).

                  3. hypnos

                    Mine's a very old APC also

                    Mine's an APC, about 18 years old, 800kVA, enough to still keep a modern PC and its monitor alive (thank god for LCDs) for several minutes and a comfortable orderly shutdown.

                    I've had to change batteries twice, at 8 and 16 years, 50 EUR a pop, but why change a good thing?

                    Takes bog standard house alarm batteries, can find them anywhere.

                    Only grudge is that APC dropped support for the serial interface very quickly (in the mid-late 2000s, I think it was with Win2K) so no supervision from the PC.

                    1. Stoneshop

                      Re: Mine's a very old APC also

                      Mine's an APC, about 18 years old, 800kVA, enough to still keep a modern PC and its monitor alive (thank god for LCDs) for several minutes and a comfortable orderly shutdown.

                      With an average PC and an LCD monitor an 800kVA UPS should be able to keep them powered for weeks, not mere minutes. Is it perhaps time to change the batteries again?

                      1. Anonymous Coward
                        Anonymous Coward

                        Re: Mine's a very old APC also

                        I suspect hypnos meant a 800VA UPS rather than 800kVA - one's an under-the-desk unit, and the other is a row of power panels!

          3. Stevie

            Bah!

            Least fave job: dismantling the chassis of a UPS so I could remove a dud battery that had grown a bunion on one side.

            Reminded me of all the times I went to work on my old TR6 blithely assuming that this time the Haynes manual would have all the required steps in it, and wouldn't require an extra day for all the things the double-barrel-named idiot who "co-authored" it forgot to mention.

            1. Tom 7

              Re: Bah!

              TBF I always imagined a Haynes for a TR6 would be a bit like a Haynes for a Death Star - very unlikely to have been tested on the real thing. And even if it had been tested on the real thing it would be the US version and not involve steering.

              1. Stevie

                Re: Bah!

                The problems were all from the methodology, and unfavorable comparisons to the first one written - the Mini. Paddy Hopkirk co-wrote that and *he* rewrote the book on how to work on Issigonis's little gem.

                The process on car #2 onwards was:

                Dismantle car into smallest sub components that make sense.

                Re-assemble car, writing down what you did as the assembly instructions.

                Write everything in reverse order and call that the disassembly instructions.

                And finally:

                Don't mention needing a mechanic's pit until the last possible moment.

                This is how, for example, it is possible to write instructions for removing the prop-shaft of the TR6 as "Remove the transmission tunnel cover, undo the four bolts securing the front flange to the gearbox, undo the four bolts securing the rear flange to the differential and lower the prop-shaft to the floor."

                A. Friend: "Have you lowered the prop-shaft to the floor?'

                Me: "Nope. I have lowered the prop-shaft to the chassis cross-member and the twin exhaust pipes."

                My choices then were remove the exhaust (BAD IDEA) or loosen the engine mounts, remove the transmission mount securing bolt, jack up the gearbox and swear the prop-shaft out of the car.

                To rub it in, said prop shaft was only about 18" long. It was very dispiriting how it resisted removal.

                That said, most jobs on the TR6 were super-easy if you had tools. I could swap out the axle UJs, all four of 'em, in a couple of hours. Up at 9, have tea, swap out UJs, clean up, in pub at lunchtime.

            2. ICL1900-G3

              Re: Bah!

              Ah TR6, forever a work in progress. I had one and *everything* that could go wrong with a car, did. Including a wheel coming off when I was living in France, diff exploding, catching fire... still wish I'd kept it.

              1. W4YBO

                Re: Bah!

                1978 Spitfire 1500. I couldn't tell whether the top was down or up in the rain. Broke my right foot on the doorsill while pushing it out of the road when the differential blew up.

                Currently trying to convince my wife that one would be a great project car.

              2. Stevie

                Re: still wish I'd kept it

                *nods*

                Sold mine 6 months before they became collectible.

                Story of my life.

          4. Anonymous Coward
            Anonymous Coward

            RE: UPS generally shuts down in the time it takes to utter the first four words

            Yours did good. I don't think mine lasted long enough to even beep the one time.

            Now I try to remember to test them more often to see if they need a new battery or the UPS needs replacing.

            That's right, I need to get a new battery for the one and to test the two cheap ones at home to see if I should just replace them.

            1. Peter2 Silver badge

              Re: RE: UPS generally shuts down in the time it takes to utter the first four words

              The problem is that if you plug dual-PSU servers into both the UPS and the mains, they draw ~50% power from each. So if you load the UPS up to "capacity" as shown on the front of the UPS, then as soon as the power goes you get double the demand on the UPS.

              These things of course come out in testing. Or in experience, if you don't bother to do the testing.
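              A rough sketch of that arithmetic (the figures are invented for illustration, not taken from any real kit):

                  # Illustrative sketch: dual-feed servers split their draw across both feeds,
                  # so the UPS only sees half the true load while the mains is up.
                  server_draw_w = [300, 300, 250, 250]   # assumed per-server draws
                  ups_rating_w = 700                     # assumed UPS rating

                  normal_load_w = sum(d / 2 for d in server_draw_w)   # ~50% from each feed
                  mains_lost_w = sum(server_draw_w)                   # UPS now carries it all

                  print(f"UPS load with mains present: {normal_load_w:.0f} W "
                        f"({normal_load_w / ups_rating_w:.0%} of rating)")
                  print(f"UPS load after mains fails:  {mains_lost_w:.0f} W "
                        f"({mains_lost_w / ups_rating_w:.0%} of rating)")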

              1. TRT Silver badge

                Re: RE: UPS generally shuts down in the time it takes to utter the first four words

                I take it you mean that in an n+1 scenario, if you put one power supply into the UPS and the other into the mains, you get double the load. You're not supposed to use the load indicator to gauge UPS capacity; it should only be done from the electrical specs. If you run both power supplies off different UPSes you won't get exactly the same runtime from both units, so eventually one or the other will flake first and you'll get double the load on the other. By then, though, you should have shut down your server! What I've not yet come across is how systems can deal with dual UPSes for remaining-time shutdown.
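                One conservative approach - a hypothetical sketch, not a feature of any particular UPS management package - is to poll both units and act on the pessimistic reading:

                    # Hypothetical sketch: with dual-fed servers, plan the shutdown around the
                    # weaker of the two UPSes rather than the healthier one.
                    SHUTDOWN_MARGIN_S = 300   # assumed: seconds a clean shutdown needs

                    def should_shut_down(runtime_a_s: float, runtime_b_s: float) -> bool:
                        # Both feeds matter, so the pessimistic reading governs.
                        return min(runtime_a_s, runtime_b_s) <= SHUTDOWN_MARGIN_S

                    # Example readings (made up): UPS A reports 25 min left, UPS B reports 4 min.
                    print(should_shut_down(25 * 60, 4 * 60))   # True -> start shutting down now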

        2. Jou (Mxyzptlk) Silver badge

          This is why I don't take the UPS "remaining time" or "Battery percentage left" into consideration.

          The rule is rather: If you are on UPS for two minutes, and you have no diesel kicking in after two minutes, start the shutdown sequence.

          Only if there is a very good plan in writing and the equipment in question is REALLY 24/7 do I go for other options.

      2. AlbertH

        Unique Rules

        Italian UPS machines have their own really idiosyncratic set of rules. They seldom work properly.

  3. Andytug

    Not blown one up, but it helps if they are live....

    Following a planned power off for electrical work, 06fartooearly on a Monday, I come in, fire up all the network kit, lights come on in the right order, everything goes green as it should, lovely. Go to the server cabinet, start them all up, all the usual whirry noises, fans, lights, etc. Fantastic. Then my ears pick up a regular beeping noise, one not normally heard. WTF?? While my half awake brain processes this, all the servers shut down and there is silence again. Mild panic ensues. Press power again, nothing. Beeping continues. Finally look at UPS at the bottom of the rack... oh-oh, red lights all round. No power. Phone electrician on call...

    After the inspection one of the RCDs hadn't been replaced correctly, so the entire rack had no power. All the servers fired up on UPS power, which then promptly shut them all down gracefully before it ran out of battery power.

    Lesson learned, check everything for red lights. Even the stuff right at the bottom...

    1. J.G.Harston Silver badge

      Re: Not blown one up, but it helps if they are live....

      "Even the stuff right at the bottom...…."

      My knees don't bend that far. Anybody putting stuff at floor level is declaring that it is superfluous to requirements.

  4. Anonymous Coward
    Anonymous Coward

    Do you recall the smell of burning UPSes in the morning?

    No, because I've also gone for ones with overload protection...

    1. Captain Scarlet Silver badge

      Re: Do you recall the smell of burning UPSes in the morning?

      Yes - because of an internal fault in one of our UPSes, it tripped the utility power and went to battery.

      We got alerts immediately via email that utility power was down. It was connected to a fully populated IBM Blade Centre, so I went to check, as it covered 1 of the 4 2 kW power supplies, and if it went down we would have no power redundancy on the left-hand power domain (which also powered internal fibre switches and management).

      I went and stabbed the button to put power back on as I couldn't see anything wrong.

      Sparks immediately came flying out and the 16 A C19 cable melted before it re-tripped.

      Never been so scared in my life (the circuit had a 50 A fuse).

    2. macjules

      Re: Do you recall the smell of burning UPSes in the morning?

      +1. Was wondering if someone might mention a simple thing such as overload protection.

      1. Cynic_999

        Re: Do you recall the smell of burning UPSes in the morning?

        Circuit breakers and fuses do not usually provide protection against a small but persistent overload. They must be rated to allow for brief overload situations (e.g. at startup). But if the overload lasts for a long enough time, components can get hotter and hotter until they burst into flames or create a fault (e.g. go high resistance) which results in a sudden massive power overload in the component without any current increase. You can provide thermal protection and/or slow-blow breakers or fuses, but there will usually be a particular overload level that will fail to trip such protection while creating a hazardous situation over time.
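        A toy model of that gap (every figure below is invented): a load modestly over the rating sits under both of the breaker's trip thresholds, yet the component it feeds keeps heating towards a dangerous temperature:

            # Toy model, made-up numbers: a 15% sustained overload never trips the
            # breaker, but the overloaded component still cooks over an hour.
            rated_a = 10.0
            instant_trip_a = 1.5 * rated_a        # assumed magnetic (instant) trip point
            sustained_trip_a = 1.25 * rated_a     # assumed thermal (sustained) trip point

            load_a = 11.5
            tripped = load_a >= instant_trip_a or load_a >= sustained_trip_a

            temp_c, ambient_c = 25.0, 25.0
            if not tripped:
                for _ in range(3600):             # one hour of sustained overload
                    temp_c += 0.0009 * load_a ** 2          # crude I^2-proportional heating
                    temp_c -= 0.001 * (temp_c - ambient_c)  # crude cooling towards ambient

            print(f"breaker tripped: {tripped}")                           # False
            print(f"component temperature after an hour: {temp_c:.0f} C")  # ~140 C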

    3. Tim99 Silver badge

      Re: Do you recall the smell of burning UPSes in the morning?

      Sod’s law: A previous post...

    4. rototype
      Mushroom

      Re: Do you recall the smell of burning UPSes in the morning?

      Yes, or more accurately an exploding one...

      This happened one time when they were extending the warehouse at the firm I worked for, and the JCB bucket found that the plans for where the water and power feeds came in were both wrong - and that the feeds were in the same trench the JCB was digging...

      Most stuff survived, but the UPS for the main system (since it was alone on its circuit) took the brunt of the surge and promptly sh*t itself to protect the main unit. Pretty much destroyed the power input stage of it, but fortunately we had 24/7, 4-hour cover so we had a replacement unit in place by lunch time...

      A side story to this was the fact that one of our crew was having his annual review at the time, so the other 2 of us were in the office when this occurred. First thing we noticed was the screens on the PCs were doing strange patterns, so I got up to check the main unit in the next room and almost as soon as I set foot in there the UPS exploded. Cue the Boss and our colleague who was being reviewed running in expecting to see me with a charred screwdriver in hand; sadly disappointed, they had to accept my explanation of what happened.

  5. Anonymous Coward
    Anonymous Coward

    I used to work for a CCTV distributor, as well as selling kit to installers we provided pre & post-sales tech support, which meant we had to have one of everything in our demo suite.

    We kept warning the powers that be that we were overloading the available circuits, but the PHBs kept ignoring us.

    That is until the advent of networked CCTV with a need for beefy dual Xeon based servers to process dozens of HD streams.

    We managed to blag a server and SAN off one of our vendors, made space in one of the racks, slid them into place, plugged the power supplies for server and SAN into the power distribution unit, turned it on and got POST. Then the SAN spun up and there was a very big bang and all the lights went out.

    Not just our lights: as we went outside to see where the acrid smoke was coming from, we saw the whole trading estate was out, and that smoke was coming from a now-deceased sub-station.

    Yep, we'd not just overloaded our circuits, that Dell MD1000 array was the straw that broke the sub-station's back.

    Electricity Northwest brought a mobile generator round a couple of hours later to get the estate back up & running, then they delivered a much bigger substation a couple of weeks later - which we managed to knock out the next year. That wasn't completely our fault, though: the company next door had a new machine delivered that sucked way more power than our entire building. No idea what the machine did, but the company only made printer ink and laser toner...

    1. Doctor Syntax Silver badge

      Elec-trickery isn't the only utility that can get knocked out by industrial kit. Some years back a dyer had a new gas-fired boiler installed and, until they got their own supply, they connected it to the domestic supply main. Then they fired it up for the first time in the middle of winter. The initial surge took the pressure down so low that pilot lights went out. It took a while for the gas engineers to work out why the pressure was so low in the mains.

      1. Claptrap314 Silver badge

        That's one of the scariest stories I've heard.

        How to blow up 1000 houses all at once...

        1. Stoneshop

          How to blow up 1000 houses all at once...

          Pilot lights are supposed to have a shutoff safety valve controlled by a bimetal strip or a sealed oil-filled container that expands when heated. If the pilot flame goes out, the safety cools, the valve closes and the gas supply is cut off. To light it again you have to press and hold a button that opens the valve manually, ignite the pilot and keep pressing the button until the safety has heated up again, keeping the valve open.

          1. Doctor Syntax Silver badge

            Re: How to blow up 1000 houses all at once...

            Or a thermocouple which seems to be the usual option on domestic properties.

          2. Extreme Aged Parent

            Re: How to blow up 1000 houses all at once...

            The operative word is 'supposed' - it doesn't always happen... Sod's Law

          3. Claptrap314 Silver badge

            Re: How to blow up 1000 houses all at once...

            Cool. Learned something. But, as others have noted... "supposed to". I've personally witnessed failures of that.

    2. Kubla Cant

      a new machine delivered that sucked way more power than our entire building, no idea what the machine did, but the company only made printer ink and laser toner

      They doubtless needed a very high-power machine to count their profits.

      1. logicalextreme

        Maybe banks should give printer and ink companies a taste of their own medicine and reject all fund deposits that aren't entirely made up of £1 notes from the Royal Bank of Scotland.

        Optionally freezing the account if they try anything different.

        1. Robert Carnegie Silver badge

          Only accepting Scottish £1s - A printing company might have their own innovative answer to that.

          I'm in Scotland and I haven't seen a £1 note for quite a while, are they making them still?

          1. logicalextreme

            Ah. Good point.

            Just looked and they were last regularly printed in 2001 but are still technically in circulation, so they've probably mostly either disintegrated or are in the hands of collectors.

    3. Anonymous Coward
      Anonymous Coward

      Blowing up substations

      When I was a kid, my mum was trying to print something on our dot matrix printer and couldn't work out why it wasn't working.

      I noticed the printer was offline; I pressed the "On Line" button, and at that instant the substation across the road exploded and the associated surge temporarily knocked out the power for a radius of about 5 miles (and our power for the rest of the day). My parents still blame me for it :-)

      1. logicalextreme

        Re: Blowing up substations

        It's always interesting hearing people's stories about how they first got into IT.

  6. Mast1

    Smoke-induced aphasia......

    Many years ago I was debugging a very expensive bit of electronic kit with a few design issues on one particular board, e.g. an under-specced ceramic wirewound resistor that gave an instant blister burn when lightly touched, and could melt its solder attachment to the board. All hinting that a hefty power supply was lurking behind the scenes. Cue a more senior work colleague rushing through a revision and, with a flourish, hitting the "on" button. My slow brain was registering that such haste was possibly not appropriate. The lights appeared to come up, but a few seconds later the magic smoke fairy emerged from a transistor can: actually probably more than one fairy. Although I was the first to observe it, the words of alarm froze in the brain. The only "f"-word was "aphasia".

  7. Anonymous Coward
    Anonymous Coward

    Almost IT...

    Back in my student days (in nineteen-hundred-and-frozen-to-death) I earned part-time cash doing TV repairs. The workshop I was normally in was quite basic but, as long as you remembered not to touch any metal without first checking, it all worked well (and no accidents). I was loaned out to another shop for a few days, as their regular repair tech was on holiday. All went well until, a couple of days in, I needed to check if the line driver was working on a TV. This was back in the days of valves, and the usual check for the line-drive was to draw a spark from the cap of the HT valve using an insulated screwdriver - a 1/4" spark told you all was well. Unfortunately, this workshop had all its kit running off an isolating transformer and ELCB. The small current being drawn for the spark test (which passed through me to earth, totally pain and harm free) was enough to trip the circuit and plunge the whole shop into darkness. Apparently, they had a separate circuit that bypassed the isolated system when needed, but nobody thought to let me know.

    Fortunately, nothing permanent and all back running ten minutes later. Their normal tech, it turned out, didn't know about the quick line-drive test and rigged up an AVO each time he needed to check; effective but we only did that if we needed to know the actual voltage for further diagnostics.

    1. MrNigel

      Re: Almost IT...

      ....reminds me of 1978 when I worked for British Relay Special Services Division. As an apprentice part of my initiation was to test if a 100 volt line circuit was live using my tongue. Oh, how the engineers laughed at me.

  8. Pascal Monett Silver badge

    "We ran out of available outlets."

    That is a sure sign of insufficient planning.

    I'm not an admin, but I'm pretty sure that these days, when you're planning to install/move several servers, the checklist includes enough power for each and UPS for all and you don't go through with the install/move until you have those boxes checked.

    On the other hand, as always, things like that have had to happen in order for today's admins to have that complete checklist to sign off on. Because there are very few humans who are capable of thinking everything through and envisioning all possible issues. We learn to plan because we've hit a snag, or witnessed a UPS go bang.

    1. Anonymous Custard
      Headmaster

      Re: "We ran out of available outlets."

      As the saying goes:

      "A wise man learns from his mistakes, but a truly wise man also learns from other peoples..."

    2. Doctor Syntax Silver badge

      Re: "We ran out of available outlets."

      "That is a sure sign of insufficient planning."

      What do you mean, insufficient? It was perfectly sufficient when it was done 10 years ago.

      1. Nunyabiznes

        Re: "We ran out of available outlets."

        It probably was sufficient until the Facilities and/or beancounters got involved and cut every other outlet out of the design.

        1. logicalextreme

          Re: "We ran out of available outlets."

          Well sure; they've been to Tesco, they know how much extension leads cost.

    3. Antron Argaiv Silver badge

      Re: "We ran out of available outlets."

      Oh, I wish I could tell you about the project I'm currently working on.

      1. Anonymous Custard

        Re: "We ran out of available outlets."

        Oh go on, we won't tell anyone. And they'll even give you a shiny new name too.

        You know you wanna...!

  9. Anonymous Coward
    Anonymous Coward

    Been there, done that...

    Got the soot stained t-shirt...

    In this instance it was a series of 1000 W output UPSes, with each server connected to two through redundant PSUs running in parallel rather than one hot, one cold... Apart from that one new server... which didn't have the 420 W PSUs like the rest but a shiny pair of platinum-rated 750 W PSUs.

    It all ran fine until a power outage whilst I was in the server room moving other kit around. Lights go out, aircon dies, UPS goes bang with smoke, fire and flame (mercifully brief but enough for it not to be as dark as it should have been).

    Cue panic to shut down various other servers before the other UPS died from its rapidly depleting power reserve (when you can see the LEDs dropping before your very eyes you know you have issues).

    Small mercy that it didn't take the DB server with it, just the staff proxy server so no great loss.

    1. John Brown (no body) Silver badge

      Re: Been there, done that...

      "UPS goes bang with smoke, fire and flame (mercifully brief but enough for it not to be as dark as it should have been)."

      Oooh, a UPS with built in emergency lighting?

      1. Anonymous Coward
        Anonymous Coward

        Re: Been there, done that...

        If the light is lit, it's definitely an emergency!

    2. logicalextreme
      Holmes

      Re: Been there, done that...

      Pfffft. Parallel? One hot, one cold?! Amateur hour.

      At my last place the mains supply to the building was cut off. This was the first of quite a few loud clunks over the coming weeks that ended up needing a hefty sort-out by the electricity company. The backup generator outside was, of course, utterly devoid of fuel and had been since we'd moved into the building two years previous. All of the on-site servers and switches, it transpired later, had also gone down.

      After the second time this happened I asked a similarly-disillusioned bod from infrastructure if they reckoned manglement were going to splash out for a UPS fairly soonish, to which they said with a grin, "Oh don't worry, I just found out that we've got two of them, and they worked both times. For twenty seconds. Looks like when we moved offices they carefully balanced everything across them, then plugged one of them into the mains and the other one into the first one".

      Series, mate. That's where it's at.

  10. Ross Luker

    PSU fun

    My favourite ever support call to Dell: "The PSU on this server's failed, can you send a replacement?"

    "Sure, we just need to run some diagnostics and collect some logs...."

    "Don't need that, just send a new PSU"

    "Well how can you be sure it's the PSU without collecting the logs?"

    "Cos when I connect a power cable blue sparks and smoke come out of it"

    "....OK, I'll just arrange a replacement now...."

    1. Saruman the White Silver badge

      Re: PSU fun

      At the start of this century I was a part of a team developing a highly specialised network simulator for a customer. The network that the simulator supported also included RF (SATCOM) components, and the simulator was designed to calculate the intermodulation products that all of the various RF carriers generated. The simulator was actually quite a beast; written in Java with a lot of work put in to optimise it, but with some settings you could more-or-less see the JVM (and the PC it was running on) scream.

      The customer originally was planning on installing the simulator on a desktop machine running a mid-range single-core Intel processor. However, when they saw the simulator running in the lab they decided to upgrade the target platform to use a dual-processor SMP motherboard with higher-specification Intel processors. This was 100% a customer decision; we did not recommend the upgrade, or (to be honest) give it any thought at all - our responsibility stopped when we delivered a working simulator.

      Eventually the day of the initial delivery dawned, and I went to the customer site, installed the simulator, and ran the delivery checks that verified that everything was OK. Customer sign-off of the delivery was obtained, and everyone was happy at that point.

      About a week later we got an e-mail from the customer saying that the simulator was causing their desktop machine to blow up! Off I went to the customer site, to be greeted by a very irate representative who proceeded to demonstrate how, when they ran a test scenario on the simulator, the desktop machine it was running on would shut itself down with extreme prejudice after about 20 seconds. I checked the simulator software that we had delivered - no problems. I then asked, casually, whether they had upgraded the desktop machine's PSU when they upgraded the processors - I got about 30 seconds of silence before a muttered "Oh shit ..." was heard.

      We never had any further problems with the delivered system.

    2. John Brown (no body) Silver badge

      Re: PSU fun

      "The PSU on this server's failed, can you send a replacement?"

      The Dell rep did exactly the correct thing based on the information given. Hell desk people have to deal with the full gamut of callers, from total numpties to people who actually know what they are talking about. S/he did the right thing in not assuming you were correct without garnering further information from you.

      1. Stoneshop

        The Dell rep did exactly the correct thing based on the information given,

        One of the smoothest service calls I've had to make, regarding a deeply problematic intermittent network error:

        * Call service hotline, get contract status verified, get connected to 2nd line tech for problem analysis.

        * Tech notes down my problem description, preliminary analysis, corrective actions taken and current problem status.

        * Tech verifies several actions that should have solved the problem, or at least narrowed down the source, as having been taken.

        * Tech moves problem up to 3rd line support.

        * Shortly after, I get a call back from 3rd line support: an honest-to-goodness greybeard and ex-colleague. "So you've already eliminated $this, $that, and $thatotherthing as the problem source, and I can see from the logs that you've done so correctly. Means we're into Really Interesting territory. I'll be there in an hour and a half."

        1. MrBanana

          Re: The Dell rep did exactly the correct thing based on the information given,

          This is the way it should work. But it is very rare for things to move along like this, or in a timescale that isn't glacial, or working with a diagnostic data transfer system that just doesn't work, or an inability for lower levels of support to know that you have a great deal more experience and knowledge, or... all the other painful things.

          It's a nice story but, for 90% of support calls, it is just a story, not reality.

  11. Sequin

    I worked for a large government department and our team were the first to develop PC-based solutions, the vast majority of the department's work being done on ICL mainframes. We bought our own development server (Novell NetWare), wired up our own network, and got ourselves a UPS to protect the server. We used to test the UPS every 3 months, doing a controlled shutdown, and everything was hunky-dory.

    About 18 months later we moved to a new building and our server was moved into a new server room and plugged into the new network. We got it up and running very quickly and things were going well. Until, that is, we decided to run our regular UPS test. We switched off the mains supply, then watched as the software on the server detected the power outage and shut everything down gracefully. At this point screams came from another team that their system had gone down.

    When putting their server into the server room, they had spotted a spare outlet on our UPS and decided that it would be a good idea to hook into it, but they had not made us aware of this. Doh!

    1. Steve Davies 3 Silver badge
      Mushroom

      re: their system had gone down

      Ain't the laws of unintended consequences wonderful!

      Bet they didn't do that again. The question that remains is...

      Did this other team get their own UPS, or did they freeload off another one? I've seen that happen before.

  12. This post has been deleted by its author

    1. John Brown (no body) Silver badge

      Re: "cartoons depicting lazy American workers"

      Well, to be fair, it's racism. Just not the sort of racism that makes it into US and western news feeds :-)

      1. Anonymous Coward
        Anonymous Coward

        Re: "cartoons depicting lazy American workers"

        "American" is a race now?

        1. Anonymous Coward
          Anonymous Coward

          Re: "cartoons depicting lazy American workers"

          Politically, it's a race to the bottom

        2. John Brown (no body) Silver badge

          Re: "cartoons depicting lazy American workers"

          Apparently so, to the rabid PC brigade. Racism is any kind of stereotyped insult against anyone who is not "us", where "us" is the groups being offended on behalf of "them".

        3. Robert Carnegie Silver badge

          Re: "cartoons depicting lazy American workers"

          Nationality isn't a race in UK law (someone claimed anti English discrimination in Scotland and got hee-haw for it), but "American" isn't really a nationality - it is two point one continents after all.

          "Race" isn't really defined except as something to discriminate on, unless you're discriminating on something else.

          The theme song to "American Dad" refers to "the American race" as if there is one (and only one that matters), but this isn't a documentary.

          1. Anonymous Coward
            Anonymous Coward

            Re: "cartoons depicting lazy American workers"

            I have turned down several interviews for positions in Scotland precisely because of anti-English racism there. My (now sadly late) wife had 7 children from a previous marriage and spent 25 years in Scotland. The kids were regularly beaten up at school for being English.

    2. Anonymous Coward
      Anonymous Coward

      Re: "cartoons depicting lazy American workers"

      When I was working for what remains of the British Aerospace industry, we had a visit from a couple of gentlemen from across the pond. Apparently one of their other projects had included working with a company from the land of the rising sun, and some of the visiting employees had been heard to repeatedly make nasty comments comparing the lazy and incompetent Americans to the diligent and hard-working Japanese workers.

      Bearing in mind these were less enlightened times and the Japanese management's refusal to stop the comments (did the American workers arrive an hour or more before their shift started? Did they show "proper respect" to their bosses and the Company?), some of the factory floor workers made some "motivational posters" of their own - the words "Built by lazy and incompetent Americans, Tested by diligent and hard-working Japanese" - with a picture of the Enola Gay...

      1. Admiral Grace Hopper

        Re: "cartoons depicting lazy American workers"

        It was a few months after we were sold to a Big American Corporation that I saw a picture of Pearl Harbour used as wallpaper on a colleague's PC. When asked why, I was told that it reminded him of his current project, which largely consisted of Yanks running round putting out fires and scooping up the dead and injured having actively ignored all warnings for several months.

  13. AndyMTB

    Zombie Hand

    Pre-dating the days of IT (well, OK, there were a few BBC 1st gens knocking around) when I were just a lad earning some extra pocket money working weekends and holidays at the local car-wash (yes, it was called "Jeeves"). Me and my mate were considered trusty enough to open up, cash up and even drop the takings off at the bank when the manager was away. The start of the day involved powering up the whole plant - separate relays for brushes, rollers, heaters etc. - and there was a certain order these switches had to be turned on/plugged in. Bit of a faff running back and forth, so Pete and I had worked out a system where he did one end and me the other, suitably timed so that the power-up sequence was maintained via synchronized yells. Worked really quickly and efficiently, until the day we managed to get out of step. I plugged in the last connector but unfortunately it was already live at this point, resulting in a big, black smoke-filled bang. Needless to say I jumped back a socially-distanced 6 feet from the billowing socket, but with a charred and completely crusty-black hand.

    "This is going to hurt soon" thinks I, as Pete comes scampering along to investigate the loud noise. He goes a bit weak-kneed and has to sit down when he spies my hand, which surprisingly was still attached to my arm and hadn't started to register on the pain threshold yet. I tried to flex a finger, and the skin cracked along the joint. I say "skin", but it became obvious that my hand wasn't completely barbecued, it was merely covered in a thick layer of carbon that had spouted out of the electrical socket. A few dabs with another finger to confirm this theory, then a swish under the tap confirmed that everything was indeed still in working order.

    And the surprising thing was, after we'd re-set the master relays and re-powered up it all worked, even the exploding plug. Lessons learnt - until Pete and I had a go at fixing the VI form vending machine, but that's another story.

  14. My other car WAS an IAV Stryker

    Partial blackout story (not UPS, though)

    Just a week ago Sunday, while driving from home to church to play drums -- a distance of only 3 miles -- the missus called and said there were power issues at home just after I left but after a few flicks up & down it stabilized.

    But the church building was a different story. I walk in to the sanctuary where only half the lights are working and all the outlets on the stage-left side* are dead. This included the remote audio I/O unit that connects to the main PA board by Ethernet (Allen & Heath system), the PoE that runs two A&H personal-monitor units (one is mine), the Roland digital drums I play (but the church owns them), the Hammond organ, and all the musician's powered monitor speakers, plus the main monitors at stage front.

    And on top of that many of the HVAC units were dead, so the environment was not horrible but temperature was slowly climbing; the building's internet was also gone so the Facebook livestream was in question.

    This being the first Sunday we were trying two services to keep attendance down and social distance up, this was NOT a good omen overall. (The livestream was going to use the second service.)

    Lighting aside, the main A/V booth survived -- graphics computer, audio, and main speaker amps -- as did the large flat-screens (two for the congregants and one in the back for the praise team). We lost organ, drums, and audio interface to the separate digital keyboard, but still had piano (direct-wired mic), guitar (wireless pickup to the main board), horns, and my personal pair of bongos and were a little more "unplugged" for the 9:00.

    Seeing as the outlets actually on stage were working, which are only occasionally used for LED PAR wall-wash lights, someone grabbed some extension cords and we got the drums, monitors and audio/PoE units working again just before the second service. They still did the livestream, probably by turning on tethering on someone's phone; I heard from the missus that the feed wasn't as stable.

    Here's what really happened: the building has a 3-phase utility connection at the main road; different parts of the building tap into different phases. The power flicks** took out 1 phase and blew the fuse on the roadside utility pole. Thankfully we have a congregant sparky (electrician) who quickly diagnosed it and helped turn off any 3-phase loads (like the HVAC) quickly. (I think they tried some cross-over switches to restore the dead circuits by tying them to a working phase, but that kept tripping also.) In the end, nothing in the building was harmed and it was all-systems-normal when I was there Thursday evening for practice.

    * Yes, I worked it to say "left side" as an allusion to Marvin and you found it. Pat yourself on the back, you nerd.

    ** I was also watching our utility's outage reports via their own app. I think there was an issue with a substation very close to home and it took out some nearby neighborhoods completely. Wouldn't have been the first time. I blame the squirrels.

    1. Cynic_999

      Re: Partial blackout story (not UPS, though)

      Act of God

    2. Anonymous IV
      Mushroom

      Re: Partial blackout story (not UPS, though)

      > The power flicks** took out 1 phase and blew the fuse on the roadside utility pole.

      Aha! A phase worse than death...

    3. H in The Hague

      Re: Partial blackout story (not UPS, though)

      "The power flicks** took out 1 phase ..."

      In my, fortunately limited experience, when 1 phase goes it is best to expect the other 2 to follow soon. To fix a distribution board or substation the sparky attending to it usually has to power the whole thing down.

      1. Stoneshop

        Re: Partial blackout story (not UPS, though)

        In my, fortunately limited experience, when 1 phase goes it is best to expect the other 2 to follow soon. To fix a distribution board or substation the sparky attending to it usually has to power the whole thing down.

        About two years ago I was sitting at the workbench in my study/room/den when I saw the lights briefly dip and heard a short *Whonk* of the UPS kicking in, then switching off again. Turning around I saw that one of the ceiling lights was off[0], and that part of my workbench had lost power, but the network cabinet was still up and the UPS was on passthrough. Opening the circuit breaker cabinet I saw the power monitor display just one phase; it was clear that whatever had happened had taken out two phases, and it was equally clear that it was some external event, as otherwise there would have been audible and olfactory indications emanating from that breaker cabinet. A quick probe confirmed that two of the three phases in the incoming feed carried only a very feeble voltage, not the normal 230V AC. Calling the energy supplier confirmed that a) it was indeed external and b) they already knew the culprit: a JCB on a building plot 200m away.

        A short while later two sparkies turned up at the substation[1], notified me that they'd have to cut all the power while checking out the transformer and other stuff, which they expected to take two hours. It actually took three and a half, because the cable the JCB had hit was rather prehistoric and not shown on their drawings.

        [0] Of course every room is fed from two groups on separate phases.

        [1] right next door.

        1. Pascal Monett Silver badge

          Re: not shown on their drawings

          I'm guessing the amount of stuff not shown on drawings is properly terrifying when you think about it.

        2. Anonymous Coward
          Anonymous Coward

          Re: Partial blackout story (not UPS, though)

          "the cable the JCB had hit was rather prehistoric and not shown on their drawings"

          Thereby absolving both the JCB driver and his/her employer of all blame

          Phew! Sighs of relief all round

          1. Anonymous Coward
            Anonymous Coward

            Re: Partial blackout story (not UPS, though)

            Someone still got a right royal for that one - a lot of the cost of preparing a site for works is carrying out utility surveys. You start with the ones that are shown on the drawings, and work out where they actually are compared to where the drawings say they are. Then, you go hunting. The JCB driver wasn't to blame, but whoever signed off that excavation is in a steaming pile.

            1. Caver_Dave Silver badge
              Flame

              Re: Not on the plans

              My brother runs a team that fix gas mains leaks. He says that they often find large electrical cables where there are none on the plans - and some do not show up on the sensors! Working on the leaks they are in full fire suits, breathing apparatus and using hand tools only, so they tend not to damage cables.

              He tells a great story of when a back-hoe driver was trying to dig under a 6' high pressure gas main (enough for about 5,000 homes). Obviously, he shouldn't have been digging there, and he lifted the main slightly, cracking the top surface. The pressure was so high that sand bags thrown onto the leak were flying 70-80' in the air. In the end my brother got almost everyone from the building site to surround the leak and all of them (>20) to throw sand bags at the same time! Sand bags were tied to the bucket of the back-hoe and this was then placed on top of the convulsing pile to seal the leak enough to allow repairs to start. (Emptying the pipe would take days and half of the large town would have been without gas in the middle of winter, so it was worked on 'live'.)

              I'm glad I'm an office boy!

              Icon: as my brother sees this far too often.

            2. Stoneshop

              Re: Partial blackout story (not UPS, though)

              Someone still got a right royal for that one - a lot of the cost of preparing a site for works is carrying out utility surveys.

              Airborne pigs around you much?

              This was removing an ex-farm shed from a plot where two (private) houses were going to be built. They'd already found and disconnected the feed that had been in use until then, which came from the farm building's meter box; the cable that got hit was a much older one, clearly way older than the shed, that ended, well, somewhere under the shed floor roughly in the front 1/3rd, branching directly from the substation cable under the access road (otherwise it'd only have taken out the main fuses in the farm building). So apparently there had been another farmhouse there, but only our 75-year-old neighbour vaguely remembered it; no-one else did.

          2. keith_w

            Re: Partial blackout story (not UPS, though)

            Someone expects drawings to be accurate? Hereabouts, when digging is expected, someone comes around with a detector and spray paints the ground showing where wire and pipes are located.

      2. swm

        Re: Partial blackout story (not UPS, though)

        I was sitting at home when it felt like a really heavy truck was passing and the lights dimmed a couple of times. I thought a truck had run into an electrical pole. Turns out a few railroad cars had dropped off an elevated track a few blocks away, taking out power wires, crushing parked cars etc., and narrowly missing a car driving under a railroad bridge. Fortunately, no one was injured or killed.

      3. Jakester

        Re: Partial blackout story (not UPS, though)

        This reminds me of when I was stationed in Korea back in 1970. I was in downtown Kimpo (near Seoul) at about sundown. There was a loud bang and flash on a power pole about 100 feet from me. I looked around and was surprised to see none of the Koreans reacted to the event. Seconds later, a Korean worker comes running around the corner, quickly climbs to the top of the pole, puts in a new fuse, quickly climbs back down and runs around the corner out of sight. The entire event was less than a minute. I can only assume this person was paid for uptime of the section for which he was responsible. Several years ago near my home (in the U.S.), I heard a fuse blow on an early Friday evening. I went to the pole where the fuse blew, jotted down the pole number and called the local electrical company to let them know that a fuse had blown. The "customer service" person said that they had no reports of any outages, ignoring my report. The fuse was only replaced the following Monday, apparently prompted by an industrial customer serviced by that line. The blown fuse was for one phase of a 3-phase line. Not as responsive as the Korean lineman.

        1. Anonymous Coward
          Anonymous Coward

          Re: Partial blackout story (not UPS, though)

          That's an odd way to handle the report. I once heard that the power company call center priority list goes something like: "my lights are out" = lowest priority (not informative, could be a customer problem), "my lights are out and my neighbors are out as well" = higher priority (likely to be a power company problem, larger in scope than just a line down to a single house), "My lights are out and I saw the transformer in the alleyway go kaboom" = highest priority (confirmed outage, and we have a description of where the fault is, instead of having to drive around looking for something burning).

          OTOH: there's a tiny possibility that if the industrial customer was large enough, the fuse was their property and their problem. Hopefully nothing big was running on two phases all weekend...

    4. Arachnoid
      Black Helicopters

      Phasers on stun

      The engines cannae take it Captain

  15. Anonymous Coward
    Anonymous Coward

    Many moons ago I noticed a strong smell of burnt cabbage and remarked that it must be nearly lunchtime, as the canteen was on the floor above and the kitchen sat just above where the smell was strongest.

    Turned out to be a selenium rectifier burning out in an elderly 50 V telecom PSU rather than the dish of the day.

  16. Cederic Silver badge

    no bills?

    So who's he leeching 'net access from?

    1. Anonymous Coward
      Anonymous Coward

      Re: no bills?

      Maybe he's like an acquaintance of mine who lives sans Internet - if he needs to do anything online he visits the local library and uses the Internet PC there.

  17. Anonymous Coward
    Anonymous Coward

    If only it had been cartoons depicting lazy American workers

    I spent some time in S Korea at a well-known 'Big Company' in the '90s, and as a vendor I was the only one in the department I was working with allowed access to the internet, with strict instructions to be careful. Somehow the staff got my password to the desktop machine I had been assigned, and when I got back in one Monday morning it was all too clear what they had been using my account for: a screensaver of a naked lass and several Mozillas pointing at very NSFW web sites. I did wonder why there was no one at their desks - then came the howls of laughter. Oh, what jolly japes, chaps, and all that.

    1. Anonymous Coward
      Anonymous Coward

      Re: If only it had been cartoons depicting lazy American workers

      That seems uncharacteristic for South Koreans in my experience. I've found them to be very serious and studious in a work environment, even if they're the life and soul of the party otherwise. My first impression inside a South Korean factory was that nobody looked very happy in their work, but that impression was wrong... it was just that the time and place for laughing and joking was the cafeteria at break times - on the clock and on the factory floor, it was time to be serious.

    2. Pascal Monett Silver badge

      "the staff got my password to the desktop machine I had been assigned"

      The first thing I do when I get to a newly assigned machine is change my Windows password.

      Because I have 25 years of experience in jackasses thinking it's funny to use my account to do stuff they wouldn't dare do on theirs.

      1. Anonymous Coward
        Anonymous Coward

        Re: "the staff got my password to the desktop machine I had been assigned"

        Ah, such joys - just look out for that extra USB extension lead plugged in and running to another desk with a hub/keyboard/mouse on it. It took him six weeks to find it (and three changes of mouse). We were all fully aware of what happens when you leave your machine unlocked, so we had to find our fun elsewhere.

  18. pmb00cs

    Not a UPS, but quite a loud BANG

    Working at a data centre some years ago as a remote hands-and-eyes jobby, one of the clients was redesigning their network, and one of their big Cisco switches had a power supply trip which also tripped the circuit breaker. The switch was dual-fed, so the other power supply kept things running.

    The facilities team were called about the tripped 32 amp single-phase circuit and asked to turn it back on. Oddly, they rather insisted that something bad must have happened and that they wanted the tripped power supply replaced before turning the breaker back on. The client's Cisco-certified engineer (CCIE I believe, but may have been CCNP) insisted that this Cisco equipment was top of the line and could not be the cause of the issue. There was some management back and forth about who was responsible and how it should be fixed. After many hours of arguments above my pay grade, the facilities team tested that the circuit was wired up correctly and turned the breaker back on. Then we all - facilities, management, and us - went to the data hall to watch the client's engineer turn the power supply back on. 32 amps at 240 volts makes a very loud bang at a dead short.

    The replacement power supply arrived within a day or two, and the now very nervous engineer watched as we replaced the power supply for him; under the watchful eye of us, management and facilities, he very gingerly turned the power supply on again. There was less drama this time, although the client did enjoy the bill for wasting facilities' time, and for the increased risk they had put the site's power distribution under by not following the previously agreed process for dealing with tripped circuits under their contract - but the precise details of that were also above my pay grade.

    1. Anonymous Coward
      Anonymous Coward

      Re: Not a UPS, but quite a loud BANG

      The manufacturer's engineers commissioned a UK customer's new mainframe. UK production shortages meant it was long overdue - and so a USA model had been imported. The engineers had jury-rigged its 220/110vac auto-transformer for initial commissioning. They then left the customer's electricians to incorporate the transformer into their building's infrastructure overnight.

      Next morning the mainframe was switched on - unfortunately the transformer was now wired the wrong way round.


  19. Anonymous Coward
    Anonymous Coward

    Sooo

    1) The time a small overflow pipe blocked and flooded the battery room for the UPS. Ruined all the batteries. 6 months to get replacements (German manufacture I think, not sure why there was a delay but those were interesting days). The building management system beeped when on direct mains. Constantly.

    2) The time the power tripped. UPS took over, but the generator didn't start. UPS drained *real* quick. Soon we're in total darkness. Generator room was the other side of a completely dark computer room. Behind security doors, that just so happened to lock on a power failure. To top it off, the generator start button was broken and required someone jabbing in a screwdriver just to start the damned thing. Once we got everything back up, the building next door were bitching like crazy because the exhaust stack was level with their top floor window, and guess which way the wind was blowing...

    3) Power failure in an equipment room. No problem - systems are all on UPS. Apart from the rack where someone had installed a UPS but never bothered to plug anything into it. All the PDUs in the rack were running off the mains. All the systems were Oracle Financial systems, so there were a few unhappy people about.

    My other favourite is aircon. Twice I've experienced aircon units failing and filling up the floor void with water, enough to short out the commando sockets under the floor. That makes quite a bang!

    1. Stoneshop

      UPS drained *real* quick.

      An existing DC was getting more and more crucial, so it was decided to put a no-break in: 80 to 100 kVA by my estimate, going by the stuff humming inside.

      A shed was built, a genset and a UPS with a huge battery bank were installed and wired up. And after some dry runs the Real Test is planned: they'll just whack the Big Red Breaker on the incoming feed. Fair enough, and I don't see many options to perform that test in a meaningful but different way.

      So, with a little bit of trepidation, the head of facilities and the DC manager flip that switch and yes, the UPS takes over without a hitch. Fifteen or so seconds later the diesel starts up: good, good. But then the fun starts. Note that this is 1986, and power electronics that can just twiddle 100 kVA at 50 Hz to sync with a diesel generator aren't quite there yet, so this setup has a UPS with a fixed output and needs the diesel to sync before the load can be switched over without fireworks and explosions.

      So the diesel's controller starts tweaking the revs, but the damn thing fails to lock sync with the UPS. And of course the UPS batteries are meant to bridge maybe five minutes, during which the diesel would surely have been able to sync, but in this case didn't. And yes, the whole DC went down.
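
      For anyone wondering what "lock sync" involves: the genset controller is chasing a frequency, phase and voltage match within small tolerances before the transfer switch is allowed to close. Here's a minimal sketch of that check in Python - the tolerance figures are assumptions for illustration, not the values used in the 1986 installation:

```python
# Minimal sketch of the sync check a genset controller performs before
# transferring load from a fixed-frequency UPS output to the diesel.
# All tolerances here are illustrative assumptions.
FREQ_TOL_HZ = 0.2      # assumed maximum slip before transfer is permitted
PHASE_TOL_DEG = 10.0   # assumed maximum phase angle between the two sources
VOLT_TOL_PCT = 5.0     # assumed maximum voltage mismatch

def in_sync(gen_hz, ups_hz, gen_phase_deg, ups_phase_deg, gen_v, ups_v):
    """True only when frequency, phase and voltage all match closely enough
    that closing the transfer contactor won't produce fireworks."""
    slip = abs(gen_hz - ups_hz)
    # fold the phase difference into the range -180..+180 degrees
    dphi = (gen_phase_deg - ups_phase_deg + 180.0) % 360.0 - 180.0
    dv_pct = abs(gen_v - ups_v) / ups_v * 100.0
    return slip <= FREQ_TOL_HZ and abs(dphi) <= PHASE_TOL_DEG and dv_pct <= VOLT_TOL_PCT

# The failure mode in the story: the diesel hunts around 50 Hz but never
# settles inside the window before the batteries run flat.
print(in_sync(50.6, 50.0, 90.0, 0.0, 230.0, 230.0))   # False - still slipping
print(in_sync(50.05, 50.0, 4.0, 0.0, 231.0, 230.0))   # True - safe to transfer
```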

      1. Anonymous Coward
        Anonymous Coward

        Re: UPS drained *real* quick.

        "[...] but the damn thing fails to lock sync [...]"

        Apparently at Bangor University they had a small three phase generator to teach students how to sync to the national grid. One day someone did it wrong. The grid won - and the generator disintegrated

        1. Ribfeast

          Re: UPS drained *real* quick.

          I remember reading a story about something similar, maybe even on here.

          Generator was in the megawatt league, and was flipped on without being in sync with the grid.

          The generator was literally thrown from the building.

        2. Ribfeast

          Re: UPS drained *real* quick.

          This was the one:

          https://hackaday.com/2017/07/05/how-do-they-synchronize-power-stations-with-the-grid/

          A 16MW IC turbine synced incorrectly with a synchroscope. The generator section broke all the bolts holding it to the skid, as well as the shaft and all wiring and conduit. The generator continued on its path out of the metal building it was housed in and slammed into another IC turbine in an adjacent building. I and the stupid operator were about 40 feet away at the control panel. Much noise, flame, shrapnel, smoke and arcing ensued.

          No one was injured and it's fortunate I didn't have to run away like the operator did, because my whole body was locked up watching this debacle. I wished I had had a smartphone in 1982.

          I rate this event as the third most dangerous of six dangerous events I have witnessed working at coal-fired and nuclear power plants, and the second most damaging related to grid syncing. I once saw a step-up transformer turn to plasma.

          From a distance, the top level of the unit. I didn’t hear the explosion because I was somewhat near the safety relief valves.

          I lost a good few dB of hearing about then. Lots of J/cm. That unit no longer exists now, but I bet the marks from my hand holding on to the guard railing were still there when they demoed the plant.

          1. Anonymous Coward
            Anonymous Coward

            Re: UPS drained *real* quick.

            " I saw a step up transformer turn to plasma."

            My mother was of the generation that had seen domestic electricity replace gas - and was most apprehensive of it. As a teenager I had just finished repairing our family tv set - and she was hovering nervously nearby as I plugged it into its mains socket by the window.

            At that very instant there was a loud rumble and a large cloud of smoke appeared over the next street's rooftops. She was convinced I had just caused a transformer to explode in the nearby 132 kV national grid distribution yard.

          2. Richard 111

            Re: UPS drained *real* quick.

            Makes me wonder what the top rated dangerous event is but as you have been working at nuclear power plants you might not be able to tell.

  20. Cynic_999

    Cleaning the server room

    How many others have had a server rack brought down by someone plugging a 2kW vacuum cleaner into a (clearly marked) UPS power socket? Since it happened, the UPS has been wired to supply IEC female sockets near the racks rather than standard UK 3-pin mains sockets.
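
    The arithmetic is the unforgiving part: a 2 kW cleaner on its own eats most of a modest rack UPS's headroom. A throwaway sketch with assumed ratings - the UPS size and existing load below are invented, not taken from the story above:

```python
# Why a 2 kW vacuum cleaner on a 'spare' UPS socket drops the rack.
# The UPS rating and existing server load are assumptions for illustration.
UPS_VA = 3000
POWER_FACTOR = 0.9                  # assumed
ups_watts = UPS_VA * POWER_FACTOR   # roughly 2.7 kW usable

server_load_w = 1800                # assumed existing rack load
vacuum_w = 2000                     # the cleaner from the anecdote

total = server_load_w + vacuum_w
verdict = "fine" if total <= ups_watts else "overload: UPS trips, rack goes dark"
print(f"capacity {ups_watts:.0f} W, demand {total} W -> {verdict}")
```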

    1. DS999 Silver badge

      Re: Cleaning the server room

      I was in a datacenter (in the US) once that had all equipment with UK plugs. I think...or maybe some EU standard but I think they were UK - they referred to them as "UPS plugs". They ordered it that way, and even had UK plugs in the wall in a little test/deployment lab off in the corner. I was told it was because something Really Bad(tm) happened 20+ years earlier that made them build their new datacenter such that the UPS could only use UK plugs to eliminate the ability of people to plug things into it that don't belong. No one there had been around when whatever it was happened so everyone had a different story of what exactly it was. But it left a deep scar on the IT organization I guess.

      Apparently it is a problem, though, because even though they order things with the UK plug, sometimes 'helpful' people or automated systems will see an order going to the US and substitute US plugs. They either return the cord (if detachable) and substitute one of the many they keep locked in a special cabinet for just such an occasion, or make the vendor show up and replace the cord (if not detachable).

      1. Anonymous Coward
        Anonymous Coward

        Re: Cleaning the server room

        "[...] that had all equipment with UK plugs."

        Trying to plug our line monitor into power in a computer room, we discovered that only the wall sockets were the standard UK 3-pin 13A type. We had to buy an extension cable to reach the comms rack. All the false-floor 13A sockets had their fat earth pin rotated a few degrees, so you had to have a matching plug. There were several models, each with its own degree of rotation - if you really wanted to be awkward, sorry, safe.

    2. Anonymous Coward
      Anonymous Coward

      Re: Cleaning the server room

      "How many others have had a server rack brought down [...]"

      The first-generation DEUCE computers had mercury delay lines for memory - which had to be kept at a carefully maintained temperature. The units looked like large mushrooms standing on the false floor, with their 24/7 jacket heater cable dangling down to a 13A socket on the floor.

      It took a little while to realise that some errors were due to their floor sockets being convenient for out-of-hours vacuum cleaners.

  21. Anonymous Coward
    Anonymous Coward

    Don't involve facilities if you only have one DC

    I had finally got the budget approved for a UPS for my DC back in the early '90s and was all set up to run the procurement when it was taken out of my hands and passed on to the 'electrical engineers' in facilities.

    I had spent 18 months getting to that stage and had been dealing with all the major UPS suppliers. In the end Facilities came back with a much smaller spec at a 25% higher price.

    I wasn't impressed but was again overruled. Come the installation weekend, we had far too much drama. The mains was hard-wired across to the UPS, then the DC was powered up.

    Things were fine for 5 minutes, then it shut down, then restarted, then shut down - 7 times.

    By the time I got back up to the DC and got everything powered down, 4 of my 7 mainframes were off the air with disk failures. We then carried out a sequential load of the machines and finally got everything powered up after 2 1/2 hours.

    It turned out the company who had supplied the UPS had never done a DC before and had no idea just how big the peak startup load was when the big old disks started to spin.
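
    For anyone who hasn't been bitten by this: the gap between nameplate and spin-up load is enormous on old drives, so a UPS sized on the steady-state figure collapses the moment everything starts at once. A back-of-the-envelope sketch follows - every number in it is invented for illustration, not taken from the installation above:

```python
# Back-of-the-envelope sketch: spin-up (inrush) load versus steady-state load.
# All figures are invented for illustration only.
RUN_WATTS_PER_DRIVE = 60        # assumed steady draw of one old disk drive
SPINUP_WATTS_PER_DRIVE = 200    # assumed peak draw while the spindle accelerates
DRIVES = 120
OTHER_LOAD_WATTS = 15000        # mainframes, controllers, etc. (assumed)
UPS_RATING_WATTS = 30000        # a UPS sized on the steady-state figure (assumed)

steady = OTHER_LOAD_WATTS + DRIVES * RUN_WATTS_PER_DRIVE
all_at_once = OTHER_LOAD_WATTS + DRIVES * SPINUP_WATTS_PER_DRIVE
print(f"steady-state load : {steady / 1000:.1f} kW -> fits a {UPS_RATING_WATTS / 1000:.0f} kW UPS")
print(f"simultaneous start: {all_at_once / 1000:.1f} kW -> overload, repeated shutdowns")

# Staggering the spin-up in groups keeps the peak under the rating, which is
# effectively what the sequential reload of the machines achieved.
GROUP = 20
peak_staggered = (OTHER_LOAD_WATTS + GROUP * SPINUP_WATTS_PER_DRIVE
                  + (DRIVES - GROUP) * RUN_WATTS_PER_DRIVE)
print(f"staggered start   : {peak_staggered / 1000:.1f} kW -> stays within the rating")
```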

    During the crisis meeting the following Monday the first thing the head of facilities pointed out was that I couldn't sue anyone as the UPS was the spec facilities had ordered so the vendor was in the clear and I couldn't sue him as he was an internal "business unit". Then he told me they were already overspent, so the cost of the certificate would just be added to my charges. The UPS couldn't be upgraded and had to be completely replaced. The very, very expensive electrical work couldn't be re-used as the power inputs were at the other end of the room. In the end the job I had been quoted £80,000 for cost me over £160,000. This was 30 years ago and my blood pressure still goes up remembering it.

    1. A.P. Veening Silver badge

      Re: Don't involve facilities if you only have one DC

      During the crisis meeting the following Monday the first thing the head of facilities pointed out was that I couldn't sue anyone as the UPS was the spec facilities had ordered so the vendor was in the clear and I couldn't sue him as he was an internal "business unit"

      You could (and should) have reported that head of facilities to the CFO; properly done and with all necessary documentation, he would have been out on his arse in nothing flat. And if it happened in the correct jurisdiction, his pension would have gone to pay for the extra costs.

      1. Nunyabiznes

        Re: Don't involve facilities if you only have one DC

        @AP

        Something similar to the OP has happened to us more than once, caused by the same individual, and he's still employed and politically considered advisory to IT. We just recently managed to get moved out from under his direct supervision.

        Too many times the utterly work-incompetent fool is a savvy corporate political animal.

        1. A.P. Veening Silver badge

          Re: Don't involve facilities if you only have one DC

          Too many times the utterly work-incompetent fool is a savvy corporate political animal.

          Those are prime victims, er, targets for the full B*FH treatment, and the same goes for those covering for them. And that is before siccing the auditors onto them. Usually there is no need to escalate to a full auto-da-fé, but letting them know the full power of the Spanish Inquisition is ready to come down on them does help.

  22. imanidiot Silver badge

    Had a sort of experience like that recently. Loud bang, acrid smoke. In my case it was the wax-paper decoupling cap of an old sewing machine speed-regulator pedal going off like a hand grenade. I was glad the thing was contained in a metal housing when it exploded. I could feel it explode under my foot through the metal.

    1. Anonymous Coward
      Anonymous Coward

      Building a power supply for my SW TX 807 PA, I switched it on with a meter across the big metal electrolytic capacitor's terminals. Inexplicably the voltage was rapidly dropping. Then a steaming mixture of paper and foil was explosively discharged up against my bedroom wall. Luckily I hadn't been leaning over it. Yep - I had forgotten to factor in the 1.4 RMS-to-peak multiplier when choosing the capacitor's working voltage.

      Recently I bought a DC voltage up-converter module. I switched it on and fiddled to get the meter connected so the voltage potentiometer could be set, at which point the output electrolytic launched its aluminium container into the air with a bang. The unit could produce more than its stated maximum of 35 V - which was also the capacitor's rating. Other suppliers sold the same module with a sensible 50 V capacitor.
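
      Both bangs come down to the same bit of arithmetic: a reservoir or output capacitor sees the peak of the waveform, not the RMS figure, and wants headroom on top of that. A quick sketch - the transformer voltage and capacitor ratings here are made-up examples, not the parts from either story:

```python
# Sketch of the RMS-to-peak trap when choosing a capacitor's working voltage.
# The 250 V winding and the candidate ratings are made-up example values.
import math

def peak_from_rms(v_rms: float) -> float:
    """Peak of a sine wave with RMS value v_rms (the ~1.4x multiplier)."""
    return v_rms * math.sqrt(2)

v_rms = 250.0                   # assumed transformer secondary voltage, RMS
v_peak = peak_from_rms(v_rms)   # what the smoothing capacitor actually charges to
print(f"{v_rms:.0f} V RMS rectifies to about {v_peak:.0f} V across the capacitor")

# Rate the capacitor above the peak plus some margin, not above the RMS figure.
for cap_rating in (250, 350, 450):
    verdict = "OK" if cap_rating >= v_peak * 1.2 else "likely to go bang"
    print(f"{cap_rating} V capacitor: {verdict}")
```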

    2. Stoneshop

      Acrid smoke

      I've learned to keep well away from mid-'70s and earlier Japanese equipment, as the electrolytics they are built with not only go bang when powered up after a long hiatus (as old electrolytics are prone to do anyway), but in doing so emit noxious fumes that play havoc with my sleep for weeks. So the first thing with that kind of kit is a long and thorough session with the soldering iron and a tray of fresh caps.

      Back in university we were doing an intro electronics lab; a doddle for people like me, for whom it had been a decade since building their first radio, but there were others who were utterly new to this. One of that second group was working on the lab bench opposite me, and the circuit we had going involved, for some obscure reason, 150 V DC.

      Electrolytics prefer to have their polarity respected. Very much. When violated they tend to protest noisily and noxiously. The student also reacted quite noisily, and I don't doubt he emitted some noxious substances as well.

  23. Stevie

    Bah!

    9/10

    Rephrase that as "flames shot out" and you get the extra point for the tote.

    Sadly, no description of leg hair catching fire, or of emergency trouser evacuation (hurhur) but you can't have everything.

    1. This post has been deleted by its author

      1. Anonymous Coward
        Anonymous Coward

        Re: Bah!

        "8.4. Catastrophic Failure Protection - Recommended"

        Phone call: "PC won't power up - and I need to do my expenses"

        "Switch it on - and tell me what lights you can see"

        Pause then "Oh - a big blue flash"

        They had flicked the power switch on/off several times to get that effect. The PSU fuse was a mirror coating inside its glass tube. The CPU fan was the only surviving electrical component out of the boards, floppy, disks and DVD.

        1. Stevie

          Re: flicked the power switch on/off several times

          A Catweazle Failure! Haven't seen one of those for a while.

          (Catweazle was an alchemist from the Norman invasion who time-traveled to 1970s UK. Introduced to the miracle of electric light he kept turning it on - "Shine little sun!" - and off until the inevitable "fring!" moment, at which point he sighed and muttered his catchphrase: "Nuthin' works!")

  24. Borg.King
    Mushroom

    Waiting for a Li-ion battery overload

    I'm waiting to see when someone decides their home battery backup system can be wired up to power their cooker.

    A Li-ion battery fire is nothing to be trifled with.

  25. Anonymous Coward
    Anonymous Coward

    "We ran out of available outlets."

    If "number of available outlets" is how you allocate your UPS loads, you will eventually have problems!

    The article skips the step where they resolved the "not enough outlets" problem. I suspect that it involved a consumer-grade power strip (complete with a handy off switch positioned in the ideal spot to get accidentally hit when tucked behind an untidy rack).

  26. Anonymous Coward
    Anonymous Coward

    "[...] Jeff eventually retired from the world of IT, opting instead for a farming life free from modern fripperies such as telephones or electricity.

    In the early 1970s Hilary Cropper created a pool of part-time home-working programmers. This was the time when many women programmers were leaving the industry to start families - and many were glad of the chance not to give up their careers completely. For obvious reasons the group acquired the nickname "Hilary's preggie programmers". The remote programming was done over low-speed dial-up telephone modem connections.

    One exception was a guy who went to help on the family farm in the depths of Wales. He complained about frequently losing his connection after being online for a while. The reason was that his rural exchange was very small and still manually operated. The operator had a habit of disconnecting him "because of strange noises on the line".

  27. Anoymous Coward

    Flaming UPS and noisy alarms

    My UPS memory happened in London in about 2004. My involvement started with a phone call on a Sunday evening, "Get to the office now, the UPS has exploded". Apparently the main battery failed and the UPS switched to the backup battery. The one that had an unknown manufacturing fault that caused it to emit fire instead of leccy.

    My job was to go to our BCO office in Stratford, get the database servers failed over to our New Jersey location and ensure all the apps were working. Other people were making things ready for the daytime staff, as our normal office in Docklands was unusable. By about 3 in the morning we were done, everything was up to date and the business was ready for the morning.

    I then retired to a local hotel to get the thing called sleep. About 5:30 there was a very loud buzzing in the room. I just wanted it to stop, so I rolled over and tried to get back to sleep. The noise carried on and eventually I decided to track down the cause. In a very sleepy state I found the source, decided I could not stop it and went back to bed, hoping the fire alarm would stop soon. The words 'fire alarm' finally woke up some other part of my brain and I realised that sleep might not be a good idea. I got dressed, and because I had done my normal thing of looking for the fire exit before getting into my room, I knew I had to turn left and it was two doors down. Just as I walked out of the fire exit many floors lower, it stopped. False alarm; everybody OK to go back to their rooms.

    Ignoring a noisy fire alarm just shows you can lead a user to data but you can't make them think. Anyway, it took about a week before we were back in our normal offices, and a while more before the server room was recommissioned.

  28. Damage
    FAIL

    Not UPS but lack of power

    I was involved in moving a team from one side of a site to the other. We had a large (for the time) computing set-up - the group's internal router alone was larger than the entire site router. Having put a power requirement in for the server room, it came back with a much lower figure; the bean counters said "you only have a hundred users, you don't need that much", assuming some desktop PCs. A couple of emails later I worked out it was going nowhere, so I invited them to visit, and since they were curious they happily wandered over. A quick tour of the noisy server room (three floor-to-ceiling HVAC units flat out, because they had not followed our spec on those either), pointing out a large breaker in the corner, and they started to agree that more power was probably required - but they didn't fully agree with me until I pointed out the recently added substation outside, installed just to feed the kit in the room. I got the needed power.

  29. tcmonkey

    The most dangerous devices in the room

    I rebuilt a UPS following a failure a few years ago. It was one of APC's 5 kW switch-mode double-conversion jobbies - the one with the 200-odd-volt lead-acid battery. Very fancy. The machine in question had suffered a dead short circuit in the inverter. The mess was biblical: there was a colossal black scorch mark on the board and all the components in the output stage were fried. Most of them had gone open circuit, others had holes blown in them, and others still appeared to have been almost completely vaporized, leaving only a nasty smell and some stubs of legs behind. Surprising literally nobody, the battery fuse had not even blown. I dread to think about the amount of energy that must have been delivered by the battery in the couple of hundredths of a second it took for all of this to happen. I wasn't there when it went, but I'm told that the resulting bang could be heard through several walls, over the din of both the computers in the machine room and an office full of call center operators.

    What happened to the unit I hear you ask? As far as I know it's still in service! It was worth spending a couple of hundred quid having it repaired by a tame engineer when a replacement was several thousand.

  30. Jakester

    Space Heaters Create Quite The Show

    I worked at an office where many of the workers liked to keep extra toasty at their workstations. Despite warnings not to plug space heaters - which they were not supposed to have - into the workstation UPS units, there were always one or two instances each winter season. Makes for a spectacular show and smell.

  31. Anonymous Coward
    Anonymous Coward

    Burning UPS...

    Yup. Smelt plenty of those in my time. Also spent considerable time raunging bloated batteries out of the chassis.

    Getting the bloated batteries out was like yanking a bowling ball through a mail slot.

  32. Anonymous Coward
    Anonymous Coward

    Lunchtime crashing database mystery

    In a certain country with unreliable power supplies our database server crashed every day.

    Lots were drawn and "Mike" flew to the country in question.

    Sure enough, at lunchtime in walks an employee with a microwave cooker to heat up a pastie.

  33. Sam Therapy
    Facepalm

    Fond memories of UPSes

    Working for the former British Coal in Sheffield, at their pensions office, a couple of us stayed late to deal with a bunch of system updates. This was back when Sheffield was having the Supertram network built and, at the time, tracks were being laid up the road from us outside Sheffield Cathedral.

    Any road up, everything was going well until - as we discovered later - some hero chopped through the city's main electricity cables. Everything in our place went dark, then the emergency lighting came on, but nowt else did. Some of our security doors immediately locked - all the ones leading to the outside world, that is - and some of them were immediately disabled. All the phones had rolled over and died, and back then mobile phones were scarce. We had no way of contacting the outside world to let 'em know we were stuck in the building, and even the roller shutter on our underground car park wouldn't work.

    Eventually, we found a phone that didn't go through the switchboard - it was one of the Directors' phones, naturally - and we called our department head to give him the happy news. An hour or so later, someone turned up with a hand crank to roll up the shutters so we could go home.

    Next day, it transpired our mega expensive UPSes didn't work, had most likely never worked and what's more, had never once been tested. Backsides were comprehensively kicked, but fortunately not ours.
