Divert the power to the shields. 'I'm givin' her all she's got, Captain!'

August is now just a memory, but hey – console yourself that Christmas is just around the corner. Or simply grab a caffeinated Monday beverage and take delight in another's pain courtesy of The Register's regular Who, Me? column. Today's tale comes from a reader we'll call "Jack", who spent the early part of this century …

  1. Dave K

    It's an interesting read, but where is the protagonist's screw-up? Reads more like an "On-Call" than a "Who-Me?" column this week.

    1. A.P. Veening Silver badge

      but where is the protagonist's screw-up?

      He joined, er, volunteered for the military???

      1. Anonymous Coward
        FAIL

        Uh, nowhere does it say he was in the military. It says he was "hired" in fact, so you know he wasn't.

  2. The Oncoming Scorn Silver badge
    Alert

    Another Place, Another Time

    In the West Country.....somerwhere between Devon & Avon, with Wiltshire & Dorset to the south, somertime between 2000 - 2008.

    It was decided to run a long-overdue emergency test of the county council's UPS over the weekend. The first time, it failed due to old, contaminated fuel that also had a degree of water content, which prevented the emergency generator from firing up.

    Once the source of the water ingress was fixed & fresh fuel was supplied, on the subsequent redo the generator itself caught fire.

    1. Anonymous Coward
      Anonymous Coward

      Re: Another Place, Another Time

      I remember a time when a county council (not to be named, but bordered Devon, Dorset, Wiltshire and Gloucestershire - Avon hadn’t been invented at that time) rented computer time from the accounts centre of a department store chain. I had a holiday job with the store and the computer in question occupied the whole ground floor of the building - it was quite impressive for the day. No disaster to recount, just a fond memory of my first hands-on encounter with IT :)

      Agree the main story is a good “On Call” but I missed the “who me” aspect.

      1. 0laf Silver badge
        Facepalm

        Re: Another Place, Another Time

        I am aware of one Local Government that, in the spirit of greenwash, filled its emergency generator tanks with biodiesel.

        A considerable time later (months, possibly years later) a failover test was scheduled. This went 'orribly wrong. The biodiesel had 'gone-off' and turned to jelly in the tanks.

        The whole lot had to be chucked and the generators repaired. I assume they were filled with non-green derv with some sort of preservative but then again maybe not.

        1. Anonymous Coward
          Anonymous Coward

          Re: The biodiesel had 'gone-off' ...

          That's something which hadn't occurred to me. I just found this link, if anyone finds it interesting:

          http://www.springboardbiodiesel.com/storing-biodiesel-fuel

          Still, if it's a critical backup system, going for the most stable - presumably fossil-based diesel - would seem sensible regardless of a desire to be as "green" as practicable.

          1. Anonymous Coward
            Anonymous Coward

            Re: The biodiesel had 'gone-off' ...

            The problem comes in when a paper is sent to a council committee and council policy is changed without consulting the technical managers.

            I was called into the data centre by an operator when the mainframe printers started tearing the sprocket holes out of the paper on page throws. I found out that two pallets of lineflow paper (approximately 120 boxes) had been purchased as a result of a council mandate to move completely to recycled paper. This represented approximately 2 weeks' print volume for the data centre, or 4 months for the low-speed printers in departments.

            I was absolutely furious, as we had not been told this was happening, no testing had taken place, and I already knew that the large IBM and ICL mainframes could not use the paper: it was both too weak to take a page throw and so rough it would damage the printers. We had been using sustainably produced paper for several years, which was actually more environmentally friendly than the recycled sandpaper currently loaded. I had to send out a couple of ops to scour offices for a couple of dozen boxes of decent paper to keep us going while another pallet was delivered, and we then had to give away the 2 pallets of paper to anyone who could use it. I believe a significant amount went to local play groups for painting on.

            1. phuzz Silver badge

              Re: The biodiesel had 'gone-off' ...

              It probably didn't come from you, but one of my folks brought home a big pile of continuous printer paper for us to draw on as kids, I seem to remember it lasting all of our childhoods.

              I particularly enjoyed tearing off the sprocket strips on the edges.

              1. LeahroyNake

                Re: The biodiesel had 'gone-off' ...

                I give my large format test sheets to the after school club. One is an A0 sheet with small dots around 5cm apart that they use for making their own 'dot to dot' pics. The other is full of CMYK gradations that they cut up and fold to make eco friendly decorations. Or they just draw or paint on the reverse, everyone is happy apart from staples :)

              2. hypnos

                Re: The biodiesel had 'gone-off' ...

                Seems that everyone I know who has worked in an office has a story like this :-)

                I still have the paper somewhere...

            2. MOV r0,r0

              Re: The biodiesel had 'gone-off' ...

              Recycled is a bad choice for roller feeds too: it gives off more dust than regular paper, which gets into the grain of the pick-up rollers, making them smooth and resulting in feed jams. It's so fine that cleaning isn't usually successful - the rollers have either to be replaced or re-ground.

              I've long suspected bleach and china clay use are the real environmental evils of paper production anyway.

              1. Alan Brown Silver badge

                Re: The biodiesel had 'gone-off' ...

                "Recycled a bad choice for roller feeds too"

                Yup. It's an easy way of turning your 5 million print lifespan device into a 1 million print device - which is generally about as ungreen as you can get.

              2. Alan Brown Silver badge

                Re: The biodiesel had 'gone-off' ...

                "I've long suspected bleach and china clay use are the real environmental evils of paper production anyway."

                Bleach - oh hell yes - chlorine based paper bleaching produces mind boggling amounts of dioxins.

                Some - but not all - plants have shifted to oxygen bleaching. It's only effective on softer wood types.

            3. Anonymous Coward
              Anonymous Coward

              Re: The biodiesel had 'gone-off' ...

              Classic council. Like the one that had diesel generators and forgot to put diesel in them when they flicked over for a test. And then you have people in a position of power suggesting they be left outside at the side of the building with no protection (because of course no one is going to fuck with them). Idiots.

              1. DButch

                Re: The biodiesel had 'gone-off' ...

                A variant from the 1965 US East Coast blackout - lots of businesses, hospitals, etc. had emergency generators with plenty of good fuel. And electric starter motors wired to mains power... When I got to college I worked in a Power Systems Engineering lab - we were still conducting post-mortems of the blackout 5 years afterwards. (And stirring up new (oh) shit stuff...)

            4. swm

              Re: The biodiesel had 'gone-off' ...

              At Xerox the computer center placed an order for magnetic tapes specifying the brand. Purchasing changed the order to another brand. These tapes would not even survive a rewind without filling the tape heads with a mountain of "crud". It took several demos to convince purchasing that they did not save any money by purchasing inferior tapes.

              1. Alan Brown Silver badge

                Re: The biodiesel had 'gone-off' ...

                "Purchasing changed the order to another brand."

                From "TDK" to "scotch" tape?

                It's happened.

                "We tried to use your substitute, but it keeps gumming up the drives"

                1. BuckeyeB

                  Re: The biodiesel had 'gone-off' ...

                  That's funny. Upvote.

              2. MachDiamond Silver badge

                Re: The biodiesel had 'gone-off' ...

                "It took several demos to convince purchasing that they did not save any money by purchasing inferior tapes."

                The department still got their bonuses for cutting costs that period. New tape drives come out of the capital equipment budget and not the operating budget. Therefore, the letter of the rules for bonuses has been followed.

                The PHB never seems to tack on to incentive programs that whatever is done can't negatively affect the bottom line of the company.

            5. Alan Brown Silver badge

              Re: The biodiesel had 'gone-off' ...

              "it was both too week to take a page through and so rough it would damage the printers."

              I made myself spectacularly unpopular in one location by suggesting that the company invoice the repair/replacement costs of the printers and the replacement cost of the paper to the "green team's" budget - wiping them out for the rest of the financial year and, as I put it, preventing them from inflicting any more badly-thought-through carnage on anyone. As it happened, the accountant agreed with me, as she'd been particularly badly affected....

              I fail to understand how absolutely abysmal quality and bloody expensive paper can be justified with "But it's GREEEEN" when a little research would show that the existing stuff is probably 30% recycled already (and paper can only go a couple of recycling cycles at most due to the fibres being chopped too short to be useful)

              Postconsumer waste is so heavily cross-contaminated that any attempts to separate/recycle it generally use 10-100 times as much energy as was used to create it in the first place, and apart from metals/glass, trying to recycle into the same things is a losing proposition anyway (why spend a pound and 20p of oil recycling a bottle containing 2p worth of oil when you can just use it as fuel?)

              90% of "recycling" is greenwash, not saving the planet and it tends to happen at the cost of things which can actually make a difference - like using less paper in the first place.

              1. Kiwi
                Pint

                Re: The biodiesel had 'gone-off' ...

                90% of "recycling" is greenwash, not saving the planet and it tends to happen at the cost of things which can actually make a difference - like using less paper in the first place.

                Yay! I love meeting people with smarts!

                That's most of my problem with "green" right there. A lot of things being done are actually more harmful than if things were left alone.

                Cut your use. Recycle/reuse what makes sense, try to limit using what can't be recycled (or what creates more waste in the process). Plant a couple of trees somewhere. Only get rid of hardware when it's efficient to do so (that new computer that saves 1 watt/year in power over your old one took thousands of watts to make, so replacing your old one while it still works is a waste - same for cars!). Insulate your home - if the 'energy cost' of the insulation is less than the energy savings.

                So much "green" stuff does so much harm to the environment. Being a greenie is fine, but grow a brain, do your research, and make sure what you want to do will actually help not harm our planet!

          2. Long John Brass
            Headmaster

            Re: The biodiesel had 'gone-off' ...

            Correct me if I'm wrong... But both Petrol & Diesel can go sour if left in a tank long enough. I've had small petrol motors have issues with the petrol evaporating out of the carbs and "glazing" the venturi

            I've heard that diesel has similar issues, as well as the issue of water ingress into the tanks (I believe it's hygroscopic)

            1. H in The Hague

              Re: The biodiesel had 'gone-off' ...

              "But both Petrol & Diesel can go sour if left in a tank long enough."

              One of the advantages of high spec fuel (2 and 4 stroke petrol, diesel) such as Aspen or Stihl Motomix is that they can be left in a tank for years without going off or absorbing moisture from the air. Primarily used for park and garden equipment but would also be a good choice for other equipment which is only used intermittently. (You could always use conventional fuel while it's running.)

              1. NeilPost

                Re: The biodiesel had 'gone-off' ...

                I’m curious how water can leach into a sealed fuel tank which then feeds a sealed Internal Combustion Engine.

                1. A.P. Veening Silver badge

                  Re: The biodiesel had 'gone-off' ...

                  For one, fuel tanks aren't sealed but have vents to allow air in when fuel is drained from the tank. Air always contains water vapor, which does have a certain habit of condensing in unwanted places.

                2. Kiwi

                  Re: The biodiesel had 'gone-off' ...

                  No idea why you got a downvote for such a question!

                  As AP Veening said, most tanks have some level of ventilation. I know many Californian vehicles have filter systems on the vents so no fuel vapour can escape while still letting the tank breathe when the temperature changes. I don't know how good these filters are at stopping water vapour though.

                  As a tank heats the pressure rises, and some vapour is forced out. As it cools, air is drawn back in (otherwise the tank would collapse) and the air contains water vapour which hydrocarbon seems very efficient at trapping. The moisture is locked in so doesn't escape during the next heating/cooling cycle, and seems to replace the fuel as it goes.

                3. MachDiamond Silver badge

                  Re: The biodiesel had 'gone-off' ...

                  "I’m curious how water can leach into a sealed fuel tank which then feeds a sealed Internal Combustion Engine."

                  If you totally seal the tank, changes in temp will cause the tank to expand and contract. Somewhere down the road a seam will give way. It's also harder to pull fuel from a tank under a partial vacuum.

            2. Trixr

              Re: The biodiesel had 'gone-off' ...

              That's right. Neither of them are happy being left for a long time without a preservative. Although petrol in particular has stabilisers added to it these days. When it oxidises, it starts forming long fatty acid chains, and then once it hits a threshold, it goes into a cascade of reaction (thus the gel/gum). It's definitely exacerbated if the fuel is "wet".

              Antioxidants in the form of phenols or amines prevent the cascade and some stabilisers also contain metal deactivators, because metal ions like copper accelerate the process.

            3. Kiwi

              Re: The biodiesel had 'gone-off' ...

              Correct me if I'm wrong... But both Petrol & Diesel can go sour if left in a tank long enough. I've had small petrol motors have issues with the petrol evaporating out of the carbs and "glazing" the venturi

              Correct. I've seen cars and bikes sit with petrol for 5 or 6 years and run a little rough but start OK. I've seen some left with a 1/2 tank of gas over winter and require considerable work to clean them out and get them running. Best is to drain all the fuel, but the more you have in the tank when you park up the worse the problem can be :)

        2. iron

          Re: Another Place, Another Time

          Biodiesel is only 8% bio so even if it hadn't gone off it still wasn't very green.

          1. Anonymous Coward
            Anonymous Coward

            Re: Biodiesel is only 8% bio

            From the site linked above, and others, it seems that biodiesels vary, and can be made using varying fractions of non-fossil sources. A common proportion quoted seems to be 20%, but your 8% may also be widely used - I suppose it varies according to the supplier and intended usage.

            And, sure, given low bio fractions it's not especially green; but I suppose the point is that at least it's a bit greener than the usual straight fossil diesel. As long as it doesn't go off :-)

            1. Nick Kew

              Re: Biodiesel is only 8% bio

              Biofuel isn't green. At best it just accelerates the fossil fuel cycle (bypasses the fossil on the way from photosynthesis to combustion); at worst it also desertifies land from which it's taken.

              1. Muscleguy

                Re: Biodiesel is only 8% bio

                The EU is causing forest destruction in Indonesia by mandating a minimum bio component in diesel without wondering where it might come from. It is mostly Palm Oil and it kills Orang Utans by destroying their homes.

                Note I'm a Remainer and Scottish Yes campaigner so don't get me wrong. A case of the law of unintended consequences and relying too much on 'market forces'. All those fires in the Amazon (which produces 20% of world oxygen) are market forces as well.

                Cheap chicken and pork, especially from overseas is likely to be fed soya and a lot of the Amazon deforestation is for soya fields, not ranches.

                1. Alan Brown Silver badge

                  Re: Biodiesel is only 8% bio

                  "Cheap chicken and pork, especially from overseas is likely to be fed soya "

                  Which led to massive demand for soybeans, which drove a lot of the deforestation in the Amazon.

                  An outbreak of virulent African swine fever in Asia (look up "Pig Ebola") has caused tens of millions of pigs to be slaughtered and driven a lot of farmers to the wall, meaning the Asian market for soybeans has collapsed - meaning that US farmers aren't actually facing Chinese sanctions, but the result of circumstances beyond anyone's control causing a market glut.

                  The orange one turned that into a trade war, demanding that China buy soybeans for pigs that no longer exist. If China really wanted to ignite a trade war it could quietly export the swine fever to Wisconsin, Illinois or North Carolina.

                  Meantime Amazonian farmers burn land to plant crops that won't be bought next year for the same reason that it's not being bought this year - the pandemic is still spreading.

                  This is "market forces" at work folks. Coupled with someone who thinks that capitalism only has "winners and losers" instead of benefitting all players when managed properly. The former is mercantilism, not capitalism if one reads one's Adam Smith and pays attention to the social responsibilities of capitalism.

                2. MadonnaC

                  Re: Biodiesel is only 8% bio

                  sorry, it's not 20% of the world's oxygen, as over 95% comes from the ocean

                  https://www.sciencealert.com/there-s-one-wrong-statistic-everyone-s-sharing-about-the-devastating-amazon-fires

                3. MachDiamond Silver badge

                  Re: Biodiesel is only 8% bio

                  Many of those "fires" in the Amazon were found to be bunk.

                  Plankton and other small life in the sea provide most of the Oxygen. I'm not saying that cutting down rain forests is a good thing, but many old growth areas are fairly stagnant. The high density of what's there has hit a growth limit. This is why using wood for building and tree farming are better than using dead materials. The newly planted trees are growing much faster than the older trees they replace.

                  1. Kiwi
                    Pint

                    Re: Biodiesel is only 8% bio

                    The newly planted trees are growing much faster than the older trees they replace.

                    I've often found some "greenies" who are denser than the wood they complain about. They believe that building a house from wood "creates CO2" and is thus a bad thing. I try to point out to them that 1) the felled trees have stopped growing, 2) the replacement trees will grow faster and thus take up more CO2, and the obvious 3) while the house stands, the carbon in the wood is locked in. It does not magically and instantly evaporate into the atmosphere; it will be a part of that house for centuries to come given basic maintenance (or zero maintenance but the right atmosphere - see the many abandoned but intact buildings that are >100 years old).

                    Wood is our 'greenest' building material. And if we burn it, and feed the exhaust gasses into greenhouses where more wood is being grown...

                    Live clean, but leave my valuable CO2 alone. Stop burning other rubbish, go zero-plastic, make your house energy efficient, but burn as much wood as you want (the older the better) because the replacement growth fed by your gasses will do so much more than you can imagine to help this planet recover from our lazy foulness.

          2. Mark 85

            Re: Another Place, Another Time

            Biodiesel is only 8% bio so even if it hadn't gone off it still wasn't very green.

            But... all the greenies feel good because it's organic you know. Then again, any petrol product is actually green because it originally came from green things.

            1. Hans 1

              Re: Another Place, Another Time

              Us greenies know about the greenwashing, it is the average Joe who feels good about filling his tank with the stuff ...

              1. A.P. Veening Silver badge

                Re: Another Place, Another Time

                You will get tarred with the same brush until you teach the average Joe better.

        3. Doctor Syntax Silver badge

          Re: Another Place, Another Time

          "The biodiesel had 'gone-off' and turned to jelly in the tanks."

          I still remember a long, circuitous and cold journey from Marylebone to High Wycombe because the non-bio diesel had gelled in the tanks of the BR signalling power supply. Cold weather was quite sufficient.

          1. Muscleguy

            Re: Another Place, Another Time

            I recall back in the '80s in Central Otago NZ there was an inversion with constant overcast and the mercury fell and kept falling. The gritters and graders had to have electric heaters (and other not so safe methods) applied to the diesel tanks to get them to go.

            Oils are like all liquids, they are only liquid at certain temperatures and pressures. Then they become solid. Butter, lard, suet and dripping will all melt to liquid on heating which is the reverse effect.

            You also see it when you light a candle, the wax closest to the exposed wick turns liquid. Snuff the candle and it turns solid again.

            1. Kiwi
              Pint

              Re: Another Place, Another Time

              I recall back in the '80s in Central Otago NZ there was an inversion with constant overcast and the mercury fell and kept falling. The gritters and graders had to have electric heaters (and other not so safe methods) applied to the diesel tanks to get them to go.

              I've seen cars and trucks in the US (IIRC in Minnesnowda) and Canada that had electrical fittings to plug in oil heaters. Seen the same on large ships that run on heavier grades of fuel oil.

              And I've played with a few vintage diesel engines that had a blow torch placed under a bulb to pre-heat the fuel so it'd ignite. Often something akin to a kero heater was used in the olden days :)

              Pre-heating fuel in trucks in cold areas is still pretty common. In summer diesel is not that likely to ignite if you dropped a match into a pool of it. In an Otago winter a blow torch will merely mar the surface a little :) (Or so I'm told)

              1. Anonymous Coward
                Anonymous Coward

                Re: Another Place, Another Time

                In Siberia, where the temps can fall to -50 C and lower, the trucks usually have fires built under them (military type with lots of ground clearance) to keep the fuel from freezing up - needless to say the trucks are built with these conditions, and the traditional mitigation methods, in mind.

          2. Robert Sneddon

            One very cold winter

            in Scotland a few decades ago, someone my friends hung out with who was notoriously parsimonious had converted his Range Rover to run on diesel with a small coach engine in place of the original Rover V-8. We suspected he was sourcing "red" diesel from some farmer or boatie but...

            Anyhows, the winter came, temperatures dropped and stayed very low for a couple of weeks. The diesel fuel in his Range Rover froze up (red diesel didn't have additives to keep it flowing in low temps, as I recall) and it wouldn't start. He got underneath it with a blowtorch to heat up the fuel pipes. We stood well back and didn't remind him that to save money he had replaced the petrol lines with plastic tubing, instead one of us went to fetch their camera to record the expected conflagration which, sadly, didn't actually happen. He did royally fuck up his fuel pipes though and when the thaw came one not-so-frosty morning he found his Range Rover sitting in a large puddle of diesel in his front driveway.

            1. A.P. Veening Silver badge

              Re: One very cold winter

              Being notoriously parsimonious is one thing, to manage that in Scotland takes it to a whole new level.

    2. This post has been deleted by its author

    3. Alan Brown Silver badge

      Re: Another Place, Another Time

      I recall a civil defence exercise where the radios and half the comms kit didn't work thanks to a power failure on a critical repeater on a hilltop 20 miles away.

      The volunteers (and staff) were all crestfallen and wanted to cancel it - they loved playing with their toys - so you can imagine their horror when the director decided this was a perfect opportunity to test such an equipment failure in case it happened during the real thing.

      There was much swearing and sweating as runners were employed to get messages delivered - and a nice pot of money afterwards to ensure the hilltop kit (which had been sadly neglected for years despite warnings it needed some TLC) got sorted out

  3. Anonymous Coward
    Anonymous Coward

    chillers

    "For some unknown reason, NONE of the chillers were even set up to be powered by the generator,"

    A very frequent mistake so many DC planners make: forgetting that chillers *ARE* also needed in case of a power cut.

    I've seen this so many times ...

    "Even with the huge reduction in powered-on equipment, the temperature still climbed into the 90s (Fahrenheit – which is the 30-somethings in Celsius)"

    Very lucky that it was probably 10-15 years ago, when systems generated only a fraction of the heat current systems generate, and that the ratio of systems to space was probably very low.

    One year ago, we had a chiller power cut in one very small DC we were evacuating (quite rightly so!), with only a handful of racks still powered up.

    In a couple of hours, temps rocketed up to 65 degrees Celsius (not Fahrenheit!) in some places!

    1. Anonymous Coward
      Anonymous Coward

      Re: chillers

      Every single server room significant event I have seen in my career has been due to air con failure.

      Every server room I have worked in I have recommended, at least, dual air conditioning units.

      I rarely recommend a fire suppressant system after doing a significant risk assessment on the likelihood of fire in a server room. As long as there are no other accelerants or significant combustible material, there are far bigger risks to deal with first. Modern servers don't catch fire (they may pop, hiss, have a quick shot of flame but not sustain a fire).

      However with Lithium UPS' now more popular it gets trickier - although I'm doubtful a fire suppression system would cope with a large Lithium UPS explosion/fire.

      1. Dave K

        Re: chillers

        Problem is that even this isn't an infallible solution. We had such an incident back in the mid 2000s at a company I worked at. The server room only had 3 racks and about a dozen pedestal servers, and had dual air-con units.

        Late one afternoon, someone managed to turn one of the air-con units off accidentally. The second air-con unit continued operating, but was an old model and not particularly powerful. During the evening and night, the temperature slowly but steadily increased. Meanwhile, the old air-con unit was running increasingly flat out trying to control the rising temperature until finally it had enough and blew in the early hours of the morning.

        We came in at 8am to sauna-like conditions in the server room due to an "off" air-con unit and a dead one...

        1. Anonymous Coward
          Anonymous Coward

          Re: chillers

          Yes, but the way I would set it up is as a linked pair. Both either run together, or in a one-week-on, one-week-off swap, so that they are always well lubricated and any issues are reported early on via the alerting system.

          Either unit has to cope with the full load of the server room (hence the week-on, week-off model is quite good), both on separate power supplies, with as few single points of failure as possible.

          Also position the air con sensibly - i.e. get the cold air coming in the front of the device and warm air out the back, rather than the other way around.

          Finally, where possible get a couple of big powerful fans ready and the ability to open the door, so as a last resort you can blow the hot air out and cool air in while the PFY sits guard.
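          A minimal sketch of that week-on/week-off rotation with a temperature override, assuming a controller (or script) that re-evaluates every few minutes. The unit names, the 27 °C set point and the ISO-week rule are illustrative assumptions, not anything from the installation described above:

            # Rough sketch only: the duty unit alternates by ISO week number,
            # and the standby unit is brought in if the room runs too warm.
            import datetime

            TEMP_THRESHOLD_C = 27.0  # assumed set point, pick to suit the room

            def active_units(now, room_temp_c):
                """Return the set of A/C units that should be running."""
                week = now.isocalendar()[1]            # ISO week number
                duty = "A" if week % 2 == 0 else "B"   # alternate weekly
                standby = "B" if duty == "A" else "A"
                units = {duty}
                if room_temp_c > TEMP_THRESHOLD_C:     # duty unit can't cope
                    units.add(standby)                 # run both together
                return units

            print(active_units(datetime.datetime.now(), room_temp_c=29.5))

          In a real room the swap and the override would live in the A/C controller or the building management system rather than a script, but the logic is the same, and it keeps both units exercised and lubricated.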

        2. Alan Brown Silver badge

          Re: chillers

          "Late one afternoon, someone managed to turn one of the air-con units off accidentally. "

          This is why you have environmental monitoring to tell you that A/C "A" has no airflow and A/C "B" can't keep up.

      2. Killfalcon

        Re: chillers

        I don't know if it was coincidence or good design, but the in-office server room I used to work in had a solid interior partition between the UPS and the servers.

        When one of the batteries decided to undergo rapid unplanned disassembly it was completely contained by the partition. A good 6' of empty air between the two UPS battery banks even kept the fire contained to a single unit. The main expense in the end (after replacing the UPS, not sure if the manufacturer paid for that) was paying to get battery-acid derived soot cleared out of the hardware afterwards.

        1. Robert Sneddon

          Lead acid batteries

          If you use lead-acid batteries in a UPS then you should have vents from the containment to the outside and, belt and braces, a hydrogen gas detector or two located somewhere near the batteries.

          1. Alan Brown Silver badge

            Re: Lead acid batteries

            "If you use lead-acid batteries in a UPS then you should have...."

            ..them in a completely separate fireproof enclosure to everything else and a blowoff panel or two, just in case.

            Apart from the obvious, the not-so-obvious is what happens when they're overcharged and boil, leaving a fine mist of sulfuric acid everywhere - at that point you find ALL the wiring insulation turns black, making any kind of decoding of the wiring an "interesting" task (Been there, done that, wrote off £75k worth of kit - not my installation/design, so not my problem - and 200 other near-identical sites got a rapid refit.)

          2. Anonymous Coward
            Anonymous Coward

            Re: Lead acid batteries

            A shall-not-be-named Nuclear Power Plant had exactly that: a room with 2-foot-thick concrete walls, completely covered in washable tiles with matching drains, connected to the hazardous HVAC system, explosion-proof lights, pressure valves on the interlocked doors, and enough batteries to power a submarine or two.

            But these were not meant for SIMPLE computers; they were powering the kind of computers that you DO NOT want to fail under any circumstances. A couple of cooling pumps were also plugged into those...

            And those batteries were meant to last just long enough for the generators to kick in. Which were tested to start in 10 seconds (12 counting the entire alarm system), with pre-heated oil, and a full tank with 200,000 liters of fuel, topped off every week.

            Hardly overkill.

      3. defiler

        Re: chillers

        I'm doubtful a fire suppression system would cope with a large Lithium UPS explosion/fire.

        Now here's where you run into your good old-fashioned Fire Triangle. You need a fuel, an oxidiser and a source of ignition. If the thing has already caught then the flame itself is the ignition source. The battery contains both the fuel and the oxidiser in close proximity, so fire suppressant can't get in there to separate the two.

        That's why I think Lithium UPSs aren't a great idea. Sure, in some respects they're better than lead acid, but having seen some funny-shaped batteries which were swelling (and batteries that have ignited during discharge), I'd rather keep them out of the server room.

        On an associated note, whilst I'm all for electric cars, I do worry what'll happen when the first load have dodgy enough batteries to catch fire whilst charging overnight. Because that fire is going to take a while to stop!

        1. Ogi

          Re: chillers

          > On an associated note, whilst I'm all for electric cars, I do worry what'll happen when the first load have dodgy enough batteries to catch fire whilst charging overnight. Because that fire is going to take a while to stop!

          It's already a thing. Loads of reports about EVs catching fire (usually while charging, but sometimes just randomly). In those cases the fire departments can't do anything except clear the area around the fire and wait for the lithium to burn itself out (i.e. containment). It's a bigger problem if it happens in an enclosed space (like a garage) as lithium fires burn hot, and can destroy the structure around it.

          > That's why I think Lithium UPSs aren't a great idea.

          I don't think they are a great idea either, for the same reasons mentioned for BEVs above.

          Difference is a UPS does not need to be very mobile, so the higher energy storage to weight ratio of Lithium-ion is not a requirement (it is a requirement for BEVs, to make them even barely practical), but you get all the downsides of using Li-Ion, with the added difficulty of trying to contain said fire in a particular room of a building. It would not surprise me to find out that having large lithium batteries may require specific health and safety assessments of the building and room.

          1. John Geek
            Mushroom

            Re: chillers

            First, there are LiFePO4 batteries, which are far less likely to go up in a fire. These come in brick sizes up to 200AH per 3.2V (nominal) cell, or even higher. Requiring far fewer cells means you need far less equalization circuitry, too.

            Second, these LiFePO4 batteries can be discharged 80% 2000 times and still have most of their capacity. Lead-acid batteries' lifetime gets greatly shortened if they are discharged below 50%.

            Third, they can be charged at insane rates, like 100 amps into that 200AH cell, linear until it's full, so 2 hours to fully recharge. Lead-acid batteries require an absorption phase to achieve a 100% charge that often takes 6-8 hours.

        2. A.P. Veening Silver badge

          Re: chillers

          On an associated note, whilst I'm all for electric cars, I do worry what'll happen when the first load have dodgy enough batteries to catch fire whilst charging overnight. Because that fire is going to take a while to stop!

          Just ask the nearest fire station about their experiences with electric cars which have been in an accident. They really and truly hate Teslas.

        3. Alan Brown Silver badge

          Re: chillers

          "having seen some funny-shaped batteries which were swelling (and batteries that have ignited during discharge), I'd rather keep them out of the server room."

          repeat after me: "UPS Batteries do not belong in a server room. If you put large batteries in your server room, then you are doing something fundamentally wrong and it will bite you on the arse, or you have a one rack installation and it doesn't matter"

    2. Dinanziame Silver badge
      Devil

      Re: chillers

      Nowadays, data centers are all running near 100 °F. Servers are built to withstand even higher temperatures, but the meatbags servicing the machines would drop like flies.

      1. defiler

        Re: chillers

        Not the two I was in during the last week.

        On the first one the hot aisles were pretty warm indeed, but the second one was quite temperate in most areas I visited. Power room was pretty warm.

        Both sites had nice cool cold aisles, though, isolated from the hot aisles with containment all around. Were a few empty racks with no blanking panels though. Grr...

        1. swm

          Re: chillers

          Back in the '70s computers couldn't stand such high temperatures so the computer rooms were always nice and cool on a hot summer day. I would park my 6-month old son in a baby carriage in the computer room and he slept like a baby.

          1. Admiral Grace Hopper

            Re: chillers

            I would park my 6-month old son in a baby carriage in the computer room and he slept like a baby.

            There might have been occasions during a long shift where I have picked up a clipboard (the universal signifier that I Am Doing Something), headed down to the server room, found a discreet location and settled down for a nap bathed in cooled air and soothing white noise.

    3. Captain Scarlet

      Re: chillers

      Dual Air Con (where a controller ensures each unit takes turns and only switches both on if the temperature reaches a threshold) is brilliant until the controller pops a fuse.

      Then they switch off, unless you put them into manual, whereby they work manually.

      1. This post has been deleted by its author

        1. Captain Scarlet

          Re: chillers

          Yes (sort of, we simply poll SNMP temperature readings from anything which has them), but no-one checks them during the weekends (no access to webmail anymore and tbh I bloody hate reading emails on my work phone so will only answer phone calls over the weekend; also it's outsourced, so they should know before me).

          So it was toasty in there, followed by a panicked visit to the site engineers after I found a fuse which had tripped but didn't mention A/C (I don't like almost electrocuting myself anymore, and couldn't see a valid reason why it had tripped, so thought it best to get a witness just in case something exploded or melted again).

          Then panicked when nothing happened when the fuse was reset, and breathed a sigh of relief when putting them into Manual mode clicked some form of relay and both units burst into life.
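          For anyone wondering what "poll SNMP temperature readings" looks like in practice, here is a rough sketch that shells out to net-snmp's snmpget; the host names and the OID are placeholders (the real OID depends on the device), and the parsing assumes the agent returns a bare number:

            # Hypothetical polling sketch - adjust hosts, community and OID to suit.
            import subprocess

            HOSTS = ["rack1-pdu.example", "rack2-switch.example"]  # made-up names
            TEMP_OID = "1.3.6.1.4.1.99999.1.1"                     # placeholder OID
            ALERT_ABOVE_C = 35.0

            def read_temp(host):
                """Poll one device and return its temperature in Celsius."""
                out = subprocess.run(
                    ["snmpget", "-v2c", "-c", "public", "-Oqv", host, TEMP_OID],
                    capture_output=True, text=True, check=True,
                )
                return float(out.stdout.strip())

            for host in HOSTS:
                try:
                    temp = read_temp(host)
                except Exception as exc:   # unreachable host, wrong OID, etc.
                    print(f"{host}: poll failed ({exc})")
                    continue
                state = "ALERT" if temp > ALERT_ABOVE_C else "ok"
                print(f"{host}: {temp:.1f} C {state}")

          Of course, a script like this is only useful if something actually acts on the output at the weekend, which was rather the point above.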

    4. Anonymous Coward
      Anonymous Coward

      Don't forget how large the server room was

      Where in the world do you get the idea that modern systems generate a lot more heat than old ones? The only thing that has gone up is density; per socket the heat output is similar (if not higher in the case of Solaris servers in the past versus x86 servers today). A "server" back then might fill a whole rack; today you often have more than one server per rack unit.

      Anyway if only a fraction of servers were operating, a "[US] football field" sized datacenter (5000+ sq meters) means there's a huge volume of air so it would take a while to heat it up.

      Your server room heated up very quickly because as you say it was very small, meaning a very small volume of air, so it was cycling through the fans a lot more often than if it was football field sized.
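      A back-of-envelope illustration of that volume argument, treating the room as nothing but air (so it ignores the thermal mass of racks, floor and walls, which slows real rooms down considerably); the 50 kW load is an assumed figure purely for comparison:

        # Temperature rise rate with no cooling at all: dT/dt = P / (rho * V * cp)
        RHO_AIR = 1.2    # kg/m^3
        CP_AIR = 1005.0  # J/(kg*K)

        def degrees_per_hour(heat_load_w, volume_m3):
            return heat_load_w * 3600 / (RHO_AIR * volume_m3 * CP_AIR)

        # Same 50 kW of powered-on kit in two very different rooms (3 m ceilings):
        print(degrees_per_hour(50_000, 5000 * 3))  # football-field-sized hall: roughly 10 C/hour
        print(degrees_per_hour(50_000, 50 * 3))    # small server room: roughly 1000 C/hour for the air alone,
                                                   # i.e. it would hit alarm temperatures within minutes

      Either way the point stands: the same heat load warms a small room roughly two orders of magnitude faster than a football-field-sized hall.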

    5. Mark 85
      Facepalm

      Re: chillers

      Way back when, we pushed hard for a backup a/c for server room. After about a year or so, the A/C contractor showed.... to install backup A/C in the exec suite.

      1. A.P. Veening Silver badge

        Re: chillers

        to install backup A/C in the exec suite

        There should have been a small "problem" with the A/C for the server room on the first Friday afternoon after that contractor left (around 17:59), preferably with a bank holiday or something similar on the following Monday.

    6. swm

      Re: chillers

      I remember, in winter, walking out of the company I worked for (on the top floor) with the head of the computer center and noticing that the wall was vibrating. I pointed this out to the head of the computer center and he got very excited. It seems that the water in the chiller on the roof had frozen and the motor was doing the best it could with a non-liquid fluid.

    7. Alan Brown Silver badge

      Re: chillers

      "forget that chillers *ARE* needed also in case of power cut."

      They're not needed on the UPS though. They can go down for a few minutes until the backup generators are running.

  4. anthonyhegedus Silver badge

    I once worked for a company in London, and we had a similar ‘event’. I was the IT manager, having landed that role by default after the entire IT department had been ‘let go’ in a cost-cutting exercise in the early 1990s. We had a relatively tiny data centre with just a few servers and a PBX. One summer we suffered an ‘electricity supply event’ mid-afternoon. The batteries took over immediately and within 20 seconds the generator in the basement kicked in. All seemed fine. The generator actually powered most of the building so truly everything was rosy. For about half an hour. The generator unexpectedly stopped. I was relatively new to the job, and had only been told that the generator had just been serviced and it should be able to run for days with the amount of fuel the fuel tank could store. All this time we were running on batteries, which were never checked or serviced in the last few years due to the cost-cutting exercise.

    So I went into the basement and tried to manually start the thing. It spluttered and stopped. I asked the facilities manager when the thing had last been refuelled and he said ‘never in the last 10 years’. So it was determined that it needed refuelling. We found a company that would send an emergency diesel truck round within an hour. They came, they started refuelling, and 100 litres or so later, the tank was full. I couldn’t understand it. The tank was supposed to last days, so surely that would be more than 100 litres?

    We scratched our heads and determined that, given that nobody had any documentation for the generator and that I was a relative newbie to the role, we would cut our losses and just run the bare minimum of equipment on the batteries as long as possible.

    That turned out to be 8 hours or so, as the power came back well into the evening (it was a 24-hour operation).

    The next day we managed to get some generator service people in. They quickly found the source of the problem. It turned out that there was a relatively small tank that fed the generator. It was fed from a massive tank that could hold several days’ worth of diesel. To get from the large tank to the small tank, there was a pump, which had seized. Due to it having a particularly robust design, there was actually a backup pipe and a backup pump. However, it too had seized. BOTH pumps had seized, presumably due to not really being used.

    This all goes to show that you not only need a plan for what to do in the event of a power outage, but you also need a service plan for the failsafe systems.

    1. Will Godfrey Silver badge
      Unhappy

      It'll never happen

      Servicing schedules are rarely implemented, and even if they are, they're the very first things that get the chop, once the bean counters find out about them.

      It seems quite impossible to get people to realise that these are not a cost, but a prevention of far greater cost - and that's quite apart from the reputation loss.

      1. Anonymous Coward
        Anonymous Coward

        Re: Servicing schedules ... get the chop, once the bean counters find out about them.

        Presumably beancounters understand the importance of (financial) audits, so perhaps describing it as some sort of "audit" might help? That said, I'm not sure I can come up with a name both sufficiently accurate and compelling...

        1. Doctor Syntax Silver badge

          Re: Servicing schedules ... get the chop, once the bean counters find out about them.

          Business resilience audit?

          Make it business related, not "just" IT. Maybe "resilience" isn't scary enough. Business incident survival audit?

          1. A.P. Veening Silver badge

            Re: Servicing schedules ... get the chop, once the bean counters find out about them.

            The real problem is that bean counters (and button sorters) don't have the first clue about business continuity and their job security, which happens to depend on the aforementioned business continuity.

          2. bpfh
            Thumb Up

            Re: Servicing schedules ... get the chop, once the bean counters find out about them.

            And test it once or twice a year... or just test the beancounter’s servers if you don’t have the current capacity :)

            1. Doctor Syntax Silver badge

              Re: Servicing schedules ... get the chop, once the bean counters find out about them.

              If you don't have the current capacity test everything except the beancounters' servers.

        2. Alan Brown Silver badge

          Re: Servicing schedules ... get the chop, once the bean counters find out about them.

          Beancounters understand it fully when you describe it as "if this fails, your job is toast"

        3. l8gravely

          Re: Servicing schedules ... get the chop, once the bean counters find out about them.

          I always describe it as an "insurance premium" that needs to be paid. We've never had a fire, but they don't mind paying that premium each month because it's a language they understand. So instead of using "audit" just use "insurance". And if they still balk, ask them if they pay for homeowners' insurance and whether they have had a fire at their house. And if not, ask why they are still paying for insurance they don't need.

      2. Tom 7

        Re: It'll never happen

        But the important thing is to have a plan, so when the beancounters tell you you can't implement it, they then have to take the blame, not you.

        1. Anonymous Coward
          Anonymous Coward

          Re: It'll never happen

          Yup

          Over the years, from time to time, I'd put forward a plan for some sort of option (backups etc). My line manager would say "You'll never get that through" and often would even refuse to put it forward. But my reply would be that this was what needed to be done. If the brass opted not to do it we (I) couldn't get the blame if we had the email trail.

      3. Trixr

        Re: It'll never happen

        Of course, there's the instance when the genny tests are done every 6 months like clockwork... but no-one remembers to get it refuelled.

        As happened to a govvie dept who shall not be named in Oz, when their well-tested genny kicked in like clockwork when they had an outage, but promptly died less than 15 min later from fuel starvation. Oops.

    2. Chloe Cresswell Silver badge

      Reminds me of a tale on usenet years ago about a new hospital being built - the doctors had the say in what went first onto the battery system in case of power outages. The idea of course being that vital equipment was on this, and everything else would be handled when the big generator powered up.

      IT got power for the systems and yes, even AC was powered.

      Testing the genset worked every time, till they got a real power outage.

      Vital systems went onto UPS, and there was the sound of the genset firing into life.. for a few seconds, then it coughed and died.

      Turns out what wasn't deemed "vital" was the lift pump for the main diesel tank - it was on the backup power system, but not the batteries.

      So it fired up using the diesel in the pipes, and the pump would just come on and start to pump diesel up to the genny.. and it wouldn't get there before the poor thing died from fuel starvation.

      IIRC the fix in the end was a new fuel pipe, which happened to be something like 10x the diameter needed... and held enough fuel for the genny to start and new fuel to be pumped up, as they weren't allowed a header tank - but a bigger fuel pipe wasn't a tank!

      1. phuzz Silver badge

        In NASCAR, 'Smokey' Yunick once found a similar loophole in the regulations about fuel. They specified the size of the fuel tank, but said nothing about the hose, so he fitted a car with an eleven-foot fuel hose, which held an extra five gallons of gasoline.


        1. Anonymous Coward
          Anonymous Coward

          I remember that from "Days of Thunder", when Robert Duvall's character talked about adding a line to Tom Cruise's chariot that could hold "an extra gallon of gas".

          I see that was in 1968 so predated the film. Fair play to the guy. 2 inch fuel line though, pretty dangerous in a crash, and they like to crash in NASCAR.

    3. Anonymous Coward
      Anonymous Coward

      I thought the punchline would be that people had been siphoning off the diesel into their cars...

      1. EVP
        Pint

        I thought the same! There even turned out to be a handy pump installed on the bigger tank to help when "loaning" some diesel. No siphoning needed. I was so disappointed to find out that it was a technical problem and the story didn’t end with a line like “...and guess what, the only person in the office driving a diesel powered car was one of the beancounters!”

        Have one on me -->

  5. Jos V

    Planning works wonders...

    … Unless the local population, with the help of security guards, have syphoned off all the diesel from the generators and sold it on the street..

    The fun you can have in a developing country :-)

    At my new posting, after lessons learned, they finally wired up the building chillers to the Genset, as it was cheaper than replacing hardware that went into meltdown. Just in time for our latest 24hr+ region-wide power outage...

    1. Anonymous Coward
      Anonymous Coward

      Re: Planning works wonders...

      Not just developing countries... we (at a UK site) have a couple of generators, one specifically as a backup to the main one, that only does our DC. It lives in our car park. We had the local Gypsies turn up in a van about 6 months ago; they had a pump, hoses and a tank in the back of the van. By the time the security guard got to them, they had already broken the fuel cap off the genny.

      Luckily our security guard is a big chap and I don't think they fancied the hassle... We've had to fence the whole area off now.

      1. Alan Brown Silver badge

        Re: Planning works wonders...

        "By the time the security guard got to them, they had already broken the fuel cap off the genny."

        I'd be tempted to use diversionary tactics.

        It might LOOK like the fuel cap for the genny, but it's connected to the sewage line. The real filler's elsewhere.

  6. trolleybus

    Why do they always forget the cooling?

    I was at a Telco in southern China, during the rainy season. We were working on a fairly beefy mainframe that had a short message service centre installed on it.

    The heavens opened at lunchtime, just as we crossed the road to the restaurant. A mighty thunderstorm rolled over, with sheets of water running along the road and constant lightning. Unsurprisingly, the lights went out. We weren't bothered. The customer's a telco, their power supply won't be affected.

    Once the storm subsided a little we swam back across the road. As we expected, the computers were all up and running. What we didn't expect was the cacophony of warning beeps due to overtemperature.

    Yes, the aircon wasn't on the UPS.

    Didn't really matter as the service wasn't live yet, but a timely reminder.

    1. Jos V

      Re: Why do they always forget the cooling?

      Yeah, I've not seen a DC where the UPS would drive the AC in the DC (yeah, pun intended), but you hope the gen kicks in soon enough to start driving things after they stabilise/sync up.

      Currently, I give it about 20 minutes before I order a complete graceful shutdown, as at 6 degrees below Neptune's wrath, it's never really cold outside.

      From that moment on, humidity becomes an instant issue, as everything condenses up once the AC kicks back in, so you'd have to wait for the center to "dry up" before switching things back on...

      And truth too to heavy rain... You just know you will have a fun day when kids are playing outside and swimming along with your taxi on the way to work.

      One more thing that's a killer by the way, is extreme drought. The grounding poles around the building stop reaching any sort of ground water level, and basically just poke into dried out soil. As in... no more grounding. Don't count on the local power supplier to be of any help.

      1. Anonymous Coward
        Anonymous Coward

        Re: Why do they always forget the cooling?

        Seems like a data center would be a perfect candidate for an Ufer ground. Of course it would have to have been designed to be a data center from day one, as you're designing the grounding to be part of the foundation.

        A simpler solution for already built buildings would seem to be a sprinkler system where the grounding rods are. Use grey water (from sinks etc.) if there's a restriction on using potable water for irrigation. Or heck, there's probably one hell of a condensate drip from the HVAC in that datacenter unless it is in a desert.

      2. Olivier2553

        Re: Why do they always forget the cooling?

        From that moment on, humidity becomes an instant issue, as everything condenses up once the AC kicks back in, so you'd have to wait for the center to "dry up" before switching things back on...

        Start the servers first, allow them to warm up, to dry the air, then only start the A/C. If you maintained the room closed enough during the outage, the air inside should not be that humid either.

      3. Criggie

        Re: Why do they always forget the cooling?

        My old boss was an ex-lineman, and used to tell a story about a farm dog who would bark whenever the phone was about to ring, but only in summer.

        Turned out the mutt was chained up to the ground stake, and in summer the ground was dry and a poor conductor. Dog was some metres away and with water bowls etc, was a better ground conduction path than the earth stake. 75 V ringing voltage through K9-grade conductor induced the barking.

  7. Korev Silver badge
    Mushroom

    The Event

    All the talk of "The Event" reminded me of this.

    1. Pascal Monett Silver badge

      Re: The Event

      That was funny !

      Have an upvote.

    2. Andy Taylor

      Re: The Event

      "The Event" or as it used to be called: Brexit.

      1. Muscleguy

        Re: The Event

        Not applicable to viewers in iScotland. Relief camps for asylum seekers on the Nether Britain side of the border have been established by ScotGov where your application for refugee status will be assessed. Do NOT try to cross the border, the haggis have been trained to attack sassenachs on sight.

  8. Anonymous Coward
    Anonymous Coward

    Trusty UPS's...

    At a previous employer, we replaced the 2U APC Smart UPS devices in about 11 server racks with 3 free-standing APC Smart-UPS VTs - providing about 45 mins run time for all the servers. A diesel generator was also on-site.

    The site was susceptible to brown-outs/power-loss, the idea being that if it happened, the backup batteries would kick in and the generator would fire up after about 1 minute of detecting no mains power to the site, relieving the batteries of their duty.

    We had our first power issue within a month of the new UPS's being installed...

    The worst sound ever in a server room is complete silence, and that's what met me when I returned to the site after getting an out-of-hours call.

    Power-loss, batteries *didn't* kick in, all servers died.

    Needless to say, the UPS installers were notified straight away - I believe some software patch/fix hadn't been applied, or a setting hadn't been configured correctly.

    1. Doctor Syntax Silver badge

      Re: Trusty UPS's...

      Nobody thought to test the installation before going live?

      1. Anonymous Coward
        Anonymous Coward

        Re: Trusty UPS's...

        Well I suspect the UPS installers and AC and the other IT guys more than "thought" about it, but management vetoed it because they didn't want to take the outage. Testing power protection is simple when it works, but when it fails it might take a day or two to get everything back up and running.

        We all know places where getting a few hours of planned downtime for an entire datacenter is almost impossible, the only way you get a day or two is when it is unplanned :)

        1. Doctor Syntax Silver badge

          Re: Trusty UPS's...

          Hence test it before going live.

          1. Anonymous Coward
            Anonymous Coward

            Re: Trusty UPS's...

            The project was REPLACEMENT of existing UPS systems, so the datacenter was already live. You can't load test a UPS without a load, and if you aren't allowed to risk that load going down...

            1. dajames

              Re: Trusty UPS's...

              The project was REPLACEMENT of existing UPS systems, so the datacenter was already live. You can't load test a UPS without a load, and if you aren't allowed to risk that load going down...

              Surely you have a backup datacentre (with its own UPS, and backup UPS) at another site to which all your workload will automatically switch if the primary datacentre goes down.

              You need that anyway, if your work has any importance at all, but without it how DO you test the UPS ...?

              1. Anonymous Coward
                Anonymous Coward

                Re: Trusty UPS's...

                Sure, but then testing your UPS is potentially a test of your datacenter failover which management ALSO hasn't given you time to properly test since the environment went live.

                Some people here seem to be a lot more idealistic than those of us who have "been there, done that" a few too many times. Maybe due to my years of consulting and being exposed to a lot more environments than most I'm too cynical.

                The idealistic ones are either quite young, or they have worked for the same company for many years and it is one of the rare ones who does things right - does regular full DR tests, does regular tests of file/DB backups including full from scratch restoration, acts on all the results of security audits as quickly as possible instead of getting waivers from management for the same list of risks year after year because addressing them is deemed too expensive/time consuming, and all the other lovely things that everyone "should" be doing but a minority of enterprises actually "do".

                1. A.P. Veening Silver badge

                  Re: Trusty UPS's...

                  If it isn't tested, it isn't working properly, so I won't sign off on it - and I put that in writing. Always nice when the auditors see that one.

                2. Kiwi
                  Pint

                  Re: Trusty UPS's...

                  Some people here seem to be a lot more idealistic than those of us who have "been there, done that" a few too many times.

                  I've often found the best cure for idealism is putting them in a RW environment where they have 6 days to complete a 5.9-day task, and telling them they have to do two full outage tests as well as full backup/recovery tests in that time - hint that if the system works they'll lose a couple of hours, if it fails they'll at best lose a couple of days. Up to them when to test it, and hint it may not be noticed if one of the tests is missed.

                  The sort of time pressure that can be faced when dealing with business-critical systems and taking things off-line to test works wonders on idealist fantasies!

                  (Ok, was up till 4am banging my head against someone's IT problem (Christmas is paid for!), so not at my best example-making atm! :) Hope the concept comes across: the 1-2 hours of work I suggested 6 months ago would've saved them a packet if they'd implemented it then, instead of giving me an urgent phone call during my relax time)

            2. Alan Brown Silver badge

              Re: Trusty UPS's...

              "You can't load test a UPS without a load"

              This is why you have an array of switchable 100W lamps as a dummy load.
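
              If you want a feel for how many lamps that means, here's a minimal Python sizing sketch - the UPS rating, power factor and target load fraction below are purely illustrative assumptions, not figures from this thread:

              import math

              UPS_RATING_VA = 10_000      # assumed 10 kVA UPS
              POWER_FACTOR = 0.9          # assumed output power factor
              TARGET_FRACTION = 0.8       # test at roughly 80% of rated capacity
              LAMP_WATTS = 100            # incandescent lamps, near-unity power factor

              target_watts = UPS_RATING_VA * POWER_FACTOR * TARGET_FRACTION
              lamps = math.ceil(target_watts / LAMP_WATTS)
              print(f"Target load {target_watts:.0f} W -> {lamps} x {LAMP_WATTS} W lamps")
              # -> Target load 7200 W -> 72 x 100 W lamps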

              1. [VtS]Alf

                Re: Trusty UPS's...

                Sure... an array of 100W bulbs. We all do that to replace a UPS and test the load. Then again, the fire system gets tested quarterly, but have you ever tested those red flasks filled with inert gas at 300 bar? Just because the detectors and the controller are tested, who knows if it all works at the vents when the signal is given from the main controller? Ever done a test on that system to verify that the fire department is actually notified? Ah right, that line was an ISDN line... just being decommissioned by the provider...

                My point is: you _could_ do all that, but when is enough, enough? And shouldn't you be able to trust your supplier and testers to tell you all is OK?

  9. Richard_Sideways

    It's getting hot in here...

    A former employer had a primary server room based out of their original site office near Old St. Lon. Based on the top floor of an old building, the AC was a mish-mash of consumer office units which had a penchant for icing up and leaking all over the 'artistic pamphlet' photographic studio downstairs. During a particularly warm summer in the early 00's, with 2 of the 3 units on the fritz, in order to get the temp back down to below 100 we took the decision to take the windows out of the server room partition to try to ventilate into the empty office which still had operational AC.

    No server ops worth their salt should be without at least some skills as a glazier.

    1. This post has been deleted by its author

    2. Antonius_Prime
      Boffin

      Re: It's getting hot in here...

      "No server ops worth their salt should be without at least some skills as a glazier."

      I believe Adam Savage said it best, when naming his book:

      "Any Tool is a Hammer."

      Related icon. Need at least a degree to properly handle the KVM switch as a makeshift glass breaker... :)

      1. A.P. Veening Silver badge

        Re: It's getting hot in here...

        properly handle the KVM switch as a makeshift glass breaker

        I found I get better results when I use a manager as a makeshift glass breaker. The improvement is usually in management after the event.

  10. chrishansenhome

    Bribery over a new server

    All this reminds me of what General Eisenhower said about the D-Day landings and the preparation for it: "In preparing for battle I have always found that plans are useless, but planning is indispensable."

    1. This post has been deleted by its author

      1. Kubla Cant

        Re: Bribery over a new server

        More pithy still: "everybody has a plan until they get punched in the mouth".

        But it looks like Eisenhower was saying something a bit more subtle. Moltke just says things won't go according to plan, Ike adds that even so, it's essential to have made a plan.

        1. Muscleguy

          Re: Bribery over a new server

          So you have thought about as much as possible and planned for it, and the troops, sailors and airmen have trained for it, so even if bits of it go pear-shaped or no orders come, people still know what to do.

          This is where dictatorships get it wrong: nobody is allowed/too scared to make a decision beyond following orders. Stalin had so purged the military prior to the Nazis invading that the Russian military just fell over. Stalin was forced into allowing commanders to do what they thought was best without direct orders from him to stem the invasion.

          Even during the war they were taking effective officers out. Solzhenitsyn was an artillery officer in East Prussia when he was arrested for student discussion group activity and sent to the Gulags. The Nazis too dispensed with all sorts of clever, creative people because they were 'wrong' in some way. And even Democracies can get it wrong, look at what Britain did to poor Alan Turing.

          1. Kubla Cant

            Re: Bribery over a new server

            This is where dictatorships get it wrong, nobody is allowed/too scared to make a decision beyond following orders.

            Apparently one of the greatest strengths of the German army during WW2 was its capacity to improvise and adapt. So not all dictatorships get this part wrong.

            1. A.P. Veening Silver badge

              Re: Bribery over a new server

              That worked very well as long as the German army appeared to be winning. Once the tide turned, some upstart corporal thought he knew better than his generals and started micromanaging battles and forbade retreats. The latter caused lots of unnecessary casualties*), causing necessary troops to be unavailable for subsequent battles. This was partially offset by commanders refusing to obey that order ("Vorwärts Kameraden, wir müssen zurück" - "Forward, comrades, we must retreat").

              And putting the head of counter intelligence (Admiral Canaris) into a concentration camp (Flossenburg) didn't really help either.

              It really was a shame Von Stauffenberg didn't succeed in his assassination attempt.

              *)on both sides, but for the point of this argument only the German side is important

  11. This post has been deleted by its author

    1. Pascal Monett Silver badge

      That is quite true, but I believe there is a name for that lack of judgement (not that I remember it right now). The plant was supposed to be protected by the levees that failed, so since it was deemed protected, there was no problem with putting the generators in the basement.

      Of course, if anyone had had the guts to observe that putting electric generators in the first place the water is going to go to was a daft idea, they would probably have been fired.

      So that's where they put them, and Mother Nature then said : "Bad idea".

      The rest, as they say, is history.

      1. Anonymous Coward
        Anonymous Coward

        The plant was supposed to be protected by the levees that failed,

        I recall the levees were of the size required by regulation at the time of construction; it's just that the earthquake/tsunami was bigger than allowed for by those.

        It's probably somewhere in here (I think it's implied on p.9) ...

        https://www-pub.iaea.org/MTCD/Publications/PDF/AdditionalVolumes/P1710/Pub1710-TV2-Web.pdf

        1. Muscleguy

          Re: The plant was supposed to be protected by the levees that failed,

          That is how design and engineering work. You ask the scientists for the projected likelihood of earthquakes over magnitude 8, or typhoons over category 5, over the lifetime of the plant and build accordingly.

          What else can you do? I grew up in NZ which, like Japan, is prone to earthquake, volcanic eruption, tsunami and ex-tropical cyclones (hurricane/typhoon in the South Pacific). The inside back page of the phone book contains the civil defence instructions for your location. When we lived down on the ex salt flats 150m from the beach, it was where we had to get to in a direct line away from the sea if the warning sirens went off (there was one on a pole at the end of the road). When my sister lived on the lower slopes of the Taranaki volcano they were in the lahar evacuation zone.

          The advice in NZ is if you live in a tsunami exposed place any quake which is long and strong should cause you to bug out for higher ground and you shouldn't wait for a warning. My daughter in NZ and most sensible people have emergency bags packed with tinned/packet food, bottled water, batteries etc. which can be grabbed in an emergency. There was a big fire down the hill from them last year and they were on the verge of having to bug out.

          The buildings are built to not fall down after quake strength X. Note: not to be usable afterwards, just not to fall down. Buildings in the capital, Wellington, have been evacuated pending strengthening or demolished after the Kaikoura quake. The govt is engaged in a program to compel/help building owners to bring them up to code as a result of an assessment prompted by the Christchurch and Kaikoura quakes.

          Christchurch was bad because the orientation of the waves caused them to reflect off the Port Hills (volcanic granite), interfering with incoming waves and producing bigger peaks and troughs than the initial quake generated.

          The Kaikoura quake caused uplift of up to 10m; apparently the noise was horrendous. How do you plan for infrastructure which won't break when the land is uplifted 10m under it? There's a river on the West Coast of the South Island, in an extremely wet area, whose road bridge is just a Bailey bridge, because no engineering can withstand that river in flood, with debris. So they have a bridge which can be replaced easily and keep replacement spans stored safely and locally. It went down recently and was back up inside about a week. One of those huge mining dump trucks was ferrying cars across in the meantime, one at a time.

          1. Kiwi

            Re: The plant was supposed to be protected by the levees that failed,

            The Kaikoura quake caused uplift of up to 10m, apparently the noise was horrendous. How do you plan for infrastructure which won't break when the land is uplifted 10m under it?

            I remember the Kaikoura quake well: later, the civil defence sirens going off during the wee hours of the morning, the carloads of people heading away from the low-lying areas and parking on my lawn (by invite, to make as much room as possible on the street). Hearing civil defence sirens used 'in anger' is not something I wish to repeat!

            I missed the earthquake light (being in a brightly lit house at the time), but there's great footage of it at http://www.youtube.com/watch?v=GZ4JJSrQXqI_2vs done by a local scientist. I was living on the hills in the background of this footage, and we lost power for a good 12 hours or more.

            No buildings fell, but a lot were badly damaged and cleanup/replacement still continues. Oddly, the many old state houses the National government were giving away to their property developer mates (taken out of circulation for lack of use, being sub-code and a "quake risk") seemed to come through unscathed, while many much newer "up to/exceeding code" buildings got a date with the wrecking ball - we're still a bit nervous after the 2nd Christchurch quake and the losses that caused :(

            How do you plan for infrastructure which won't break when the land is uplifted 10m under it?

            If you look at https://www.chchquake.co.nz/ you'll see a photo of a rather kinky bit of railway track (IIRC Edgecumbe but ICBW) - imagine how few buildings could survive that sort of an event to their foundations? (that's actually by a railway bridge - lucky no passenger trains were approaching that during the night!)

            There's footage on YT of the Kaikoura quake that shows a rift running along some hills - somehow a small dam survived (https://www.youtube.com/watch?v=U3H8wlzXGYE - see also the linked "farm track" video near the end of that one!)

            (I grew up in South Taranaki, sorta between Hawera and Patea - the impending eruption of Mt Egmont was the stuff of nightmares for some of us, but so far hasn't happened in my lifetime - but getting shaken by quakes, one less than 100m below the surface near the Kapuni refinery [shudder].... We build stuff in convenient yet stupid places sometimes!)

            1. Muscleguy

              Re: The plant was supposed to be protected by the levees that failed,

              My youngest was up on the rock of Opoho in Dunedin and barely felt it. Here in Dundee we had RNZ online on (they were FANTASTIC) and they reported on people from down in South Dunedin who had self evacuated to friends in Opoho. We used to live down by the beach in Dunedin with sandy soil and water two spade depths down in high summer. We felt a few quakes including one big enough to be duck and cover time.

              To have been in that area while the land heaved and groaned must have been some experience.

              One advantage of old-style wooden houses on pile foundations is that if a quake moves them off the piles it is fairly straightforward to lift them back up again. When the concrete slab base of a modern house gets cracked by a quake, that house is toast. The idea though is still that nobody dies in a collapsed house, and that one held (the guy in the part-collapsed homestead had a heart attack). That the home cannot be lived in afterwards is a secondary question.

              Those big solid State houses are not somewhere to be if they do fall down. Especially those with tiled roofs.

              1. Kiwi
                Pint

                Re: The plant was supposed to be protected by the levees that failed,

                That the home cannot be lived in afterwards is a secondary question.

                Many of those "dangerous" places stood for long after the quakes and the thousands of aftershocks (many quite large).

                But there is the risk of other problems, like water ingress leading to further damage. A wooden home can be repaired, but the question becomes at what cost? Is it cheaper to demolish and rebuild, or to go through opening up walls, repairing damage, and closing in again? And can you trust it afterwards?

                Friends of mine live in old state brick houses, or never-state but same design places. They get very nervous in quakes (and will often quickly leave the house to the back yard despite the official advice (years back, not sure if it's the same today) NOT to do so) - a brick-house can fail quite badly whereas a wooden or reinforced concrete house can fail where a wall will collapse but largely stay as a single slab, leaving voids where people can survive.

                I still cannot get over it... My place rode through the Kaikoura quake and many others with no visible damage, despite being a horrible place that moves violently if a flea farts in the neighbour's garden. Yet modern buildings only a few km away were badly damaged. The repairs in Lower Hutt are still ongoing (they're just starting to rebuild the picture theatres in Queensgate Mall).

                I used to live in a sandy coastal area myself for a short while. I know what you mean by feeling the ground move with even small quakes! Liquefaction was often a big worry though I never saw it.

        2. Alan Brown Silver badge

          Re: The plant was supposed to be protected by the levees that failed,

          "I recall the levees were of the size require by regulation at the time of construction; "

          _At the time of construction_, GE engineers noted the location of the generators and had a pink fit, demanding that they be relocated to higher ground instead of at the lowest point of the main plant where they were most vulnerable.

          Japanese TEPCO management smiled, patted the overanxious Americans on the heads and promised they'd do so straightaway....

    2. Arthur the cat Silver badge
      Facepalm

      Re: "power was naturally backed up by a generator that was seated on the roof"

      At least the things were in a (fairly) sensible place. If they'd put the backup generators on the roof, rather than the basement, at Fukushima, they'd have four perfectly serviceable reactors now.

      ISTR that when Hurricane Sandy hit New York there was a data centre in Manhattan that boasted that it would be OK as its backup generators were on the roof. Unfortunately the diesel tanks were in the basement to save money.

      1. Anonymous Coward
        Anonymous Coward

        Re: "power was naturally backed up by a generator that was seated on the roof"

        Why would being submerged be a problem for a diesel tank, assuming when it was filled they "screwed the cap on tight", so to speak? If it needs to admit air (to replace the volume of diesel being pumped out) it just needs a "snorkel" of sorts that goes high enough.

        Sort of like those military vehicles that can ford streams and get in water up to their hood - the fuel tank is clearly submerged, as is the engine, but the engine air inlet and exhaust snorkels are above the waterline.

        1. Richard Jones 1
          FAIL

          Re: "power was naturally backed up by a generator that was seated on the roof"

          The tanks are vented so that as fuel is drawn out, air - or in that case water - can go in. Even if fuel was not being drawn out, water going in can be a bad idea: water is just a bit heavier than air.

          1. A.P. Veening Silver badge

            Re: "power was naturally backed up by a generator that was seated on the roof"

            It seems you failed to read that part about the snorkel on the vent to let air in.

          2. Alan Brown Silver badge

            Re: "power was naturally backed up by a generator that was seated on the roof"

            "water is just a bit heavier than air."

            And a bit heavier than fuel too....

        2. Arthur the cat Silver badge

          Re: "power was naturally backed up by a generator that was seated on the roof"

          it just needs a "snorkel" of sorts

          Needed a snorkel, yes. Had a snorkel, no.

  12. Anonymous Coward
    Anonymous Coward

    In a server room where I used to work, we had ceiling mounted aircon units. These units had crappy little condensation pumps which were basically just a motorised cam squeezing a rubber tube. Every 3 months or so, the cam would wear a hole in the tube and the aircon unit would start dripping water into the server racks.

    The "solution" was to hang large buckets below each unit. Every time a pump failed, the bucket below that unit would start to fill up and we had to siphon the water out every couple of days until the engineer came to change the pump.

    1. Will Godfrey Silver badge
      Coat

      Excellent! So you had a failsafe emergency backup and warning system. You should patent it.

  13. jmch Silver badge
    Headmaster

    Measuring standards pedant alert!

    "...server room was about the size of a football (US, not soccer) field..."

    Excellent use of standards specifications there! American football has a standard-size field - 100 * 53.3 yards. Comparing an area to that of an American football field actually makes some sort of sense.

    A football pitch conforming to FIFA rules can be 50-100 yards (45-90 m) wide and 100-130 yards (90-120 m) long. So a (real) football pitch can vary between 4050 and 10800 sqm, and comparing any sort of area to it is nonsense.
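
    The ranges are easy to check for yourself - a quick Python sketch, with the pitch limits taken from the FIFA figures above and the American field being the 100 x 53.3 yd playing area as quoted (end zones excluded):

    MIN_W, MAX_W = 45, 90      # pitch width (goal-line) limits, metres
    MIN_L, MAX_L = 90, 120     # pitch length (touchline) limits, metres
    YD = 0.9144                # metres per yard

    print(f"Smallest legal pitch: {MIN_W * MIN_L} sqm")             # 4050 sqm
    print(f"Largest legal pitch:  {MAX_W * MAX_L} sqm")             # 10800 sqm
    print(f"American football field: {100*YD * 53.3*YD:.0f} sqm")   # ~4457 sqm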

    1. This post has been deleted by its author

    2. pig

      Re: Measuring standards pedant alert!

      So you would be allowed a square one? 100x100yards?

      That would be interesting.

      1. Rich 11 Silver badge

        Re: Measuring standards pedant alert!

        The game would be much more fun if the pitch had to be triangular.

        1. Anonymous Coward
          Anonymous Coward

          Re: Measuring standards pedant alert!

          Only if there were three teams on the field competing against each other.

      2. jmch Silver badge

        Re: Measuring standards pedant alert!

        "So you would be allowed a square one? "

        No, it specifies "The touchline must be longer than the goal-line". I guess in theory you could have a 101 x 100 yard quasi-square. In practice the ratio used is usually around 3:2.

        Also, those are the general rules of the sport. Individual competitions can and do further restrict the sizes, but in any case there is still an allowed range, not a definite size. For example, for FIFA international games the length is 110-120 yards and the width 70-80 yards.

        http://www.thefa.com/football-rules-governance/lawsandrules/laws/football-11-11/law-1---the-field-of-play

        1. J.G.Harston Silver badge

          Re: Measuring standards pedant alert!

          In Bangkok they manage to create football pitches that aren't even rectangles!

    3. Adam 1

      Re: Measuring standards pedant alert!

      I prefer to remove all ambiguity and just use 256 nanoWales.

    4. batfink

      Re: Measuring standards pedant alert!

      Yes - as a fellow pedant, this has never ceased to amaze me. Standards? We've heard of them...

      Having grown up with Metric, I often have fun with people in the UK who claim to prefer Imperial measures. It's amusing to watch them flounder when asked about area measures. But knowing that soccer fields vary wildly in size, I'm still surprised how often I get the answer of either "the size of a football field" or "half a football field" when I ask someone to tell me how big an acre is.

      Go on - next time you're in t'pub, ask someone "If you had a square piece of land an acre in size, how long would each side be?".

      1. The March Hare

        Re: Measuring standards pedant alert!

        1 acre is 10 chains per side if I recall correctly! :)

        1. Anonymous Coward
          Anonymous Coward

          Re: Measuring standards pedant alert!

          Not quite, but right idea. 1 acre is 10 square chains, or 3.16 chains per side.

          1. batfink

            Re: Measuring standards pedant alert!

            Yep - or a Chain by a Furlong.

      2. Rich 11 Silver badge

        Re: Measuring standards pedant alert!

        About 70 yards. I remember working that out at school -- (1760*1760/640)^0.5 -- because I really had no feel for how large an acre was and couldn't have looked at a patch of land and told you its area. But once you can picture how long a sprint track is, or even imagine yourself taking 25 paces and triple it by eye, you can start to make a reasonable estimation by proportions.
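
        For anyone who wants to check that schoolroom arithmetic, here's a short Python sketch using the standard definitions (1 mile = 1760 yards, 640 acres to the square mile, 1 chain = 22 yards, 1 furlong = 220 yards):

        import math

        SQ_YD_PER_ACRE = 1760 * 1760 / 640        # one square mile is 640 acres -> 4840 sq yd
        side_yd = math.sqrt(SQ_YD_PER_ACRE)
        print(f"Side of a square acre: {side_yd:.1f} yards")    # ~69.6 yards

        # Cross-checks from the thread: 10 square chains, or a chain by a furlong
        CHAIN_YD, FURLONG_YD = 22, 220
        print(10 * CHAIN_YD**2)                   # 4840 sq yd
        print(CHAIN_YD * FURLONG_YD)              # 4840 sq yd
        print(f"{side_yd / CHAIN_YD:.2f} chains per side")      # ~3.16 chains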

  14. Andytug

    Another bad planning example

    Remote site power failure, going to last a while, so to help out the emergency generators an extra genset was trucked in and put as close to the building as possible.

    With the exhaust right under the intakes for the building ventilation.

    You can guess the rest.....

  15. T. F. M. Reader

    Specifications

    Years ago, while working for the Research division of a company with a big striped blue logo producing big heavy black servers (among other things) I encountered a data sheet or a spec of some sort pertaining to a server system. The document - whatever it was - stated that the system had to remain operational in quite a wide range of ambient temperatures. The upper limit was particularly interesting - quite a bit higher than one can encounter anywhere on Earth and definitely not survivable for any length of time. The Fahrenheit 90s quoted in the article are nowhere close.

    So I invested quite a bit of effort tracing the origins of the specification, out of sheer curiosity. It was completely unrelated to my work. Eventually it was explained to me - as a sort of oral tradition rather than a properly documented rationale - that it all stemmed from theoretical computations of what would happen in a locked, unmanned data centre in an Indian jungle if cooling failed on a Friday afternoon and no one would be able to reach the place before Monday.

    I still don't know whether I was sold an urban legend - I've never been in a data centre in an Indian jungle.

    1. A.P. Veening Silver badge

      Re: Specifications

      Yours may be the result of an urban legend, but I know for a fact that a couple of computer manufacturers changed their altitude specifications (after testing) to be able to sell those computers to Schiphol Airport, which happens to be several meters below sea level.

      1. Herby

        Re: Specifications

        Schiphol Airport, which happens to be several meters below sea level.

        Do the full environmental test, and go to Death Valley. Temperature and elevation in one simple test. Of course it might be more comfortable at night.

    2. julian.smith
      Facepalm

      Re: Specifications

      I understand that the roof of the main railway station in Kuala Lumpur has to be able to bear a foot of snow.

      1. A.P. Veening Silver badge

        Re: Specifications

        Packed snow or fresh fallen?

  16. David Given
    FAIL

    Testing is hard

    ...I'm reminded of a Cautionary Tale they told me during a really good computer risks course at university. It may even be true. This parable goes like this:

    A massive data centre, of the too-important-to-go-down, millions-of-dollars-an-hour financial loss if it went down kind, had a set of redundant backup power supplies. Battery, diesel, that sort of thing. They did regular tests, their failover all worked, everyone was happy.

    Then someone dug up the power cable with a JCB and the power went out for real. Clunk, the batteries kicked in; clunk, the generators started up; all as per testing, and everyone was happy... for three minutes. Then the generators stopped. Fruitless panic ensued for a few minutes until the batteries ran out and the datacentre settled into the peaceful silence of death.

    Turned out that while the datacentre was powered from the generators, the fuel pump for the generators was wired to the real mains, and of course the tests never picked this up because they never cut the power for real (the datacentre being too important to risk bringing down).

    There were two morals to this story:

    - if you want to check that your datacentre keeps running even if someone digs up your power line with a JCB, then the only way to do this is to dig up the power line with a JCB.

    - Everything fails. Plan for it.

    1. Terry 6 Silver badge

      Re: Testing is hard

      This fits in with a few of the other stories and my own experiences ( non-dramatic I'm glad to say).

      Underlying all of these is planning that only extends to the current layer or so...

      We have a resource, we have a risk to the resource, so we plan for that risk.

      But then they stop. No one risk assesses the resource that is being used to protect the essential resource right back to the point of independence.

      With luck they'll check that there are systems in place to make sure the backup is in a safe place. But no one checks to make sure that the location that holds the backups is safe.

      As in "Yes we keep copies of all the records ( in the days of paper) in the basement next door"

      But when there was a big flood (it's a very, very long time ago now that I came across this, but I think an enormous water tank in the roof burst or something) the water got right down to both cellars - including the ones with the spare sets. If my memory serves me right the main copies were in better condition (slightly damp) than the spares.

    2. Peter Galbavy

      Re: Testing is hard

      For those older and with long memories, Demon Internet used to run out of Hendon Lane - a converted church hall type office. We eventually had a large UPS and noise-insulated genny installed, after much gnashing and wailing of teeth (and Cliff was an accountant so he was both bean counter and MD). After a very, very long all-hands weekend getting the downtime sequence done and the wiring moved, along with new component issues (three phase versus single phase contactors or something...), it was all back up and running at some point.

      "So, it's all working and the batteries up to charge?" asked Cliff.

      "Yep!" said the proud contractor

      "OK then" says Cliff and proceeds to turn off the grid feed.

      .... waiting, waiting ...

      It worked, thankfully.

  17. Anonymous Coward
    Anonymous Coward

    Access helps

    Gotta be anonymous this time...

    Had a good one at my office just last week - power went out but the UPS kept the comms room going, though not the air con. Not normally a big deal, as we will go in and shut down all the test/dev gear and leave the production network kit up; it runs a little warm but no problem. This time though the access control system's backup power batteries decided they didn't want to play ball. The maglocks were nicely energised, but the card readers weren't! The room got pretty warm in no time at all and no one could get in...

    Another one from years ago: where I used to work had a power-down and planned to run off the generator for the weekend, but when the moment came it failed to run - someone had stolen the diesel from the generator the night before!

    1. Olivier2553

      Re: Access helps

      The maglocks were nicely energised, but the card readers weren't!

      That's why I planned a keyed switch on the cabling that energises the maglocks. No need to have multiple copies of the key: keep it in a safe place outside of the locked area and you can kill the electronic security at any time.

  18. iron
    Facepalm

    Last year the board at my current employer, a charity, decided in their infinite wisdom that we can't have any downtime. At all.

    So ops bought enough batteries to power our server room for 8 hours at a cost of tens of thousands of pounds. I'm told we can't have a generator due to proximity to a government office.

    A few weeks after they were installed there was a power cut late in the day. My boss and everyone from ops had already gone home so I ended up telling senior management that services should still be up but I couldn't check because like them my PC was off and as a dev I have no access to the server room. Only for us to hear all the fans stop... 10 minutes after the power cut.

    Apparently the cooling for the server room uses the same AC as the rest of the office and no one thought to provide backup power for it. So 10 minutes into the power cut everything overheated and shut down.

    Since then there have been several meetings and discussions and we were told something MUST be done, but they aren't willing to actually fix the problem properly, so nothing has been done and our DR policy states the servers have 8 hours of battery backup. But we know they will shut down after 10 minutes.

    1. vulture65537

      Have you written your cloudy serverless proposal?

  19. Blofeld's Cat
    Flame

    Hmm ...

    I worked on a site with a pair of automatic standby generators where the associated fuel tanks were located quite a distance away.

    Some years earlier a safety audit identified potential leaks from these fuel pipes as a "severe" hazard, and resulted in the tank outlet valves being padlocked shut "when not in use".

    Fortunately this modification was discovered during a routine generator service, when the engineer asked for the keys.

    I gather the Production Director had a few brief words (see icon) with the people responsible, and a different solution was put in place.

  20. Anonymous South African Coward Bronze badge

    Talking about aircons and such reminded me to put an order in for our annual aircon check, as summer is coming.

    Better be proactive and all that before something goes pear-shaped.

    Still need a second aircon as a backup. Manglement knows about that though.

    1. Kiwi
      Pint

      Still need a second aircon as a backup. Manglement knows about that though.

      There's a BOFH tutorial available for that problem.

      (Sorry for the off-site link, El Reg's archive link points to a 404 :( )

  21. cutterman

    In South Africa they wouldn't just steal the diesel, they'd steal the tanks too.

    And quite possibly the genset and certainly any copper wiring lying around.

    Mac

  22. J.G.Harston Silver badge

    There used to be US home improvement programmes on one of the UK cable channels, and I used to watch in horror at the US electrical wiring systems. Wires held together with toothpaste tube caps. EVERY SINGLE SOCKET on a radial.

    1. Anonymous Coward
      Anonymous Coward

      As someone who left the UK and settled in the US, eventually buying a house, trust me, that is only the tip of the iceberg. It's a miracle there is anyone over 60 alive in the US given their electrical practices from back in the day.

      - knob and tube wiring? Check

      - aluminium wiring that has a habit of unilaterally disconnecting from its junction? Check

      - wires that are joined by twisting together, squeezing on a wire nut and hope? Check

      - feeds for houses where the mains comes in about 30 feet off the ground in a sort of lamppost affair and just casually drops into the mains panel? Check.

      And that's before the plumbing...

      1. Anonymous Coward
        Anonymous Coward

        Yep, my cousins in Sacramento have mains electricity that comes from a pole and down into the house (made, btw, largely of wood).

        And from time to time the cables break and there's sparks.....

    2. Herby

      Toothpaste tube caps...

      These are actually called "wire nuts", and have a small helical spring thingy inside. They actually work quite well, can handle the required current and are VERY sturdy.

      Yes, we here in the USA (and Canada to some extent) have pretty strict electrical codes to follow, and for the most part it works quite well.

      1. Olivier2553

        Re: Toothpaste tube caps...

        You have codes, but they look very weak compared to what exists in Europe. At least in France... Like sockets next to a sink: if I recall well, in France you have a 1 meter envelope around a sink where you cannot have any socket or light fixture.

        And why have ELCBs fitted only in the wet rooms and not for the entire house? And why have the ELCB reset on the socket itself? If it tripped, it means there is a problem. Do you want to be in the same wet room when you reset the tripped breaker?

        Not to mention when a guy has to install some equipment to generate 3-phase AC for some workshop equipment, not considering that it means more expensive single-phase input electricity at higher amperage, more expensive cable to accommodate that current, 20% waste on conversion, etc. And the reply is: no, it cannot be upgraded.

        1. wjake
          Happy

          Re: Toothpaste tube caps...

          Because we Yanks are not stupid enough to stand in a flooded room and reset the GFCI breaker, without fixing the problem and cleaning up the room first! It also localizes the outage and allows you to identify the problem.

          That said, my apartment has one circuit for all the GFCI breakers (balcony, bathroom, 3 in kitchen). If it trips for a non-GFCI reason, it usually means I plugged in something that takes too much power in the room I'm in at the time. It all works!

          1. Olivier2553

            Re: Toothpaste tube caps...

            But why GFCI only in the wet rooms? You can spill your coffee on your computer in your office too.

  23. Anonymous Coward
    Anonymous Coward

    Perfect system

    All these stories make me wonder if anyone has ever come across a good system which has stood up to multiple failures really well?

    Where I used to work we had 4 RUPS which ran continuously - we ran 'off' them all the time. We could run off just 2 if pushed and routine tests where the external mains was completely disconnected were carried out and we would run on diesel for several hours just to make sure everything definitely worked.

    It had not worked perfectly from day 1, but the early tests revealed several issues in design, procedures and maintenance routines.

  24. DanceMan

    Re: Wires held together with toothpaste tube caps

    Called "Marrettes" locally. They have a copper spiral inside and when properly installed the wires being joined are wound around each other tightly as well as all being in contact with the copper spiral of the marrette.

    https://en.wikipedia.org/wiki/Twist-on_wire_connector

  25. Anonymous Coward
    Anonymous Coward

    Sweet memories

    Oh I had some 'fun' in my younger days.

    The 'initiation' test of any new Operator was to get locked in the generator room for an hour while generator tests were run. You had to be quick on your feet before the Senior Op slammed the door on you. Big old Marine Diesel engine. We even used to have to dip the tank with a dip stick to see how much fuel was in it.

    Another company, we had a generator room on the other side of an access controlled computer room. Great but for a few things.

    1) When the power failed, the generator failed to kick in.

    2) The locks stayed 'locked', so we had to break the doors down to get to the generator.

    3) In the dark, we had to locate a torch, find the generator switch, then jam a screwdriver in it to turn it on (the switch was broken and waiting for a replacement).

    Oh, and then there was the time a pipe froze in the generator room and flooded it. Screwed the batteries. Couldn't get replacements for months.

    Oh, such fond memories <wipes tear from eye>.

    1. ICPurvis47
      Holmes

      Re: Sweet memories

      In my early days as a draughtsman for a turbine manufacturer, I was located in a drawing office on a mezzanine above the Development department shop floor. There was a huge twelve cylinder diesel engine in an adjacent lean-to in order to provide electrical power during the winter, because the site was very expensive to run from the incomer, which probably needed upgrading anyway. The toilets were on the mezzanine too, right up against the party wall for the lean-to, so if you went into a cubicle when the generator was running, the whole place was shaking and vibrating as if you were on a ship at sea.

  26. Long John Brass
    Flame

    Water cooled

    Remember a story from the Auckland power crisis in the 90's...

    Shiny new building, Shiny new diesel generator capable of running the whole building for days! Only problem was this was a fancy new water cooled unit, and the water pump was driven from.... You guessed it, the RAW mains. Cue major city wide power cut, Cue one melted fancy shiny new diesel generator.

    Our ancient, wheezing, clunky, broken, with a couple of shot big-end bearings, air cooled generator; trucked on, and on, and on :)

  27. Grinning Bandicoot

    Another place, another time redux

    Sometime in the 1990s, in an area commonly called Palm Springs, the local TV broadcast site was located within sight of the city center. The broadcast site was served by a single power line with very good generator backup. The equipment on the hill could report all varieties of alarms remotely [alarms, not values]. Then came the DAY off-air, and the usual calm, well-measured moves by the suits [the ones that can read a balance sheet but to whom a schematic is an arcane mystery]. Checking the alarm board: transmitter, no alarms; generator, no alarms; conclusion: no problem on Edom Hill, and off to annoy the techs that operated the Studio-Transmitter Link. After a bit of this, somebody goes to the site and finds the generator out of fuel with a dead battery.

    In order to save money, a common voice-grade line was rented; consequently no voltage, current or power readings for any of the equipment were available at the studio or business office, nor fuel levels. The generator had an automated weekly run test built in, but the run time was not reported as a fault. The service contract for the generator was on-call, not scheduled, because that was the cheapest. The generator tests used up the fuel, so while this well-engineered system worked well at the hardware end, the jellyware in the form of the suits faulted. Fuel used in tests is not like calories consumed while cooking - it still counts.

  28. DHitchenUK

    Money No Problem

    I worked for a company where the computers needed to be on 24x7, and to make sure, it had dual backups, twin generators, battery backups, and monthly testing. And it worked: in a bad winter the power failed, everything stayed up and running, all was good. 2 months of bad weather, several power failures, all good. Then the snow melted, flooded the generator and battery rooms, boom!

    So they moved the generators and batteries to the second floor, computers on the first floor. It could not go wrong again. No one actually checked if the structure of the building could take all that weight, and after 3 months the 2nd floor collapsed, crushing the servers and destroying the data center. No one was injured, but there are lessons to learn: even if you build it in the middle of a desert, with ten backups, a frikken satellite will land on it.

    Plan for failure!
