Lights go out at Telecity in London data outage

Anyone relying on Telecity for hosting, backup or email services will know this already: a power failure at one of its London centres has left dozens of businesses offline. The company's PRs are currently in a meeting so we have no official response. But several Reg readers emailed us to say power had gone for about an hour …


This topic is closed for new posts.
  1. Dan Harris

    Not all of Telecity

    In fact, not much really ... 2 floors at 6/7 Harbour Exchange apparently.

    Meridian Gate, Sovereign House, 8/9 Harbour Exchange, Bonnington House, Prospect House, Olivers Yard, PowerGate ... now if that whole lot went down :-)

    BTW - I am no apologist for the Telecity Group - I think they are money-grabbing so-and-sos!

  2. Will Godfrey Silver badge

    And it all just disappeared

    ... in a cloud of... Oh

  3. Pete B


    "Dear Customer,

    We are currently experiencing issues with the mains power supplying the 7th floor infrastructure at Harbour Exchange 6/7. Power has not been transferred to the UPS and generator systems due to a fault that is currently being investigated and this has resulted in a loss of power to 7th floor customer equipment at the site."

  4. Myst

    Telecity Powergate

    Telecity Powergate in London is still up!

  5. Robert Carnegie Silver badge

    "The company's PRs are currently in a meeting"

    Exit interview?

    1. Destroy All Monsters Silver badge


      Grogan. Glass lift.

      Enough said.

  6. Harry

    "we have no official response."

    We'd like to tell you what happened, but we're completely in the dark at the moment.

  7. Anonymous Coward

    Service is now restored

    July 6, 2011, 15:54: Service is now restored.

  8. Anonymous Coward
    Paris Hilton



    1. Will Godfrey Silver badge


      Not enough UPS to stop an OOPS

    2. Annihilator Silver badge

      re: OMFG


      One, but it seems to be missing, can you check if it's resting on your caps key?

    3. TeeCee Gold badge

      Re: OMFG

      Don't know, but one of their DOWNS is now publicly visible.....

  9. Anonymous Coward

    OMFG: and how many UPS they got

    An outsourced contract awarded to the minimum-cost bidder - to save money?

  10. Riscyrich

    @ AC15:57

    OMFG, you sir are a moron. They didn't go to GenSet or UPS power because there's a fault.

    "Power has not been transferred to the UPS and generator systems due to a fault...."

    Why would you put GenSet or UPS power onto a known faulty circuit and risk fire/damage? Do us all a favour and STFU, n00b!

  11. Alan Brown Silver badge

    All the UPS in the world

    Are of no use if the cabling isn't redundant AND redundantly pathed.

    Meantime in other parts of London, trying to get power for data centres is proving "difficult" at best.

    This is nobbling a few HPC projects.

    1. Robert E A Harvey

      trying to get power...

      Combined cycle. Small gas turbine to generate the power, and the exhaust gas heat recovered to heat adjacent office space.

  12. Alan Brown Silver badge

    combined cycle...

    Back to Edison's original plan (selling heat and light).

    More seriously, the gas infrastructure is straining too - and apart from most CC setups being relatively noisy there's still the emissions issue. London's air quality is bad enough as it is.

    My suggestion of setting up the HPC centres where the infrastructure is readily available instead of down the hall from the users (empire builders) didn't go down well.

  13. tardigrade

    Deja Vu all over again.

    There is a wider problem here, isn't there?

    I've colo'd and rented from a lot of data centres and hosting companies over the years, and without exception all of them have had an "unplanned" outage for an extended period at some point or another. Without exception, all of them have also boasted a robust and redundant UPS system and hot-fuelled diesel generators. Of those whose outage was a power issue, where the UPS and generators were supposed to ensure an uninterrupted power supply, without exception they all failed.

    In one instance it was an issue with power blowing out the UPS circuit, cutting UPS power and disabling the path of power from the generators. In another it was the transformer room having its floor and walls blown in, which severed the physical connection to the generators. A mobile genie was trailered in and wired into a hastily reconstructed power panel, and was then found not to be man enough for the job.

    It seems that whenever a scenario arises in which the UPS and generators are actually called for, some other factor gets in the way - and these intangibles are never envisaged. There is a mindset here that says, "We've got UPS and generators, so power redundancy is sorted."

    Well, it's not, is it? There seems to be a whole area of power systems design here that is in need of far greater scrutiny.

    1. Anonymous Coward
      Anonymous Coward

      You plonker

      Now, what makes you think that you know about every single power event at the data centres you've used? I suspect they've transferred to emergency power many times successfully without you knowing!

      Admittedly I've had similar experience with data centres, but I appreciate that unexpected problems do occur. It does seem to have improved over the years though.

      I probably shouldn't say it, but our current data centre has been 100% since we started with them over 2 years ago.

      1. tardigrade

        You muppet.

        I said, "Of those whose outage was a power issue..." I'm sure that the UPS has kicked in successfully on many occasions. My point is that when there is a power-related outage, it's never because the UPS itself has failed or the generator(s) failed to start; it's always some other part of the power system that prevents the UPS / genie from doing its job.

        UPS and genies don't guarantee power redundancy; data centres always seem to be caught out by some other part of the power supply system. I've never had a situation where the hosting company said, "We're offline because the power went and the generator didn't start." It's always, "We're offline because the power went and the generator started, but the transformer that regulates the power from the generator blew."

        By the law of averages, it seems that other areas of the power supply are not given the same priority and are therefore not as robust as the UPS and generators.

        It's a bit like the data centre I visited in Leicester, where the bod showing me around boasted about the multiple and diverse fibre providers to the building and the fully duplexed routers, then admitted that they only had one switch. There's your weakest point right there, and it makes everything else pointless.
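        The single-switch point is easy to put numbers on. As a minimal sketch (the availability figures below are made up for illustration, not from any vendor): components in series multiply their availabilities, while redundant components in parallel multiply their *unavailabilities* - so one non-redundant element in the chain caps the whole thing, no matter how many diverse fibres and duplexed routers sit either side of it.

```python
def series(*avail):
    """Availability of a chain where ALL components must work."""
    a = 1.0
    for x in avail:
        a *= x
    return a

def parallel(*avail):
    """Availability of redundant components where ANY one suffices."""
    u = 1.0
    for x in avail:
        u *= (1.0 - x)  # probability that every redundant copy is down
    return 1.0 - u

# Hypothetical figures for the Leicester setup described above.
fibre   = parallel(0.999, 0.999)  # diverse fibre providers
routers = parallel(0.999, 0.999)  # duplexed routers
switch  = 0.999                   # the one and only switch

total = series(fibre, routers, switch)
print(f"total availability: {total:.6f}")
# The chain can never be more available than its weakest series element,
# so the redundancy either side of the lone switch buys almost nothing.
```

        Run it and the total comes out a whisker below the switch's own 0.999 - the redundant stages contribute almost nothing to the end-to-end figure.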

  14. Anonymous Coward
    Anonymous Coward

    And that is why

    you do biannual black starts of a DC.

  15. <shakes head>

    my favorite

    was the three-week outage when a toilet overflowed and blew the main bus bar, which was a custom part with no spare. It took out two floors, and I had to pull 1km of cable from the floors below to re-power our cabinets.

    1. Anonymous Coward
      Thumb Up

      Floors for Giants

      My God! What ceiling heights are people specifying nowadays? I've heard of allowing extra for the overhead cabling, and I've seen a data centre where they specified the false floor to be 18 inches rather than 18cm high, but 1km from one floor to the next floor (or two) around the walls? Sir, I congratulate you!

      Anon because of history in this area!

  16. gr0mit

    Who turned out the lights?

    OK, I've had an update from our provider, which can be summarised as "A Telecity engineer switched off the feed to the UPS, but didn't notice until the batteries ran flat."

    Even the UPS in my garage beeps loudly when it loses the mains! Are Telecity really telling us they don't even have a beeper or flashing light to say "I'm on backup power"?

    Oh well. Stuff happens.

This topic is closed for new posts.
