Huh?
Wait. What?
So this power station produces about 3.5% of the whole national electricity output? I can understand there might be some localised problems losing this station, but a nationwide blackout?
Kenya yesterday suffered a four-hour nationwide blackout caused by a monkey tripping a transformer at a hydroelectric power plant. 'Leccy producer KenGen said: "At 1129 hours this morning, a monkey climbed on the roof of Gitaru Power Station and dropped onto a transformer tripping it. This caused other machines at the power …
The sudden loss of a large supply sends a "shock wave" through the rest of the grid: other lines become overloaded and protection kicks in. Cascading failure...
Common when there's little supply overcapacity compared to demand... and why the UK imports electricity from France on such a regular basis.
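If you want a feel for how fast that snowballs, here's a deliberately crude Python sketch of the "trip, redistribute, trip again" loop. Every number in it (line loadings, capacities, the even-spread rule) is made up for illustration; a real grid obeys AC power flow and protection settings, not this arithmetic.

```python
# Toy cascading-failure sketch; illustrative only, not a power-flow model.

def cascade(loads, capacities, tripped_line):
    """Trip one line, spread its load over the survivors, and keep
    tripping anything pushed past its capacity until things settle."""
    alive = {i for i in range(len(loads)) if i != tripped_line}
    to_shed = loads[tripped_line]
    while to_shed > 0 and alive:
        share = to_shed / len(alive)
        to_shed = 0
        for i in list(alive):
            loads[i] += share
            if loads[i] > capacities[i]:    # protection kicks in
                alive.remove(i)
                to_shed += loads[i]         # its load has to go somewhere else
                loads[i] = 0
    return alive

# Five lines all running close to their limits (little spare capacity)...
survivors = cascade(loads=[90, 85, 88, 92, 87], capacities=[100] * 5, tripped_line=0)
print("Blackout" if not survivors else f"{len(survivors)} lines still up")
```

Run it with capacities of 200 instead of 100 and the same trip is absorbed without drama, which is the overcapacity point above.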
Yup, cascading power failures are a real threat. When you combine them with something like an EMP over a major plant in Europe or America, you could basically move an entire continent back to the Dark Ages.
Repair times are not quick either, as transformers get damaged. I think I saw a stat somewhere that said if the whole of Europe went down, the world's total manufacturing output for 10 years would need to be dedicated to rebuilding transformers and the rest of the burnt-out grid.
Securing power plants from Electromagnetic Pulse threats has been under serious investigation by the EU, UK and America since 2013...
"Securing power plant from Electromagnetic Pulse threats have been serious investigations for the EU, UK and America since 2013..."
Earlier than that. The risk has been written about for more than 2 decades and seriously investigated by governments since 2003 or so. It turns out that securing against EMP (solar flares are more likely problems) is relatively straightforward but adds cost to the installation(*). Power companies being power companies, they decided the extra spend wasn't worth it (as with all private companies, they won't add redundancy unless it adds to the bottom line or they are forced to do it. Brakes on railway carriages come to mind....)
USA regulators have been jumping up and down about this for a while, hence the STEP program, but it took 10 years after the dangers were pointed out before it even got started.
(*) Advocacy groups say hardening the US distribution grid would cost $2 billion, industry says $20 billion. When a single airport building (Heathrow T5) or urban rail project (Crossrail) can cost about the same, it's small beer in the overall scheme of things if that amount of money needs to be spent across a nation's entire infrastructure.
Whether it's $2 or $20 billion, if that number is even remotely correct, it is a small price to pay for insurance against what would happen if an EMP (natural or otherwise) took out the US grid. If the industry is looking for an excuse to raise my prices a few percent to pay for this, please do!
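For a rough sense of scale (my own back-of-the-envelope figures, not anything from the report): even the industry's $20 billion number is small once it's spread across the customer base.

```python
# Back-of-the-envelope only; all inputs below are my assumptions, not official figures.
hardening_cost = 20e9         # industry's high-end estimate, USD
customer_accounts = 150e6     # rough count of US electricity meters (assumed)
years_on_bills = 10           # assumed period to recover the cost through bills

per_account = hardening_cost / customer_accounts
print(f"~${per_account:.0f} per account one-off, "
      f"~${per_account / years_on_bills:.0f} a year over {years_on_bills} years")
```

Call it $130-odd per meter, or a little over $13 a year: noise compared with an average annual bill.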
Really? Where? Cos it sounds like complete and utter cr@p.
It was in a National Research Council report titled "Terrorism and the Electric Power Delivery System". The report has been summarised many times by scientific websites, all over the interweb.
Maybe have a google and consider the idea that I might not be lying rather than swearing at me and calling me a liar. It's kind of rude.
http://www.nap.edu/catalog/12050/terrorism-and-the-electric-power-delivery-system
There's a good background here on how just a single problem in one part of the system puts so much stress on the surrounding parts that have to take up the load that it swiftly spreads and, well, takes down the electricity grid for millions of people.
The short version is that unless every part of the system can cope with that 3.5% change in load, then candle prices will rise.
You are not familiar with the "smart metering" attack.
It is sufficient to turn 2-4% on/off at once with no prior notice for the grid to start shedding connections. If this is not calculated correctly (or in the case of an attack if the sequence of on/offs is malicious) the whole grid can collapse completely (*).
So all it takes is one monkey - either at a big enough power plant or one high enough in relevant government department. There is little difference between failing to protect the grid feed transformer and deciding to install smart meters with uploadable firmware into every house (and someone backdooring it with a trigger sequence).
The result is all the same.
(*) It is now 20+ years since I last helped my dad with computations in optimal control of grid load "management", so the 2-4% number is off the top of my head.
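If anyone wants to see why a synchronised 2-4% step is enough, here's a very rough single-mass frequency sketch. Every parameter in it (inertia, droop, spinning reserve, the shedding threshold) is a value I've assumed for illustration, not the poster's figures.

```python
# Crude aggregate "swing equation": one big rotating mass stands in for every
# generator on the grid. All parameters below are illustrative assumptions.
f0 = 50.0       # nominal frequency, Hz
H = 4.0         # aggregate inertia constant, seconds (assumed)
droop = 0.05    # 5% governor droop (assumed)
reserve = 0.01  # only 1% spinning reserve available (assumed tightly-run grid)
step = 0.03     # 3% of total demand switched on at once, per-unit

f, dt = f0, 0.05
for n in range(int(30 / dt)):
    governor = min(reserve, (f0 - f) / (droop * f0))  # primary response, capped by reserve
    imbalance = governor - step                       # generation minus load, per-unit
    f += dt * imbalance * f0 / (2 * H)                # swing equation: df/dt = dP * f0 / 2H
    if f < 48.8:                                      # a typical first load-shedding stage
        print(f"t={n * dt:.1f}s, f={f:.2f} Hz: under-frequency load shedding starts")
        break
```

With those numbers the grid is shedding load inside ten seconds, and if the shedding sequence is itself badly chosen (or malicious) you're into the collapse scenario above.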
It's called a "Cascading Failure", and we have had experience with it in North America. All it took was for one tiny little tree branch to contact one phase of a electric transmission line, and a very large portion of the northeastern US and eastern Canada went dark, in some areas for up to two weeks. That one transmission line that experienced the fault went off-line. Incorrect network management software didn't spot that failure, and adjacent transmission lines were overloaded while taking on the load of that first transmission line, which caused them to go off-line. As more and more transmission lines overloaded and tripped off-line, power plants started seeing over/under-load conditions, and would go off-line, including a couple of nuclear plants that had the reactors scrammed. Restarting those was a fairly intensive process. And, all because of one tiny little tree branch.
https://en.wikipedia.org/wiki/Northeast_blackout_of_2003
Dave
All it took was several tree branches, a staff so clueless they took down their only reactive power generating station for maintenance during the month of peak reactive power demand, a staff so clueless that when told there were shorts being seen they relied on (frozen) computerized instrumentation and said "eyewitnesses? Pshaw!", an IT staff so disconnected from the core business that they fluffed the attempt to bring the instrumentation back online until matters were well and truly out of claw, and a staff so clueless they didn't factor their offlined power generation facility in as a "level one failover" and so were "working the problem" from the wrong scenario in their playbook.
There is a wealth of detail in the official report. Failing to trim trees was just the first of many ways that that Ohio-based "power utility" failed to perform their duties with due diligence.
But that's what happens when you fire all the old hands and grab yourself a new staff who haven't a clue.
Arguably, the monkey could have done the job better.
All it needs is a sufficient, sudden loss of online capacity to slow the rest of the generators down due to overloading - once you're more than 3 or 4 Hz down on nominal frequency it's all over - a black start is needed, and this can take a long time as the load being connected has to be matched closely to the online generation capacity to avoid 'rinse and repeat' situations.
Of course, if the station you lose is currently the frequency setter for the network, i.e. it's big enough to drag all the others into line speed-wise, then you've suddenly got a whole heap of small generators, under excessive load, with no flywheel effect from the frequency setter to keep them stable.
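To put the flywheel point in numbers, here's the same sort of swing-equation arithmetic, comparing how fast frequency falls after losing ~3.5% of generation with and without the big machine's inertia on the system. The inertia constants are invented for illustration and governor action is ignored entirely, so treat it as a feel for the shape of the problem rather than real figures.

```python
# Initial rate of change of frequency (RoCoF) after a sudden generation loss,
# ignoring governor response. Inertia constants below are assumed values.
f0 = 50.0
lost_generation = 0.035   # the ~3.5% loss mentioned up-thread, per-unit

for label, H in [("frequency-setter still synchronised", 5.0), ("frequency-setter gone", 2.0)]:
    rocof = lost_generation * f0 / (2 * H)   # Hz per second, falling
    print(f"{label}: ~{rocof:.2f} Hz/s, ~{4.0 / rocof:.0f}s to fall the 4 Hz above")
```

Less inertia, proportionally less time before you're in black-start territory.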
"To keep out a monkey that can trip out a whole hydro plant AND ESCAPE UNSCATHED?"
I have my suspicions that there were at least 2 monkeys. The one that actually caused the failure was probably instantly cremated with no visible remains and a somewhat surprised witness.
Which begs the question: "Where's the naughty little hole that they use and when will this happen again?"
Aren't these situations typically caused by a short across the animal? At least in my experience where I've heard about squirrels causing outages, there is generally very little left.
Either the monkey simply pulled a lever that basically turned off / tripped the plant - in which case they need to monkey-proof it somehow - or the monkey was incinerated but they chose to show a picture of a different monkey so they could claim he was unhurt.