Don't worry. It's all fine, just ask Lewis Page.
Search engine can find the VPN that NUCLEAR PLANT boss DIDN'T KNOW was there - report
The nuclear industry is ignorant of its cybersecurity shortcomings, claimed a report released today, and despite understanding the consequences of an interruption to power generation and the related issues, cyber efforts to prevent such incidents are lacking. The report adds that search engines can "readily identify critical …
COMMENTS
-
-
-
Monday 5th October 2015 15:23 GMT djack
Re: Having Trident will keep us safe from attack
But Jezzer's* response would be to invite them round for a cup of tea whilst apologising for causing them the trouble of having to blow up the reactor.
I know which one of the two is the bigger deterrent.
* I totally agree with him on most of his policies but his attitudes towards defence scare the hell out of me.
-
This post has been deleted by its author
-
-
-
-
Monday 5th October 2015 11:15 GMT Destroy All Monsters
Why are industrial control systems designed by babes in the woods?
Really. Code-signing with all the side-dishes should be at the TAKEN-FOR-GRANTED INFRASTRUCTURE level by now.
Conservatism?
Replace the control computers by BlackBerrys?
I recently came across a Simatec WinCC control system used in a "in case of problems, SHWILLHTF situation" not nuclear but "interesting infrastructure" ... it was bad. Internet Explorer 6 (though going through proxy out to the Internets), unpatched Win 2008R2, admin login, the works...
(Also, this article needs another stock image. We are not talking nuclear tests. Something from that "China Syndrome" movie, maybe?)
-
Monday 5th October 2015 11:35 GMT Whitter
Re: Why are industrial control systems designed by babes in the woods?
"...Code-signing with all the side-dishes should be at the TAKEN-FOR-GRANTED INFRASTRUCTURE level by now..."
"...As most industrial control systems at nuclear facilities were developed in the 1960s and 1970s..."
"by now" is indeed true: "by then" alas is not.
-
Monday 5th October 2015 13:13 GMT Naselus
Re: Why are industrial control systems designed by babes in the woods?
""by now" is indeed true: "by then" alas is not."
It does go some way to highlight the incredibly low priority hardware upgrades get in around nukes - both reactors and missiles. There was that silo in the mid-west where the (3 foot thick) security door had been propped open with a brick for the last ten years and they were still using 8" floppy disks because that's what was current when the computer system was installed... the US government is shockingly cavalier with these things.
-
Monday 5th October 2015 15:31 GMT djack
Re: Why are industrial control systems designed by babes in the woods?
"It does go some way to highlight the incredibly low priority hardware upgrades get in around nukes"
It is necessarily a lower priority than "it must perform *exactly* to spec". Any change has to undergo costly and rigorous testing to ensure that, for example, something that previously took, say, 2.5ms still takes 2.5ms, no faster, no slower.
I was working at a nuke site when we were migrating the business systems from Novell/Win3.1/WordPerfect to Windows NT Server/NT Workstation/Word. By far the most difficult bit was the word processor. Although the business/admin computer systems did not need to be at spec, new printouts of the site documentation, work orders and such had to look exactly as they did before.
-
-
Wednesday 7th October 2015 09:14 GMT Naselus
Re: Why are industrial control systems designed by babes in the woods?
"Updates to nuclear systems will always be infrequent because they're not the sort of systems where you can just dump the responsibility for finding bugs upon the end-users."
'Infrequent update' vs 'uses systems that are now older than 90% of the people working there' are different things. Amongst other things, several silos have had hardware faults that cannot be repaired properly - hence the security door being propped open with the brick. No-one makes electronics that are compatible with the security system anymore. The result? The security is bypassed by users, and so may as well not be there at all.
And this is where we keep the things that can end civilization as we know it. If the lock on my front door stops working, I replace it. If the lock on my nuclear weapons silo stops working, I just stop locking the door...
-
-
-
-
Monday 5th October 2015 14:21 GMT James Micallef
Re: Why are industrial control systems designed by babes in the woods?
"Why are industrial control systems designed by babes in the woods?"
It was well-explained by the article - whoever designed such systems never dreamt that they could be remotely accessed so easily over the Internet*. This is one of the exact points where anti-nuclear protesters have inadvertently made nuclear power so much more dangerous. If upgrading a nuclear plant was not so controversial, maybe we would have modern plants with modern safety systems** instead of 50-year-old reactors based on 60-year-old designs.
* Note that this is still an epic fail of not being air-gapped. I mean, the article says "the commercial benefits of internet connectivity mean[s] that nuclear facilities are increasingly networked", but I really fail to see what commercial benefits there are to not having your plant operators on-site. You're really risking the operation of a whole plant to save a few thousand bucks???
** One would hope, at least, that they are better than the current ones.
-
Monday 5th October 2015 20:44 GMT Anonymous Coward
Re: Why are industrial control systems designed by babes in the woods?
I recently came across a Simatec WinCC control system used in a "in case of problems, SHWILLHTF situation" not nuclear but "interesting infrastructure" ... it was bad. Internet Explorer 6 (though going through proxy out to the Internets), unpatched Win 2008R2, admin login, the works...
Errm, "Simantec"? I thought Siemens made WinCC?
-
Tuesday 6th October 2015 06:45 GMT Destroy All Monsters
Re: Why are industrial control systems designed by babes in the woods?
Close enuff: SIMATIC WinCC
-
Tuesday 6th October 2015 08:00 GMT Anonymous Coward
Re: WinCC
"Simatec WinCC "
Wrong.
"Errm, "Simantec"? "
Also wrong (though Siemens do make WinCC).
"Symantec WinCC"
Unthinkably wrong, though Symantec did do some good initial work around Stuxnet. Langner was better in most respects though.
"Simatic WinCC"
That's more like it.
FFS, how hard can it be, if you want to look credible? I mean it's not like having to speel Vodafone right is it.
-
-
-
Monday 5th October 2015 11:45 GMT Anonymous Coward
Ideal vs Reality
In an ideal security situation the nuclear facility's systems would be locked down and uncrackable, but security measures usually affect the day-to-day operations and sow a false sense of security.
I'd prefer the guys stood next to the hot bits to have full and unfettered access, for speed and ease.
After all I'd hate the world to end to the words "What's the soddin' password!"
-
Monday 5th October 2015 13:00 GMT Naselus
Re: Ideal vs Reality
"I'd prefer the guys stood next to the the hot bits to have full and unfettered access for speed and ease."
In a world where you need a 4-digit code to enter the shared laundry room, I think that we can perhaps expect the nuclear reactor control room to at least adopt a similar level of security to the machine you use to clean your y-fronts.
-
-
Monday 5th October 2015 13:17 GMT Red Bren
Can't upvote this enough!
Car maker cheats to meet emissions tests. Board never asks how they passed the tests as long as the cars sell.
Newspaper hacks voicemail to scoop exclusive celebrity gossip. Board never questions their sources as long as the papers sell.
Banks mis-sell PPI, sub-prime loans, etc and collapse the global economy. Board turned a blind eye until the whole scam collapsed.
It's time that executives were held criminally responsible for the wrongdoing that happens on their watch. Pleading "I didn't know!" should result in jail-time for corporate negligence.
-
Monday 5th October 2015 13:40 GMT Anonymous Coward
Re: Can't upvote this enough!
Banks mis-sell PPI, sub-prime loans, etc and collapse the global economy.
Would never work without state-supported bubble economics and bailout money.
The fish rots. But economic policies are powerful rot accelerators.
(As a side note the "repeal of tax credits" comes to mind, I dunno. Politicians are even too chicken to call a tax increase a "tax increase")
-
Monday 5th October 2015 18:20 GMT chris 17
Re: Can't upvote this enough!
@Red Bren
Don't forget the engines still pass the stringent US tests. Just like the others, VW built the engines to pass the emissions tests, which they did. The tests should have been more robust.
The same applies to this story: governments set regulations and the nuke operators do the bare minimum to pass those regulations. Exceeding them, no matter how much safer or more practical, will cost them more money and raise the risk of not passing the regulation now or in the future. For example, how do they meet a requirement for a modern monitoring, alerting and management system if the required sensors and reporting systems are air-gapped from each other? At some point somewhere in the plant some data needs to cross the net. The nuke plant master meltdown alarm needs to alert someone.
-
Tuesday 6th October 2015 03:25 GMT DanielN
Re: Can't upvote this enough!
The cars do not pass the tests. The test is that a company representative provides a test vehicle and signs under penalty of perjury that the emissions will be the same as when production cars are driven by customers. The test is a legal test, not a technological one. There is no technological test that can detect a sufficiently rigged test article. The US EPA should set up roadblocks at state borders and randomly test cars, but they are not run by competent scientists.
One way data connections are easy to create. For example, install only the outgoing half of a fiber optic cable, and fill the receiver with black glue.
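The outgoing-half-of-a-fibre trick is essentially a hardware data diode. On the protocol side it implies something connectionless like UDP, since TCP's handshake needs a return path. A minimal sketch, with hypothetical names and a lossless loopback standing in for the one-way link:

```python
# Minimal sketch of one-way telemetry over UDP: the sender never
# expects (and on a real data diode could never receive) a reply.
# The reading format and function names are illustrative assumptions.
import socket

def send_reading(sock, addr, reading):
    # Fire and forget: no ACK is possible on a one-way link.
    sock.sendto(reading.encode("ascii"), addr)

def receive_reading(sock):
    data, _ = sock.recvfrom(1024)
    return data.decode("ascii")

if __name__ == "__main__":
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))       # stand-in for the receive-only side
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_reading(tx, rx.getsockname(), "core_temp=291K")
    print(receive_reading(rx))      # core_temp=291K
```

The catch, as the reply that follows points out, is that the receiver must tolerate loss: with no return path there are no acknowledgements and no retransmits.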
-
Tuesday 6th October 2015 07:54 GMT Anonymous Coward
Re: One way data connections
"One way data connections are easy to create. For example, install only the outgoing half of a fiber optic cable, and fill the receiver with black glue."
How on earth is that supposed to work in an era when everybody assumes the availability of TCP or similar, and TCP assumes the availability of bidirectional signalling? As does any other networking protocol which wants stuff to be acknowledged after the receiver has dealt with it.
There are sensible options. For most purposes, that isn't one of them.
-
-
-
Tuesday 6th October 2015 10:15 GMT Roland6
Re: "lack of executive-level awareness"
This is just an attention-grabbing headline that simply reflects the need to get executive-level personnel to ask questions and to be prepared to spend some money...
I would have more concerns if the IT security experts, employed to address IT security issues, were unaware of security issues such as unauthorised VPNs and/or hadn't implemented security systems to detect and prevent such access.
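For a flavour of what such detection might involve, here is a toy sketch that flags hosts exposing ports commonly associated with VPN endpoints in scan results. The port list and host names are my own illustrative assumptions, not anything from the report:

```python
# Hypothetical check: flag hosts whose open ports suggest a VPN endpoint.
# Port-to-protocol mapping is a common-knowledge illustration only.
VPN_PORTS = {500: "IKE", 1194: "OpenVPN", 1723: "PPTP", 4500: "IPsec NAT-T"}

def flag_vpn_services(open_ports):
    """Given {host: [open ports]}, return hosts with VPN-associated ports."""
    findings = {}
    for host, ports in open_ports.items():
        hits = {p: VPN_PORTS[p] for p in ports if p in VPN_PORTS}
        if hits:
            findings[host] = hits
    return findings

scan = {"hmi-01": [80, 443], "gateway-07": [443, 1194]}
print(flag_vpn_services(scan))  # {'gateway-07': {1194: 'OpenVPN'}}
```

In practice this data would come from an internal port scan or firewall logs; the point is that an unauthorised VPN leaves fingerprints a security team can look for before a search engine finds them.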
-
-
Monday 5th October 2015 14:08 GMT I Am Spartacus
Experience at the sharp end
Some time ago, got to be at about 20 years, I was involved in putting in an Energy Management System for the late lamented BNFL. Due to some incompetence of the prime contractor for Sizewell B that was being constructed at the time, we almost scrammed the reactor.
Said top-notch boffin saw that we had established connectivity to the mock-up test rig at Sizewell. We were behind schedule (way behind schedule) and had left the test rig at our end running through a whole suite of automated tests overnight. This being the nuclear industry, they could afford the very best VAX kit in a clustered environment, with wonderful custom-made teak and mahogany desks, but not a lock for the computer room.
Well, the American white-coated boffin thought that, as we had a connected PC, he could rip it apart and clone all the other PCs from the hard disk. Which he duly did. And then powered all 8 of them on. All with the same network address.
The protocol was designed to check and double check that the reactor unit was going to do what you told it. So there was multiple challenge/responses, designed to ensure that there were no mistakes.
Boffin issued his first command: "Show Status". Reactor mockup says "I think you asked me to show status". PC 1 says "YES". PCs 2-8 all respond "What? No.". Reactor: "Are you sure you asked me to show status?" PC 1 says "Yes, get on with it". PCs 2-8 all respond, "Sorry Squire, not us. You are under attack". Reactor: "OK, I am under attack. SCRAM."
And that was when I got paged to explain why my company's software had tried to shut down East Anglia.
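The challenge/response logic in that story might be sketched like this. It is a toy reconstruction for illustration only; the command names and confirmation rules are guesses, not the real protocol:

```python
# Toy model of the confirm-before-execute protocol described above:
# every registered console must agree a command was really issued.
def reactor_confirm(command, responses):
    """Any dissenting response is treated as a possible spoof: scram."""
    if all(r == "YES" for r in responses):
        return "EXECUTE " + command
    return "SCRAM"  # conflicting answers look like an attack

# One genuine console plus seven clones that never sent the command:
print(reactor_confirm("SHOW STATUS", ["YES"] + ["NO"] * 7))  # SCRAM
print(reactor_confirm("SHOW STATUS", ["YES"]))               # EXECUTE SHOW STATUS
```

The design choice worth noting is that the protocol fails safe: faced with an ambiguous situation it shuts down rather than guessing, which is exactly what happened with the eight cloned PCs.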
-
Tuesday 6th October 2015 05:52 GMT waldo kitty
Re: Experience at the sharp end
i gave you an up vote for the story but can't help but see a flaw in the network if it allowed all those cloned machines on with the same address... there should have been some nasty collisions going on preventing all but one from accessing properly... like one sees today with IP addresses when they get hijacked by another system ;)
-
Tuesday 6th October 2015 08:51 GMT djack
Re: Experience at the sharp end
There is relatively little at the network level to prevent multiple machines having the same IP address. Indeed, it is often advantageous when it comes to clustering.
On Ethernet, it is possible for machines to independently have the same IP address. Each ARP request will result in multiple replies reaching the requesting host. Which machine the requester believes has the IP address depends on the order in which the ARP responses are received.
The warnings you refer to are likely to be the host operating system doing a sanity check before trying to use an IP address.
Whilst there is some protection available on modern enterprise grade switches, this is often not enabled.
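That first-reply-wins behaviour can be modelled in a few lines. A toy model only: real stacks vary, and some will overwrite the ARP cache on later replies rather than ignore them:

```python
# Toy model of ARP resolution with duplicate IPs on one segment:
# whichever reply is processed first claims the address.
def resolve(arp_replies):
    """Build an ARP cache from (ip, mac) replies in arrival order.
    First reply seen wins; later replies for the same IP are ignored."""
    cache = {}
    for ip, mac in arp_replies:
        cache.setdefault(ip, mac)
    return cache

replies = [("10.0.0.5", "aa:aa:aa:aa:aa:aa"),   # a clone answered first
           ("10.0.0.5", "bb:bb:bb:bb:bb:bb")]   # the original answered late
print(resolve(replies))  # {'10.0.0.5': 'aa:aa:aa:aa:aa:aa'}
```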
-
-
Monday 5th October 2015 15:07 GMT Commswonk
I'm puzzled as well
I was also a bit mystified by the statement "the commercial benefits of internet connectivity mean[s] that nuclear facilities are increasingly networked", but James Micallef beat me to it (in "Re: Why are industrial control systems designed by babes in the woods?"). So I'll ask the point directly: what are the commercial benefits?
Could an (the?) underlying problem be that when the internet connectivity was being designed and implemented, the possibility of malicious action being taken was not properly understood? I don't want to trigger an argument about public versus private ownership, but there has to be a real possibility that simple commercial pressures meant that the connectivity was to the minimum practicable standard (i.e. the cheapest) rather than one that was properly fit for purpose; public ownership might have been less concerned about cost considerations, assuming of course that the risks were understood.
At the same time I am more than a little horrified by I am Spartacus's point ("Experience at the sharp end") that someone could so easily tamper with a mission critical system. Had the person concerned no idea about the possible consequences of his actions? Use of the word "boffin" might support that thought. At the same time the system clearly had no mechanisms in place to stop unauthorised access to them.
Hanlon's Razor tells us Never attribute to malice that which is adequately explained by stupidity and I suspect that the interfering boffin was a textbook example of that, but it should have served as a warning; we are not told if further security measures were put in place as a result of this meddling.
At the same time Einstein stated Only two things are infinite, the universe and human stupidity, and I'm not sure about the former and this tenet should be enough to ensure that system design tries to take account of not only the fool but the bloody fool as well.
I am tempted to suggest that all this boils down to regulatory failures; if Regulators took a properly tough stance then the risks of system compromise via internet connectivity could and would be greatly reduced, but of course that would require the Regulators to (a) themselves be properly aware of the risks of cyber insecurity, (b) have the appropriate powers available to them to do something about it, and (c) actually know what the nuclear companies are up to in terms of connectivity.
Still leaves the question of "why" though.
-
Monday 5th October 2015 16:25 GMT Alister
Re: I'm puzzled as well
I don't want to trigger an argument about public versus private ownership but there has to be a real possibility that simple commercial pressures meant that the connectivity was to the minimum practicable standard (i.e. the cheapest) rather than one that was properly fit for purpose; public ownership might have been less concerned about cost considerations, assuming of course that the risks were understood.
I think you are missing the point made earlier by James Micallef, and one that has been increasingly forgotten: when the control systems were built, the idea that anyone would be daft enough to connect them to a network where members of the public could access them was unthinkable - in part because such a network didn't exist, and was (at the time) the merest science-fiction.
So it's not a question of being built down to a price, it's simply a (wholly understandable) failure of imagination.
In exactly the same way, the protocols used for the internet such as TCP/IP, DNS, SMTP were never built with security in mind, simply because nobody considered the possibility that these things could be used maliciously.
-
Monday 5th October 2015 18:49 GMT Commswonk
Re: I'm puzzled as well
Um; I don't think I am missing the earlier point. I would certainly agree that:
When the control systems were built, the idea that anyone would be daft enough to connect them to a network where members of the public could access them was unthinkable - in part because such a network didn't exist, and was (at the time) the merest science-fiction.
So it's not a question of being built down to a price, it's simply a (wholly understandable) failure of imagination.
My point was not that the failure was when the control systems were originally designed; if something was then "unthinkable" then no-one can sensibly be blamed for not thinking it. The failure was later, when someone decided the connection to a public network was a good idea. They were entering the realm of Donald Rumsfeld's "unknown unknowns" and should have thought long and hard about some of the possible implications; it was at that stage that any penny-pinching occurred.
Typical, in a way; "there are benefits to having external connectivity on a public network" without any corresponding "what risks might arise?"
-
Monday 5th October 2015 21:43 GMT Alister
Re: I'm puzzled as well
My point was not that the failure was when the control systems were originally designed; if something was then "unthinkable" then no-one can sensibly be blamed for not thinking it. The failure was later, when someone decided the connection to a public network was a good idea. They were entering the realm of Donald Rumsfeld's "unknown unknowns" and should have thought long and hard about some of the possible implications; it was at that stage that any penny-pinching occurred.
Ah, right, sorry, I misunderstood your point.
I agree completely that whoever thought connecting such infrastructure to the internet without very strict safeguards was a good idea was a fool, or just incompetent, or, as you say, working to an unrealistic budget.
Sadly, it's normally a decree from on high, from someone with no understanding of the ramifications, which causes these things to happen.
-
Tuesday 6th October 2015 10:29 GMT Roland6
Re: I'm puzzled as well
>"I agree completely, that whoever thought connecting such infrastructure to the internet without very strict safeguards was a fool or just incompetent"
Suspect that in many cases the Internet connection was originally put in to replace dial-up modem or leased-line engineer access (it is a bit of a drag getting to Sizewell from Reading...), which over the years became more functional and hence got migrated to TCP/IP and then the Internet, and because it was for the 'engineers', little real attention was paid to its security by said 'engineers'.
-
-
-
-
-
Monday 5th October 2015 15:25 GMT Anonymous Coward
Jobsworths
In my experience once a system has been implemented, it pretty much remains that way until the person who built it retires / dies.
Other than updates and unavoidable upgrades (when software goes EOL).
I'm still a relative youngster in this game and as I'm getting older and wiser I'm noticing more and more greying engineers clinging onto their jobs with a death grip. Fair enough, grab on, but for Christ's sake don't be scared of us young 'uns taking your jobs... most of us don't want them.
The point I'm trying to make is that bad industry practice seems to stem from people being around for too long and nothing being looked at. Just because you've never used your DR system doesn't mean you don't have to worry.
Never mind security!
Think about this: anyone that implemented a DR solution 5 years ago will probably find that the process is dated and rubbish by now... as are the associated security practices...
Nobody had heard of ransomware or Cryptolocker back then. In fact if you described the very same to someone five years ago they'd probably laugh at you.
Moral of the story: IT guys should be freelance and you should rotate them every couple of years. Everyone is on their toes then.
I'm prepared to be downvoted by all the jobsworths that are frightened by that concept.
-
Monday 5th October 2015 17:05 GMT Anonymous Coward
Re: Jobsworths
That is one point of view. The problem you would encounter with endlessly cycling through freelancers is that there would be a lack of continuity in system delivery, and you might well end up with the same problem from a different angle, i.e. "I'm only here for 2 years or so, so there is no point in properly understanding how it works."
To look at the problem highlighted in this article - if no one stays for any length of time they are not going to get time to fully understand how the IT systems in a nuclear power station hang together and will be just as unlikely to understand why connecting your reactor to the intertubes so you can manage it from home is a bad idea.
As with all things, a balance is key. Stop systems from fossilizing but also ensure that the people that actually know how it works are on hand to pick up the bits when it falls in a heap.
-
Tuesday 6th October 2015 12:36 GMT Stoneshop
Re: Jobsworths
Think about this, anyone that implemented a DR solution 5 years ago will probably find that the process is dated and rubbish by now...as are the associated security practices...
And trying to get them up to date costs money. Which will cause the Beancounter Department to go apeshit bananas, because "did it add to our profits since the last time you came in here begging?"
-
Monday 5th October 2015 16:35 GMT Allan George Dyer
"risk is probability times consequence"
But a cyber attack is not a random event so "probability" is replaced by two questions:
1. Is there an attacker that wants to do this?
2. Do they know how?
Caroline Baylon has admitted state actors can answer "yes" to 2, so it depends on political will.
Time to open a box of airgaps.
-
Tuesday 6th October 2015 12:27 GMT Stoneshop
Re: "risk is probability times consequence"
1. Is there an attacker that wants to do this?
Yes. Even if it doesn't look like a lucrative target, it will be attacked. To borrow the words from the mountaineer George Mallory: "Because it's there". And infrastructure IS a lucrative target.
2. Do they know how?
Sure. Maybe not now, but that's a moot point.
-
Monday 5th October 2015 19:43 GMT Bucky 2
Shenanigans.
My experience with government bureaucracy would suggest that some supervisor asked for a VPN that would work a certain way. The tech did as he was asked, perhaps making a "note" of any reservations about security (e.g.: "The requested password, 'password,' should be changed at the earliest opportunity.")
The supervisor then proceeded to not only ignore the note, but also never used the VPN, and eventually forgot he demanded one.
...UNTIL it came to light, at which point he firmly denied he knew anything about it.
-
Tuesday 6th October 2015 04:46 GMT mr.K
Physical access
If you have physical access you have access. And even regardless of that, I am not quite sure I see it as such a big problem that the only perimeter is the physical perimeter, as they say. A problem perhaps, but meh. It is amazing the kind of damage you can do with a solid wrench for instance, a bag of coal powder if you want to make it sophisticated, or ordinary explosives if you can't be bothered. And if you have access to a nuclear facility and are not capable of smuggling these things into it, then I do not think you should have access to a nuclear facility.
The lack of airgap is troubling indeed though.
-
Tuesday 6th October 2015 07:54 GMT Anonymous Coward
Re: Physical access, airgaps, computers vs hardwired, etc
Stuxnet famously crossed airgaps. Why do people still talk as though airgaps were a solution to any security problem? They may have a small role to play, or they may not. If they do have a role it is only as a small part of a much bigger picture, which in the case of such things as nuclear reactor control and safety systems (or oil rig safety shutdown systems, or various other automation stuff) really is a *much* bigger picture.
What I mean by that is that for example in the case of a nuclear reactor, best practice would be to have separate systems for normal operation and for safety shutdown. These separate systems would have separate sensors for inputs, separate processing, and separate actuation for outputs.
In a setup like this there's an argument to make that if the systems are computer-based, the computers should be dissimilar. Reasons for doing this include minimising the risk of an unforeseen design fault occurring and affecting both control and safety systems at the same time. For many years, the relatively simple safety shutdown systems could be and often were hardwired logic, with predictable behaviour including predictable failure modes, and these days they'd have the benefit of lower risk from network attacks (truly hardwired may mean no risk, but PLC-based brings its own challenges).
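For a flavour of how simple and predictable that kind of shutdown logic can be, here is a 2-out-of-3 voter of the sort commonly implemented in hardwired or relay form. An illustrative sketch, not any particular plant's system:

```python
# Illustrative 2-out-of-3 safety voter: trip the shutdown if at least
# two independent sensor channels agree, so a single failed channel
# can neither spuriously trip the plant nor block a genuine trip.
def two_out_of_three(a, b, c):
    return (a and b) or (a and c) or (b and c)

print(two_out_of_three(True, True, False))   # True: two channels demand trip
print(two_out_of_three(True, False, False))  # False: lone channel ignored
```

The predictable failure modes the post mentions fall out of this structure: every input combination has one obvious, easily enumerable outcome.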
That kind of thinking used to be in the multi-country European nuclear regulatory document for software licensing which applied at the time when Olkiluoto and friends were specified. The control and safety systems originally proposed did not meet those regulatory requirements, and consequently there were "issues".
I believe the regulatory policy has been updated since then, as have the proposed control and safety systems. The updated policy document is on the ONR website in the UK [1], though I'm not familiar with the details of the changes, or with the details of what was proposed and is being implemented at Olkiluoto.
Angle grinders are great. Does anybody look twice at a man with dirty safety boots, a dirty tin hat, and an angle grinder, as he walks towards a cable duct full of non-duplicated sensor or actuator or power cables...
[1] http://www.onr.org.uk/software.pdf
-
-
Tuesday 6th October 2015 06:57 GMT Anonymous Coward
I do wonder if sometimes things just need to be taken back to basics, i.e. before building the system, deciding what and who needs access (physical and virtual) to what. If it is an older plant, then thinking about whether we really need to connect X new system to X old unsupported system when it has been running fine unlinked for 30 years. Now, assuming we come out of that saying yes, we do need to link the systems, then spend a bit of time thinking about how best to protect both of them (not just the new and shiny known system). A basic but good first step would be to reduce the risk of infection of a critical system by not having either system connected to the main network, and by buying in industrial computers or using terminals without accessible ports.
-
Tuesday 6th October 2015 08:23 GMT Anonymous Coward
UK nuclear safety systems are NOT control systems
I work in the industry and I'd like to make a key clarification.
Control systems are not safety systems. A lot of the networking etc discussed here is for the control system that helps the plant to operate at its highest availability (aka more profit from reactors). The safety system is separate (nearly always hardwired) and protects the plant from hazardous events. All UK safety systems have to comply with IEC 61513 (the nuclear version of IEC 61508), which requires segregation of differently classed safety systems.
Safety systems in the UK that provide radiological or personnel safety, when classified as SIL2 or above (IEC 61508) will be hardwired, or use limited "smart devices" that do not network. The industry is understandably nervous about computerised/smart equipment.
That said, cyber security is a new field for the sector, and not one it has great experience in. The greater fear is corporate/nation-state espionage rather than a nuclear safety event.
-
Tuesday 6th October 2015 11:34 GMT Anonymous Coward
Re: UK nuclear safety systems are NOT control systems
"Control systems are not safety systems"
Sounds good to me (I posted the earlier essay, although my knowledge of current practice is somewhat dated now).
Do you happen to know what went in (as distinct from what was proposed) in Finland?
I was under the impression that the original proposal was rejected because it *did* have a combined control (operational) and safety (shutdown) system (shared sensors, shared processing), which was not permissible under the applicable regulations at that time and is one of many reasons why the project is now so late. Regulations and proposal have, afaik, both changed since then.
All input gratefully received.
"The industry is understandably nervous about computerised/smart equipment."
Very understandably so given what's inside many modern PLCs and such, e.g. "smart" IO devices.
You knew where you stood with something simple like Modbus. Complexity is OK, except when it isn't.
-