obvious solution ...
... don't connect your damn hospital's internal system to the damn internet!
Healthcare regulations oblige medical equipment vendors to focus on developing the next generation of technologies rather than addressing current cybersecurity issues, according to experts presenting at the eighth Israel Cyber Week. Ophir Zilbiger, partner and head of the BDO Cybersecurity Center Israel consultancy, said …
Very true, but as always the problem is the same: money and convenience.
Some hospital staff need external internet access as well as internal access. But no one will build a red/blue network with separate terminals for air-gapping, or even a properly thought-out system on common networking with logically separate VLANs, whitelisted websites, strongly sandboxed applications, etc., because they already have a running and generally working system and don't want / can't tolerate the disruption of a massive overhaul.
"they...don't want / can't tolerate the disruption of a massive overhaul."
It needs to be presented to them in the form of "You can have the planned disruption of the overhaul now or you can have the more serious, unplanned disruption of something like Wannacry later with the media and public pointing at you and blaming you. Which is it to be?".
you can have the more serious, unplanned disruption of something like Wannacry later
.. which will then be blamed on IT not doing their job properly rather than on the lack of funding to actually produce a proper secured network.
There's always a rouge engineer you can blame. (Other facial paints are available)
Generally this is not a possible solution to a hospital as a whole. It certainly needs to access the national health insurance system(s), prescription systems, vendors and possibly many others.
Maybe some crucial infrastructure can be kept offline for security, but then you need to revert to 20th-century technology: burning DVDs of all the scans etc. and carrying them manually to doctors' computers.
The article conflates confidential data on hospital networks with remote access to diagnostic equipment, and these should be separated.
I don't see why the MRI machine needs to be networked. Transferring the data from the MRI to the hospital Intranet via sneakernet makes it significantly harder for hackers to gain unauthorized remote access to the machine, and is the work of a few moments. Securing the data on the hospital Intranet is then a different issue that is simplified because it doesn't involve trying to get the MRI scanner to issue security related patches.
Getting the correct patient details onto the scan requires access to the hospital records, relying on manual entry results in getting them wrong. Neither do you want to tie up the machine console with burning multi-gigabyte DVDs (seriously) to get them off the scanner and then have someone have to manually push them onto the system at the other end for each scan. Besides which, this only safeguards the system running the scanner itself, your images (with identifiable information) are now in both places. Not to mention creating a mountain of optical media that needs to be disposed of securely (I used to have a literal stack of such CDs that needed to be kept locked up until we could arrange to have them shredded).
Smooth Newt - are you serious? A huge amount of effort has gone on over the years in networking scanning equipment such as this. They use a standard called DICOM
https://en.wikipedia.org/wiki/DICOM
How would you propose to do this? Writing DICOM studies to removable media then a radiographer puts the media in another terminal and reads it in to the PACS system?
What a waste of time and sheer drudgery for someone. Hopefully in your scheme the tags for the patient ID etc. are automatically read in; rekeying anything like that is an invitation to a mix-up.
Also, what removable media? I don't know the lifetime of MRI scanners these days, I must admit.
When I worked in PET scanning we archived to DAT tapes and gave patients a copy of the scans on Sony magneto-optical disks. That was many years ago, but I doubt you would be able to get a reader for these MO disks today.
Smooth Newt - are you serious? A huge amount of effort has gone on over the years in networking scanning equipment such as this. They use a standard called DICOM
The problem addressed is that the diagnostic equipment is on the Intranet and so is exposed to security risks, possibly via something else on the network getting compromised. Mitigating these risks seems insurmountable if the code cannot be regularly updated. By far the best solution for keeping a system secure is to air gap it. It isn't perfect but it is the best there is.
I am sorry if it is "sheer drudgery" to vastly decrease the likelihood of the devices being compromised, but it is hardly a "waste of time". And it is less tedious than many other activities which take place in hospitals.
How would you propose to do this? Writing DICOM studies to removable media then a radiographer puts the media in another terminal and reads it in to the PACS system?
Yep, pretty much.
There is, alas, more to DICOM than getting studies from one place to another. DICOM has been in common use since the early 90s & even today is probably one of the most complex standards in common use. It is one of very few standards that is (almost) plug-and-play between vendors globally.
For example, outside of the IT security concerns, a serious (like, very serious) patient risk is mis-identification of data. I.e. one patient's data going to another patient's 'folder'. DICOM has this solved (I think noted in the discussion above) which itself requires a series of handshakes between provider and client.
Putting the data onto media & resorting to an air gap is, alas, one thing DICOM has not been great at. Vendor interpretations of the file 'format' (known as DICOM Part 10 - yes, the file format is only part 10 of dozens of elements of the standard) have meant such transfer is still quite unreliable.
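For the curious: the one concrete thing about the Part 10 format that is easy to check is its layout - a conforming file starts with a 128-byte preamble followed by the four magic bytes "DICM". A minimal sniff check might look like this (a sketch only; note that some legacy vendor files omit the preamble entirely, which is part of why media transfer is so unreliable):

```python
def looks_like_dicom_part10(path):
    """Return True if the file has the DICOM Part 10 layout:
    a 128-byte preamble followed by the magic bytes b'DICM'."""
    with open(path, "rb") as f:
        header = f.read(132)
    return len(header) == 132 and header[128:132] == b"DICM"
```

A receiving system could use a check like this to reject junk before attempting a full parse, but it says nothing about whether the tags inside are sane.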
Yes, early versions of DICOM did have a specification of RS232 transport - but a lot of folk have done a lot of work in the meantime, and partying like it's 1993 probably isn't a great way forward.
Oh and a lot of these devices (especially but not exclusively outside Radiology) would also use HL7 for data transfer. That's a different kettle of fish.
The closest you could probably get is a set of separate VLANs for medical devices with NAC and a heavily locked down layer2 firewall. Given that WannaCry by all accounts only affected admin functions this may already be the case. However you still have to protect the admin network otherwise patients don't get their ops/scans etc.
It seems like it was the admin net that was the source and the major victim in this case - and that matches the experience when my SO had a serious illness - the medical side was fine, but the admin was so woeful and creaky at the hospital where she was diagnosed (to the extent that they had to *fax* critical docs between departments on the same site, and managed to lose her entire case history) that we demanded she be moved to another (UCH), which was vastly better.
NHS has amazing staff and medical expertise but the inconsistency of admin procedures, tools and more importantly investment across the estate seems to be the major breaking point.
Transferring the data from the MRI to the hospital Intranet via sneakernet makes it significantly harder for hackers to gain unauthorized remote access to the machine, and is the work of a few moments.
Of course you then need to guard against loss of the removable media - teaching doctors to encrypt thumb drives, etc. You secure one gap but introduce a new failure mode.
The obvious solution would be literally two machines next to each other, with a USB key on a chain so it cannot be walked away with. The key can then lift scans from the MRI host to the network terminal onto the fileshare, where doctors in the hospital - or indeed the patient's GP - can access them.
Still needs thinking about though, as Group Policy in many healthcare networks would disable removable media precisely to prevent data theft/loss on removable media...
Sure, you can make an exception on that one terminal, but then you need to ensure that the USB ports are only ever used to receive data from the (chained) USB storage/MRI host onto the network, and that no one is using that open box to egress other data off the network using the USB drive on their keyring.
Or you could keep the USB ports disabled and have a stack of DVDs, with a shredder next to the desk to trash them after the file transfer. Seems slow though - as others have mentioned these can be multi-GB files, so having to wait for the scan to finish before you can write to DVD, then import it on the network console and wait for it to write across is a significant bottleneck in the workflow.
What's probably better is having your MRI console on a separate network with no internet access - in fact the only thing it connects to is the file server where it has only write access and cannot read out. The file server then makes those files available on the main network via another, physically separate interface with bridging disabled so neither network can see t'other.
"Transferring the data from the MRI to the hospital Intranet via sneakernet makes it significantly harder for hackers to gain unauthorized remote access to the machine"
You don't have to go that far. Just provide a link that is one-way image data only. You could even connect from the RS232 port of the MRI computer to a computer that is on the network, but have only the outgoing wire connected (Tx from MRI to Rx of networked computer). Or have both Tx and Rx connected but ensure that the only commands that will be recognised by the MRI computer from the RS232 port cannot do any damage.
You could even connect from the RS232 port of the MRI computer to a computer that is on the network,
As MRI data sizes are expressed in GB, using RS232 appears suboptimal. I think you'd want something optical (easy to make sure the receiver cannot send) with speeds of at least 100Mbit/s. And only hardware flow control signalling back to the sender.
The Ethernet guys need to create a one-way networking standard for stuff like this (SCADA being another good example of something that needs one-way networking, though RS232 is usually reasonable for its data rates). This would get them off the drudgery of creating this never-ending soup of 2.5, 5, 25, 40, 50, 100, 200, and 400 Gb Ethernet over multiple types of fiber and copper media... can't understand why there are ANY copper media standards aside from twisted pair!
I saw some designs for one-way Ethernet cables for Fast Ethernet - you needed to fool it with voltage on the return path, so it wasn't as simple as cutting wires, but doable, and it would work perfectly well for a protocol like UDP that doesn't need acknowledgement. Having a serial cable as a side channel for checksums and requests to resend when checksums don't match would turn the "U" into an "R" without needing two-way communication on the network link.
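The blind-push-plus-integrity-check scheme above can be sketched in a few lines. Everything here - port number, chunk size, localhost addresses - is illustrative only; in a real data diode the receive path is severed in hardware, and the "resend" request travels over the out-of-band serial link:

```python
import hashlib
import socket

CHUNK = 1024  # illustrative payload size per datagram

def send_one_way(data, addr=("127.0.0.1", 9999)):
    """Blind sender: never reads from the socket. Each datagram
    carries a SHA-256 digest so the receiver can detect corruption
    without ever talking back over the network link."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        sock.sendto(hashlib.sha256(chunk).digest() + chunk, addr)
    sock.close()

def recv_chunk(sock):
    """Receiver side: verify the digest; a mismatch would trigger a
    resend request over the serial side channel, not over Ethernet."""
    datagram, _ = sock.recvfrom(32 + CHUNK)
    digest, chunk = datagram[:32], datagram[32:]
    if hashlib.sha256(chunk).digest() != digest:
        raise ValueError("checksum mismatch - request resend out of band")
    return chunk
```

The checksum turns silent loss or corruption into a detectable event, which is what lets unreliable one-way UDP behave like a reliable transfer once the side channel is added.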
"You don't have to go that far. Just provide a link that is one-way image data only."
This is a sufficiently common situation that I'd be astonished if someone couldn't come up with a generic solution. Even putting something like a raspberry pi in between the unpatchable equipment and the hospital network would let the IT admins isolate the risk and patch the sole point of contact.
This. You've got a multi-meeelion eurodollar device which you dare not patch for various (some good) reasons. Stick a 1,000 eurodollar firewall/ips system between the network and the device. Allow what needs to be allowed but nothing else. Nail down the config. You can patch the firewall/IPS.
Yes, I know a Raspberry Pi does not cost 1,000 eurodollars but it must be in a case with a fancy logo, right?
Edit: Should have read more of the comments before jumping in - the point's been made further down but earlier.
"I don't see why the MRI machine needs to be networked."
I don't see why it is allowed to be. In security terms, if the hospital isn't in control of it, then it is no safer than a laptop belonging to a random member of the general public. (The system owner may be innocent in both cases, but because of the lax patch regime, you don't really know *who* is controlling the machine.)
Qubes is a single user system, by design, even if it hosts multi-user VMs. What you propose would better be addressed with a VDI and/or app container setup such as Docker.
For the issue of internet accessibility versus security, the issue is the same as ANY OTHER NETWORK. It requires planning, knowledge and consistent implementation. My experience with medical facilities is that they focus only on the physical aspects of patient care and are often underfunded for that. Tell them their systems may need to be down for patching and they start playing the "it's a matter of life or death" card and straight up ignoring the very real risks they are accepting by kicking the information security can down the proverbial road. It's not that they don't understand IT or have expertise in IT, it's that they don't want to know or to deal with it because it is outside their wheelhouse.
For background, I have worked with several military medical commands. I also have had to spend more time in hospitals than I want, but nurses love to talk shop. From a security perspective, hospitals rate below public schools in my book, both physical and information.
This is an interesting comment, as a lot of the NHS in the UK used to be linked together by an internal network known as 'N3' (Mostly supplied/interlinked via Zen I believe) and is now known as 'The Health and Social Care Network (HSCN)', meaning they are all interconnected semi-internally. I'm afraid I don't know enough about the Layer3 access implications but such a network does indeed exist in the UK healthcare sector.
internal network known as 'N3' (Mostly supplied/interlinked via Zen I believe)
In the days I was forced to use it (as a 3rd-party supplier) it was run by BT - with all the dysfunction that that implies[1].
If it is run by Zen nowadays that can only be an improvement. Mind you, it'll still be the DHS overseeing it, so some level of dysfunctional fail is inevitable.
(Oh - and as a 3rd-party supplier we were supposedly firewalled off from N3 and only the contracted ports were allowed. Except nmap proved that I had pretty much full access. And I made damn sure to have a proper firewall protecting me from N3, since they didn't even bother to filter SMB packets. The idea was that N3 was supposed to be a 'trusted' network - as if a national network connecting thousands of sites with little or no firewalling could ever be considered 'trusted'.)
[1] For an example, if I wanted access to another IP address or IP/port combo, I had to fill in a form. By hand. Which then had to be sent to BT. I once asked if I could fax it and was told I could. Except that the N3 admin office had no fax number.. And didn't accept submissions by email. And only had a PO box number for submissions - that had a roughly 50% loss rate. And didn't accept anything other than the original form - but yet allowed you to phone their service desk who would do changes on the fly if you gave them valid details.. One of the many, many reasons why I was happy to leave that job.
"... don't connect your damn hospital's internal system to the damn internet!"
It's a problem that has been solved elsewhere. There are several approaches to transmitting data from inherently insecure systems to secure systems and for managing internet connectivity. However the work involved is rarely considered when purchasing Medical systems, SCADA systems etc. The people controlling finance usually consider security as "something that gets in the way and costs money". Real PHB stuff.
Diagnostic equipment and patient records should be on separate (v)LANs to the user/public/internet systems and each other. There should be a controlled gateway which enforces separation and antique systems that don't get patched should be regarded as being as much of a threat as the Internet.
I think this report seems to be looking at it the wrong way round, indicating that someone thinks that medical systems should be frequently patched/updated. Yes that's one way of doing things but it's not as good a solution as the above and it causes the regulatory headaches mentioned in the article.
Here's how it works: patient details are on the central booking system and RIS (Radiology Information System); the patient arrives and is booked as 'attended'; patient demographic data is sent to the radiology modality (CT, MRI, CR, DX &c.) on the 'worklist'. The patient goes to be examined and the radiographer picks the patient from the worklist - this ensures all patient demographic details are correctly added to the image files.
Images taken, sent to PACS (Picture Archival Communication System) and (possibly) a local workstation. Images then available for reporting on workstations and use for physicians/surgeons.
An MRI study may be a GB or two, a CT can easily be 5GB to 10GB. So no, sneakernet or USB is a bloody silly idea; it would introduce huge delays in getting demographic data to modality and even bigger delays in exporting images. We'd need to find ten times as many scanners and staff to run them.
Let's be clear, when someone has their head hanging off in A&E, you really don't want to be pratting around manhandling data for an hour before the traumatologist gets to see it. Neither do you want to introduce the high risk of sawing the wrong leg off by manually inputting data with no verification.
In addition, manufacturers have restricted access via N3 for service and QA purposes, kill that and you'd increase downtime and, of course, waiting list delays.
A gigabit dedicated filter upstream of each modality might work; you'd only need to open it to maybe 3 ports (DICOM Q/R and RIS, plus some more secure way for service access).
I can remember delays of a couple of years before FDA approval came through to upgrade CR readers from XP to Vista - and that's only 3 or 4 years back (and I think they are still on Vista) - similarly approval to put antivirus on.
Right now, there are still a load of dedicated modality (£50K+) workstations running XP from manufacturers that I can (but shouldn't) name. Plenty of others have shifted to Red Hat, however.
If there is intent to actually use the data in a meaningful way, at some point there must be a gateway/switch/something between the MedDev network & the primary network. Attacker gets into the primary by whatever means, exploits gateway (probably not an amateur's job admittedly) and then bingo. Random fly-by attacks (ransomware etc) maybe low-risk but a targeted attack should be on a risk register at the very least.
> "This creates a problematic situation in cybersecurity because when a medical device has been tested and sold to a hospital, a vendor is focused on creating the future wave of whatever medical devices they are working on," Zilbiger said.
Healthcare regulations should mean devices have to be thoroughly tested and validated. Full stop.
I don't see how that affects what a device manufacturer chooses to develop next, or when it chooses to do so.
Should a licensed device manufacturer find out a device has been hacked and its performance has been changed in a way that affects patient safety, then that would fall under the "vigilance" part of a manufacturer's obligations. E.g.:
https://www.gov.uk/government/collections/medical-devices-guidance-for-manufacturers-on-vigilance
So are scanners etc. being let off the hook by not being licensed like heart valves, glucose test strips, scales, treadmills etc., or are manufacturers just keeping their fingers crossed?
So are scanners etc. being let off the hook by not being licensed like heart valves, glucose test strips, scales, treadmills etc., or are manufacturers just keeping their fingers crossed?
Anything important or critical to life would be patched.
What it means is that (prior to WannaCry), they were not going to bother going through the entire re-certification process to implement SMB3 on the console. It worked as sold and as described.
The mood is now turning that they are going to be required to keep up with technology. If you sell a bit of hardware with a projected 10-20 year life span, you are going to have to port your software to newer OSs and patch for things like TLS/SMB version deprecations instead of requiring customers to keep old servers around to talk to your old XP-based consoles.
There are two aspects here - regulators need to require it and hold vendors accountable, but they also need to make their approval processes quick and streamlined.
You might be able to put your software through a ruinously expensive one-off testing and approval process the first time, but if the process is based around one-off approvals and is too difficult and expensive for things like point updates (e.g. disabling SMB1/implementing SMB3), then it won't get done. Ultimately the cost of that is borne by the customer (the hospitals), who have a finite budget. So the regulators need to ensure they're striking the right balance between being vigilant, but also letting vendors get updates out in a timely and cost-effective fashion.
Take an old controller PC and a new one. Feed them the same inputs, and check you get the same outputs. You don't even need to hook them to a real scanner. You had a test suite, right?
Divide the cost of the retesting between your customers - Just add it to their maintenance contracts. It's cheaper for them than buying a new scanner or killing someone.
I've been arguing this for years, and no one has ever given me a reasonable explanation of why I might be wrong. Maybe this time.
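The "same inputs, same outputs" check described above is really just a golden-file regression test. A minimal harness might look like this, where `run_controller` is a hypothetical stand-in for however you drive each controller build with a canned input:

```python
import hashlib

def output_fingerprint(run_controller, test_inputs):
    """Hash the controller's output for every canned input, so two
    builds can be compared without storing the raw outputs."""
    h = hashlib.sha256()
    for inp in test_inputs:
        h.update(run_controller(inp))
    return h.hexdigest()

def same_behaviour(old_controller, new_controller, test_inputs):
    """True if the old and new builds agree across the whole suite.
    Any divergence flags the new build for manual investigation."""
    return (output_fingerprint(old_controller, test_inputs)
            == output_fingerprint(new_controller, test_inputs))
```

This only catches behavioural regressions your test suite exercises, of course - which is the regulator's counterargument - but it makes the retest cheap enough to run on every patch.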
The obvious solution is to keep personal data off the devices in the first place. We have devices in space, for example, that are quite capable of having the analysis of their data done elsewhere.
The matching with other personal data, mapping, and diagnostics can be done on a secured system. The reliance on these devices to perform everything on-board will of course make changes difficult, not to mention requiring outages and probably requalification.
Yes, because there's no way that you need to be able to link an MRI scan or a clinical chemistry record to the patient that the results refer to.
The potentially insecure device should only have a "transaction record" number. Send the device the parameters it needs, along with the transaction number. After the scan/procedure, send the corresponding data back up to your data storage, where a secured and regularly re-evaluated system connects the raw scan data to the patient record. You may need a *secured* terminal for the device operator to confirm the data and the patient are properly matched, but this in no way needs to be connected to the unsecured device.
I also expect this equipment should not be connected full-time. Burst transmissions should be adequate; receive the parameters in one burst, send results back in another burst. Firmware updates done with a laptop, checked for cleanliness/security before being brought to the device to be updated, with the device's "burst network" disabled. The less time equipment spends connected, the less chance of vulnerabilities being exploited.
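The transaction-record scheme above is essentially pseudonymisation: the secure side holds the only mapping from an opaque token to the patient, and the modality only ever sees the token plus the scan parameters. A sketch of the secure-side ledger (class and method names are hypothetical, not any real RIS API):

```python
import secrets

class TransactionLedger:
    """Lives only on the secured system. The device is handed the
    opaque token, never the patient identity."""

    def __init__(self):
        self._by_token = {}

    def open_transaction(self, patient_id, parameters):
        token = secrets.token_hex(16)  # unguessable transaction number
        self._by_token[token] = {"patient": patient_id,
                                 "parameters": parameters,
                                 "results": None}
        return token  # this, plus parameters, is all the device gets

    def attach_results(self, token, results):
        """Called when the device bursts its results back up."""
        self._by_token[token]["results"] = results

    def record_for(self, token):
        """Secure-side lookup joining raw data back to the patient."""
        return self._by_token[token]
```

A compromised scanner then leaks only anonymous tokens and raw images, not identities - though as noted above, the operator still needs a secured terminal to confirm the token/patient match.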
"They are really not investing too much effort into upgrading the previously sold medical devices because of security reasons. They might fix something because of health issues very quickly but they're not really looking into improvements that need to be made to [existing] equipment because of cybersecurity."
Cybersecurity is a must and the past few years have just proven what we already knew.
If these businesses don't care about security then there need to be fundamental changes to the way they operate.
If hospitals started demanding security updates for the devices that are sold to them instead of just accepting the status quo maybe things would change. If hospitals don't buy their equipment the industry might just start to listen.
"If hospitals don't buy their equipment (from an industry where all the players operate very similar licence terms) people might die"
Whereas if hospitals *do* buy their equipment from vendors that are running a file sharing protocol that was superseded on security grounds over a decade ago, then people might die.
One issue that I heard about was that the device(s) have a far longer life than the OS on the underlying support terminal. So your MRI/CT/whatever scanner might have been built before a newly discovered problem with a 5/10-or-more-year-old device becomes apparent. So how do you find a way to solve that issue within a budget? It is not a machine-function issue, as the machine still works fine for the role it was designed to do; rather, in many ways it is a site management, i.e. 'client', issue - just as driving a truck into the building, losing power, or having someone walk off with an essential part would be. Or even using the embedded 'terminal' for some 'foreign' purpose, i.e. not the task it was installed to do. I had someone do that with a (non-medical) in-service live device years ago. They were not happy when it overwrote their data - naughty boy.
One issue that I heard about was that the device(s) have a far longer life than the OS on the underlying support terminal. So your MRI/CT whatever scanner might have been built before a newly discovered problem with a 5/10 or more year old device becomes apparent.
I fail to see exactly *WHAT* there is to any of these control systems (medical, industrial, etc) that is SO operating-system dependent. Seems like if something is so dependent on a particular brand/revision of an OS, it was shitty design/coding in the first place. And I can only see *that* element getting worse.
One computer to handle the medical function networked by an internal link to a firewall/security computer. All control of the medical device is on one computer that runs the approved (and often years out of date) software for the device. The second (firewall) computer as it does not control the device can be kept up to date to deal with evolving security threats. There must be NO external (outside the device) network connection except via the firewall computer. Any USB (or similar) ports on the control computer must be behind locked access panels (or disabled with epoxy glue).
"All control of the medical device is on one computer that runs the approved (and often years out of date) software for the device."
There's an additional problem if that computer fails and the approved S/W is unable to run on current H/W. There's a periodic need to update the S/W to keep up with what's available in the market place.
"There's an additional problem if that computer fails and the approved S/W is unable to run on current H/W. There's a periodic need to update the S/W to keep up with what's available in the market place."
That one is easy. You insist that the vendor either maintains it for the period specified in the contract or publishes sufficient information for you to do so. Failure to do either results in paying back x% of the purchase price, where x% is the percentage of the advertised product lifetime that turned out to be untrue.
Couldn't agree more with your solution.
Just had to scan a SCSI disk from a Scanning Electron Microscope whose controlling PC was running Windows 2000. Even better when the machine has PS2 connectors for keyboard and mouse.
As soon as the PC goes so does the microscope!
Once again we get the "oh, that sounds too complicated" comment when we ask to do it, and using DVD burners has its own whole range of issues. :-)
This is of concern to me, as my wife is currently rocking a SynchroMed II pump for baclofen. There are no public details on what interface the unit uses (I suspect some sort of NFC) or any security protocols.
So how can we be sure that the person we next see, for a check up, is bona fide, and doesn't adjust the pump to deliver a massive single dose (breathing and heart failure, probably).
OK, not quite "Homeland", but who's checking ?
"So how can we be sure that the person we next see, for a check up, is bona fide, and doesn't adjust the pump to deliver a massive single dose (breathing and heart failure, probably)."
This is silly. The scenario is that a technical/medical expert has decided to kill you, you are unaware of this, and he has direct physical access to you and to medical equipment connected to you. Yet you are worried about computer security on the device! That is the least of your worries.
The sad truth is that if a medical professional responsible for your care wants to kill you he almost certainly can and computer security makes no difference to this.
The premise of the article is wrong. Security updates for medical devices are a problem, but the issue is nothing to do with a focus on the next generation of device; it is the need to fully check and test any changes to the device before deploying them. The devices concerned are complex, with many processors, an internally heterogeneous software environment and many different options and configurations.

Usually healthcare institutions are forbidden from making software updates that have not been specifically authorised by the vendor. This is sensible - I have seen a medical image workstation broken by applying an unauthorised update.

Hazards arising from connection to a network are required to be considered as part of the regulatory process, and usually the control methods are going to involve disabling every unneeded service and requiring that the device is on a private network with limited access and appropriate security. Allowing people who have no detailed knowledge of a device's design and implementation to make updates whenever they feel like it would be a recipe for many broken devices and an occasional disaster.
It's a tough one, though, isn't it? There's a bit of a difference between finding out that the kit has a port open and discovering that the CPU at the centre of the kit is vulnerable because of fundamental architecture design. Creating and pricing a contract between developer, vendor and customer to cover that range of installed performance would be tough. You can use words like "foreseeable" and "fit for purpose", but someone has to be paid to carry the risk - either in up-front costs or lease/support fees.
Air gapping looks (to me - but it's not my field) like the best tech option, but at the cost of losing the diagnostic and analysis benefits that connectivity probably brings.
"Air gapping looks (to me - but it's not my field) like the best tech option,"
I think that is too strong. Taking Wannacry as an example, if the dodgy device has the power to write results to a network share, but only supports an insecure protocol, you can protect it by placing that network share on a raspberry-pi-sized device with two network sockets. The one facing the device supports the dodgy protocol and the one facing the rest of the hospital network is secured by the IT admins to current standards.
Call it fruit-gapping, if you like. The point is that the fruit-gapping device is cheap, simple to set up, transparent to the device, and entirely owned and controlled by the hospital staff.
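A fruit-gapping device of this kind is little more than a one-way file relay: it accepts writes on the device-facing share and republishes them, after a sanity check, on the hospital-facing side. A toy version of the core loop (directory names and the size limit are illustrative; a real one would also put the two shares on physically separate interfaces):

```python
import shutil
from pathlib import Path

def relay_once(dirty_dir, clean_dir, max_bytes=20 * 2**30):
    """Move each finished file from the device-facing share to the
    hospital-facing share. Nothing is ever copied the other way, so
    the dodgy device can write results but never read the network."""
    dirty, clean = Path(dirty_dir), Path(clean_dir)
    moved = []
    for f in sorted(dirty.iterdir()):
        if not f.is_file():
            continue
        if f.stat().st_size > max_bytes:  # crude sanity check on size
            continue
        target = clean / f.name
        shutil.move(str(f), str(target))
        moved.append(target)
    return moved
```

The point of keeping it this dumb is that the relay itself has almost no attack surface, and the hospital-facing side can be patched on the IT department's normal schedule regardless of what the modality vendor supports.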
Again, a lack of basic understanding. Some medical machine is not running 50 VMs and random user-generated processes in a cloud rack. It's a dedicated hardware appliance; Spectre/Meltdown are not relevant. Generally the PC component is for control and data analysis of the hardware system, like an old CNC or a shiny new EWACS. The problems normally are that you have to leave lower-security protocols in use on the net because the control PC "doesn't talk TLS 1.2". And of course you don't want the control system compromised because IE5 is running on XP and the operator was surfing porn sites from the control console. These are manageable issues with proper care.
And the idiot users at my company that just keep clicking on effing phishing and putting in their creds to get the doc. 200 people out of 1500 clicked the link this time. Never heard of xyz cleaning service but I sure do want to see the invoice from this dude I've never had an interaction with, ever, in our company! Since I'm now on compromised.com's https site, I'd better put in my o365 email and pass, plus my security questions, cause my password didn't work! It's stunning.
East-west MIME de-fanging coming to a neighborhood near you... It's only money.
Leave the network as is, change the router at the edge, and enact a policy that no device gets internet access without explicit consent.
The X-Rays and scans can be flung around internally all they like, other changes may be needed later to help but in one go you've stopped random outside access to a vast chunk of the equipment.
Device techs and what not could be given VPN or similar if they need to connect to the devices.
I know it's not a full plan but it is a lot simpler than half of what I've seen in the comments here, at the end of the day no one is going to go carving up the network or reworking everything.
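For what it's worth, that edge-router policy can be sketched in a handful of nftables lines, assuming the WAN interface is called wan0 and 10.0.5.10 is the one box that has been explicitly consented (the interface name and addresses are invented for illustration):

```
# Hypothetical edge-router forward policy: no device reaches the
# internet unless it is on an explicit allow list.
table inet edge {
    set internet_allowed {
        type ipv4_addr
        elements = { 10.0.5.10 }   # e.g. the one box vendors VPN into
    }
    chain forward {
        type filter hook forward priority 0; policy drop;
        ct state established,related accept
        # Internal traffic (X-rays, scans, PACS) still flows freely
        iifname != "wan0" oifname != "wan0" accept
        # Only explicitly consented devices may go out to the internet
        oifname "wan0" ip saddr @internet_allowed accept
    }
}
```

Everything internal carries on untouched, which is the appeal: one device changes, not the whole network.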
I'd personally be more worried about the number of long-serving pieces of kit (like MRI scanners) that are operating on Windows XP machines, or Vista, and are no longer supported.
Kit like MRI scanners is not replaced often, and neither are the attached peripherals often updated with new firmware or drivers.
It's the same with most industrial equipment with a 20-year life (like automated car production plants), using SCADA or PLC-type implementations. You need micro-segmentation and other mitigations (IPsec) to try to protect this vulnerable but valuable kit. It's perfectly doable - it's just that the best infrastructure guys work in finance/defence, where the money is... not the local hospital.
I've worked with products that had long arduous test cycles. Too bad; it's time to add vulnerability management to the criteria, and strong hardening and security controls to the products--along with a path to patching. Yeah, it's going to cost, but hey: "human life".
> In banking, for example, you can accept a few glitches but when it comes to human life you cannot have that, of course
I'm surprised that anyone can say that with a straight face, given that hundreds of thousands of patients die from medical errors every year.
Every time windows issues an update we have to do software testing before we can allow that update onto machines in a doctor's office
Perhaps we should also repeat all the EMC testing .
And we could repeat the usability testing, with 10-20 consultant surgeons, in case it changes the frame rate and perceived latency.
Can't be too careful - it does mean that the PC in the doctor's office for viewing your X-rays is now going to cost as much as an MRI machine - but you do have an extra £350M/week to pay for this from next year.
There needs to be some regulation of the vendor and/or customer maintenance and upgrade process in the certification of such equipment. Something akin to the way the FAA certifies both the aircraft design plus the operator, operator's maintenance process and manufacturer as a condition for the safe operation of a commercial airplane.
Manufacturers will have to commit to an ongoing maintenance and upgrade process, including security updates. Or their customers risk losing certification to use that equipment immediately. Customers and vendors will have to make purchase contracts and pricing decisions based upon this support. And manufacturers will be motivated to rely on maintainable hardware and software platforms as a strategy to keep support costs down.
The infamous Therac-25 machine was directly dangerous - historically because of a race condition in its control software, triggered through the operator interface.
But I thought that MRI scans were 'mostly harmless', in the sense that if you had 100 MRI scans in a week, it still wasn't a significant direct health risk issue.
CAT scans, being X-rays on steroids, would be a better current example of a machine that should be safeguarded.
Did I miss a memo about the dangers of MRI scans?
I worked 20 years ago developing research MRI type equipment.
Firstly MRI scans are totally harmless.
Secondly there are usually quite a few computers involved in an MRI/NMR type setup.
a) there is the box you sit in front of to operate the machine, which is what the conversation is about
b) there is another computer (used to be a RISC chip on a EuroCard) embedded in the machine/console that runs a bespoke operating system that controls the machine end of things. This is usually connected to the console by a private ethernet link.
c) then there is a third computer (sometimes called a process controller) that actually controls generating the pulses and switching on the receiver/digitiser. This is semi-autonomous and only updates from computer (b) at the end of each cycle. In NMR/MRI timing is key, and therefore the activities on this card cannot be interrupted. This card is usually programmed in a very obscure bespoke language that is known to only a small handful of people.
There is absolutely no need to have a browser, mail, etc. on an MRI console: it is a single-purpose, dedicated machine. It can, in that respect, be totally locked down.
It would be cheaper and better to have a separate computer for browsing and email than to try and mitigate risks like these. Or give them an iPad.
Honestly, it is close to impossible to reprogram that lot so that it works and is totally secure. There will have been ten PhD-level guys working on developing and debugging it over a period of years. I spent long nights staring at an oscilloscope wondering why the sensible commands I had programmed had failed to produce the required pulses: usually because some obscure delay was not long enough to allow some element of the hardware to prepare for the next cycle. To redo and retest the three interacting OSes dynamically every time something came up would be a total disaster, as there are so many variations on each type of machine with specialist add-ons.
As I posted last year when this came up, the only real solution is to put the data straight onto a local NAS that is running up-to-date software. You can then have the insecure protocol (SMB1 etc.) on the private side of the network with all other ports nailed shut, and the networked side of the NAS open with, say, only SMB3, again with all other ports nailed shut. If you are really paranoid you can also put a physical firewall on either side (public and private) to enforce those preferences, so there is absolutely no way the ports can leak between the public and private sides; that would mean hacking three lined-up devices serially, as they sit one behind the other. Because the NAS is a modern machine you can run encryption on the data as well, which would mitigate the data-leak risk substantially.
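One way to realise such a two-faced NAS - a sketch under assumptions, not a tested recipe - is to run two separate Samba instances bound to the two interfaces. The interface names, share paths and the choice of NT1/SMB3 cut-offs below are illustrative:

```
# /etc/samba/smb-private.conf - scanner-facing side, legacy SMB1 only
[global]
    interfaces = eth0
    bind interfaces only = yes
    server min protocol = NT1    ; allow the old dialect the scanner speaks
    server max protocol = NT1

[scans]
    path = /srv/scans
    read only = no               ; scanner writes results here

# /etc/samba/smb-public.conf - hospital-facing side, SMB3 only
[global]
    interfaces = eth1
    bind interfaces only = yes
    server min protocol = SMB3   ; modern clients only

[scans]
    path = /srv/scans
    read only = yes              ; hospital side only reads results
```

In practice the two instances would also need separate pid/lock directories, and the firewall still has to nail every other port shut, as described above.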
At the end of the day it is of more benefit to people to keep these scanners running and getting people the treatment they urgently need, and then to take sensible, proportionate steps to mitigate risk. If you do gain control of the machine, what do you achieve when there is no data on it (it would be on the NAS, encrypted)? You might be able to damage one of the amplifiers by turning it on until the thermal fuse trips. You might, with ultra-specialist knowledge, be able to turn the 500W amplifier on with the very sensitive receiver/digitiser enabled and fry it with the RF power. All very specialist and very, very obscure - and it would be bloody obvious which of the 20 people in the world who knew how to do it had done it!
AC read only the thread title "Is an MRI Machine really a good example?" and then instinctively noted "Therac-25 is a radiation therapy machine."
@AC You failed to read the first sentence where it was already noted, "The infamous Therac-25..."
Careless reading, but good instinct.
A normally conducted MRI scan itself is harmless. And unlike X-rays, there is no cumulative/repeated-exposure risk and no long-term health implications.
There are some dangers, but not from a normal scan:
Field. It's a really big magnet. Ferromagnetic tools and items flying into the bore are dangerous, every MRI site has strict screening to make sure you don't bring something magnetic near the scanner. Medical implants (most are now non-magnetic) and injuries involving shrapnel or metal splinters need careful checking.
Heating. There are strict specific absorption rate (SAR) limits, and people often get weighed before a scan. Sequences are designed not to exceed safe power limits, and if you try to run one that might exceed them for a given weight, the scanner won't let you. These are set conservatively. Tattoos and certain metal implants absorb more RF power, so there are lower SAR limits for people with these, which may restrict the sequences you can use.
Contrast agents. MRI doesn't fundamentally require contrast agents, but they're used for certain things (often tumour investigation), injecting contrast agents for any imaging method carries a risk of adverse reactions.
It may not be the best example, but all parts of the infrastructure need to be protected/secured. It could be a valid point in that any control device left unsecured can be used as a jumping-off point for the actual target. An MRI machine may not be a target in itself, but its wide-open, unsecured MSWinXP controller makes for a nice "zombie" to infect and infiltrate the rest of the network.
something something something raspberry pi
...watches with curiosity as register readers rediscover the concept of a firewall.
Air-gapping such a system is likely overkill. It's not a high-priority target in the way key infrastructure (power-plant control systems or your home uranium-refinement project) is. It's a potentially unpatched subnet of computers (as another poster mentions, in reality there's more than one machine involved), which needs to be able to transfer scans out and possibly in. The risks are loss of personal data and undirected malware causing loss of service or (theoretically) hardware damage. You can definitely firewall it down to only allowing the required PACS connections to prevent that. If you air-gap instead, you take on the possibility of stuff going wrong with the manual data transfer, again leading to loss of personal data and loss of service...
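Firewalling it down to only the required PACS connections might look something like this, assuming the scanner subnet is 10.20.0.0/24, the PACS server is 10.1.1.5 and it listens on the common DICOM port 11112 (all of these are made-up example values; real sites vary):

```
# Hypothetical ruleset for the scanner subnet's gateway:
# the unpatched machines may talk DICOM to the PACS server and nothing else.
table inet scannernet {
    chain forward {
        type filter hook forward priority 0; policy drop;
        ct state established,related accept
        # Scanners push studies to PACS over DICOM
        ip saddr 10.20.0.0/24 ip daddr 10.1.1.5 tcp dport 11112 accept
        # PACS may initiate retrieves back to the scanners if needed
        ip saddr 10.1.1.5 ip daddr 10.20.0.0/24 tcp dport 11112 accept
    }
}
```

With a default-drop forward policy, anything the malware tries beyond those two flows simply never leaves the subnet.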
A lot of high-tech solutions are being offered here. How about an old-fashioned, low-tech solution? For many situations a nailed-up data line would suffice, or even a dial-up line for equipment not updated often. There were hospitals, insurance companies, and vendors before the internet; they were interconnected with dedicated circuits. The only real difference between doing that and using the internet is cost - but just how much are you saving if you have constant nightmare security concerns using the internet?