Well, they're not using it
So why doesn't NASA just open source the bloody thing while they potter about, working out how to get to NEO next without involving the Russians?
NASA officials failed to wipe sensitive agency data from computers before releasing them to the public, a violation of procedures that are part of the plan to securely end the Space Shuttle program, an audit released on Tuesday said. Kennedy Space Center in Florida – one of four NASA sites with reported weaknesses in the …
please forgive my ignorance.
Considering how sensitive the data that _used_ to be on the PC before it was retired, wouldn't it be safer to _destroy_ the PC instead of trying to get a few quid by selling it? How about selling the PC without the hard disk?
I understand the economics of selling those PCs second hand, but even with a proper erase of the data, is it worth the risk? (Keeping in mind where they were used and by whom.)
thanks for the funny comment,
Anyway, _destroyed_ hard drives are useless hard drives, aren't they? (Note: physically destroyed.)
P.S. note that I am not making any reference to the current article, but the practice of selling PCs that _might_ have sensitive data on them. Why take a chance with your auditor's PC?
Err, no. It all depends how they were destroyed.
Properly wiped (not just re-formatted), then it shouldn't matter. Merely shattering the platters of an unwiped disk is not enough to stop a well-resourced and determined individual/group/government from retrieving data from the fragments. If you're not going to wipe the disk first then the physical destruction needs to be more complete, e.g. grinding the platters to dust.
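The "properly wiped" point above can be illustrated with a small sketch. This is a toy that overwrites a file with random data; it is my own illustration, not any poster's tool, and a real wipe would target the raw block device (e.g. /dev/sdX) with a purpose-built utility and verify every pass:

```python
import os
import tempfile

def wipe_file(path, passes=2, chunk=1024 * 1024):
    """Overwrite a file's contents in place with random data.

    Toy illustration of overwrite-wiping only; a real disk wipe
    operates on the whole block device and verifies each pass.
    """
    size = os.path.getsize(path)
    for _ in range(passes):
        with open(path, "r+b") as f:
            remaining = size
            while remaining > 0:
                n = min(chunk, remaining)
                f.write(os.urandom(n))
                remaining -= n
            f.flush()
            os.fsync(f.fileno())  # force the data to stable storage

# demo: a temp file standing in for a disk full of sensitive data
fd, path = tempfile.mkstemp()
os.write(fd, b"TOP SECRET SHUTTLE DATA" * 100)
os.close(fd)
wipe_file(path)
with open(path, "rb") as f:
    assert b"TOP SECRET" not in f.read()  # original bytes are gone
os.remove(path)
```

The same principle is why a mere re-format is not enough: formatting rewrites the filesystem metadata, not the data blocks themselves.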
If the platter is only cracked, you might be able to save a few files here or there, but if it's in more than a couple pieces, that data is gone. Especially since the way most hard drives are "damaged" results in very, very small pieces.
And it's not like you can't just do a DNS lookup to find their public IP addresses. And the whole 'export of arms' thing actually covers encryption as well, so the Shuttle could have something like an SSH client on it and that would prevent it from being exported anywhere except Canada.
As for the hard drives, why don't they just load them into a rocket next time they launch a satellite? The hard drives could be ejected as the rocket body is coming down and then they'd just burn up in the atmosphere. Or they could just leave them under the rocket as it launches.
Flames, as that is what they need to do with their hard drives
"And it's not like you can't just do a DNS lookup to find their public IP addresses."
You think the WAN IP they use for their PRIVATE network connection to the internet is the one matching 'www.nasa.gov' that you can find with Whois on Network Solutions?
You clearly need to learn something about the Internet and Secrecy.
There is something fundamentally wrong with the way we manage and track data, and I believe it has to do with the antiquated filing systems that we use on computers.
A PC filing system that used a file structure which included a sophisticated metadata system would be able to track all copies of a file no matter where they were located as the copying process would automatically update records, central, local or otherwise, with the locations of all copies of a file.
In such a schema all files would be unique. Whilst the encapsulated contents of a file (the document etc.) would be identical from one copy to the next, the file's accompanying metadata would not. The metadata would include the file's history and various other information: when it was copied, by whom it was copied, revised or updated, on which machines copies were stored, where archive copies were kept, and so on.
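The per-copy metadata record described above could be sketched roughly as follows. All field names, and the idea of a central registry keyed by content hash, are my own illustration of the scheme, not any real filing system's format:

```python
import hashlib
import socket
import time
import uuid

def make_copy_record(content: bytes, copied_by: str, registry: dict) -> dict:
    """Create a metadata record for one copy of a document and log it
    with a central registry, so all copies can be enumerated later."""
    record = {
        "file_id": str(uuid.uuid4()),  # every physical copy is unique
        "content_hash": hashlib.sha256(content).hexdigest(),
        "copied_by": copied_by,
        "host": socket.gethostname(),  # which machine holds this copy
        "copied_at": time.time(),
    }
    # the central registry indexes all copies of the same document,
    # so a disposal check can ask "where does this file still live?"
    registry.setdefault(record["content_hash"], []).append(record)
    return record

registry = {}
doc = b"shuttle telemetry notes"
a = make_copy_record(doc, "alice", registry)
b = make_copy_record(doc, "bob", registry)
assert a["file_id"] != b["file_id"]           # copies are distinct files
assert len(registry[a["content_hash"]]) == 2  # but tracked together
```

Under this schema, a "clean the machine before disposal" check reduces to querying the registry for records whose host matches the machine being sold.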
Whilst this sounds awfully big-brotherish and it is, it is in fact little more than the schemes used in the more sophisticated manual document management systems that have been around for centuries. Many old fashioned manual document management schemes have both file cover sheets and central repository records that show and log the file's usage, who has possessed/read or added info to it etc.
Both UNIX and PC (MS Windows) filing systems are a sick joke when it comes to document management, as keeping tabs on all aspects of important files is nigh on impossible: their built-in file attributes simply can't support the necessary metadata (the necessary records/indexes do not exist).
Had a proper document management filing system been in place when the PCs were disposed of then central records management would have specifically warned of the need to clean certain specific records off the equipment before disposal.
Of course, this would require existing operating systems to bolt on new document management filing systems. In practice, this would mean a rewrite of MS Windows, for instance. Not that any of this is a new concept: Microsoft promised the Win-FS database filing system with Vista but it was never delivered and we've still no idea when it will be. (Win-FS is not the complete solution but it would have gone a long way towards fixing the problem, as at its core it had a database which could have tracked file metadata.)
If organizations such as NASA are really serious about document management and security then they'll force companies such as Microsoft to include a sophisticated filing system within the O/S and they'll do so on the threat of using something else.
> the copying process would automatically update records ... with the locations of all copies of a file.
That fails as soon as the data is passed off such an FS and onto one which does not support the metadata; even if you record the copy (which is certainly not guaranteed in most environments), the receiving machine has no facilities to retain the copying info, and would neither honour it nor pass on any future information. Your solution requires 100% coverage - so all those CDs, DVDs, memory sticks etc. all have to go in the bin; each one breaks the model.
A far better solution is to have someone actually being *responsible* for signing off that the machine has been properly wiped. Selling a machine that has not been wiped => instant dismissal, and probable negligence claims (or worse). It is only when such bureaucrats face personal liability that this situation is prevented; having a department or company shield perpetrators from the consequences of their actions means that it's not quite as important to get it right...
Or, of course, you could just not sell any recordable media. I wonder if knocking out second-hand PCs really makes much difference to the budget of an organisation like NASA.
It's difficult to explain systems in these posts. Too comprehensive and long, no one will read it and a precis has vital bits missing (as in my previous post). A clue that I omitted key points for brevity comes from my comment that Win-FS only goes part way to solving the problem. What I'm about to add is also an oversimplification.
A database filing system that makes extensive use of metadata would intrinsically include both encryption and authentication, which means files can be locked (dongled) to the hardware or O/S or both, or even the logged-on user, and would need new authorisation or certificates if used elsewhere. Files copied to an environment that does not support extended metadata (i.e. copying into any normal Windows system) would (a) not authenticate and (b) not decrypt. Moreover, both authentication and encryption would be integrally controlled by the central secure server. Stolen files, or those accidentally left on a PC for sale, would remain encrypted. As each file has its own unique encapsulation--hence encryption key--files left on a PC for disposal would be readily identifiable, and the sloppy user/disposer easily singled out for sanction.
Of course, this doesn't stop data getting stolen by various other means such as copying directly off the screen etc. but secure data management is significantly easier when data is locked to specific machine/environment and only decrypted after various authentication processes both local and remote.
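The per-file keying idea sketched above can be made concrete with a toy. This assumes a central master secret and derives a unique key per file ID; the XOR-with-SHA-256-keystream cipher below is NOT real encryption (a production system would use a proper AEAD cipher), it just demonstrates that without the right derived key the file stays opaque:

```python
import hashlib
import hmac
import os

# held only by the central secure server in the scheme described above
MASTER_SECRET = os.urandom(32)

def file_key(file_id: str) -> bytes:
    """Derive a unique per-file key from the master secret."""
    return hmac.new(MASTER_SECRET, file_id.encode(), hashlib.sha256).digest()

def xor_keystream(key: bytes, data: bytes) -> bytes:
    """Toy symmetric transform: XOR data with a hash-derived keystream.
    Applying it twice with the same key recovers the plaintext."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

doc = b"satellite control procedures"
ct = xor_keystream(file_key("file-0001"), doc)
assert ct != doc
# only the correct per-file key recovers the plaintext
assert xor_keystream(file_key("file-0001"), ct) == doc
assert xor_keystream(file_key("file-0002"), ct) != doc
```

The point of the per-file key is exactly what the post describes: a drive full of such files sold by mistake yields only ciphertext, and each file's unique key identifies which record was mishandled.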
This is not some fanciful idea of mine, specialised systems such as this already exist and have done so for years, furthermore, I've worked with them.
The fact is that even with responsible people in charge you cannot rely on absolutely 100.000% of PCs getting scrubbed before sale/disposal every time. There are just too many organisational/operational issues/variables involved in any corporate or government environment to do so, especially so if security has to be guaranteed over time--say for many years (systems get slack).
It's just a fact: very secure data simply cannot be considered secure if it exists in plain-text form on normal computer systems, even if those computer systems are physically bolted securely behind the doors of Fort Knox.
Yes, sorry, I thought you were advocating DMSs being built into OSs so my comment was more: Leave the OS as it is and use a DMS.
I agree with you though that if you have confidential documents they should be managed using a proper DMS rather than just relying on a file system and a few permissions.
> It's difficult to explain systems in these posts
Actually, it's not that tricky. The difficulty you're having is that the system doesn't work; it's been tried in many different guises, and it always fails eventually.
> A clue that I omitted key points for brevity comes from my comment that Win-FS
> only goes part way to solving the problem.
It doesn't actually solve any part of the problem - as we'll see from your clarification post.
> would include both encryption and authentication which means that they can be locked (dongled)
*That* is your DRM. The DB-backed FS is irrelevant to it.
But have a look round at all the different methods of preventing software copying that have been tried over the years - how many of them have been successful in the long term?
But as soon as that DRM is removed - the whole system falls apart.
Do you think you can write an unbreakable DRM? Lots of people have tried. Success tends to be a bit rarer, though.
> specialised systems such as this already exist and have done so for years
But they are not infallible. They are an aid to data security, but they do not work on their own; they need to be backed up by a data-security mindset in the people using them.
> The fact is that even with responsible people in charge you cannot rely on absolutely
> 100.000% of PCs getting scrubbed before sale/disposal every time.
Well, you can. I built part of a system to do exactly that some while ago. HDDs have serial numbers, and it's easy to create a record of successful wiping (I modified nwipe to record to a database). Then - and this is the important bit - you make it clear to everyone dealing with disposal that any HDDs not in the database are company property, and removing them from site is theft, and thieves will be prosecuted. The database is trivially checked by a bootable image (USB stick is faster, but you need CDs for some of the older kit), giving a simple yes/no answer as to whether or not this system is considered "clean".
The strength of this system is not in technology - that's pretty simple stuff. It works because people have a personal motivation not to circumvent the rules.
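The wipe-log part of the system described above (serial numbers recorded on successful wipe, then a yes/no "is this clean?" check at disposal time) could be sketched like this. The table and column names are my own invention, not the poster's actual schema:

```python
import sqlite3

def init_db(conn):
    """Create the wipe log: one row per successfully wiped drive."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS wipes ("
        "serial TEXT PRIMARY KEY, method TEXT, wiped_at TEXT)"
    )

def record_wipe(conn, serial, method="nwipe"):
    """Log a drive as wiped (the wiping tool would call this on success)."""
    conn.execute(
        "INSERT OR REPLACE INTO wipes VALUES (?, ?, datetime('now'))",
        (serial, method),
    )

def is_clean(conn, serial):
    """The disposal-time check: is this serial in the wipe log?"""
    row = conn.execute(
        "SELECT 1 FROM wipes WHERE serial = ?", (serial,)
    ).fetchone()
    return row is not None

conn = sqlite3.connect(":memory:")
init_db(conn)
record_wipe(conn, "WD-WCAV12345678")
assert is_clean(conn, "WD-WCAV12345678")   # logged: safe to dispose
assert not is_clean(conn, "ST-9999999999") # not logged: stays on site
```

As the post says, the technology is the easy part; the bootable checker just reads the drive's serial and queries this table, and the policy around the "no" answer does the real work.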
I think your solution is overkill -- there is no reason for, for example, my home PC to have an overly redundant file system requiring gigabytes of metadata when all I use my PC for is watching movies and surfing the internet. OSs should not be rewritten to make them even more complex to deal with this type of problem.
Instead, something called a "document management system" can be used, which typically bolts onto Word, Excel, Outlook and any other office programs and provides all the functionality you suggest. I am led to believe that these are common in the legal world, and I'm sure in other places too.
I never mentioned home PCs; we were discussing data that might have serious ramifications if it got loose. For example, satellite control codes in the wrong hands could be used to take a satellite out of orbit. I thought this would have been obvious from the news report itself.
I'm sorry that my precis of the outlined system omitted key security features for brevity. I've attempted to clarify this in my post '@Vic.' Perhaps you should scan through it.
You might look at some of the less well-used commercial OSes like OS/400 (IBM iSeries) and DEC's VMS.
IIRC both had substantial provision for creating links to single versions of a file rather than creating multiple copies for special uses. VMS also did "rollback" to earlier versions (disk space permitting)
No PC hard drives: boot from a CD/DVD and store information (i.e. demotivational posters, LOLcats and other essential data) on a NAS.
Never dispose of a PC with a HD installed - take it out and disassemble it (or, in my case, store it in a box in the shed until it rusts over) and if you're bored and paranoid attack the disk with a Dremel (TM).
Just like the client server systems of yesteryear (in my case a Vax) - lite clients have a lot going for them.
I agree with you fully about the removal of hard disk drives, but the fact is that a secondhand PC without disk drives is worth just about nothing (buyers all seem to know this). Even if the buyer intends to put in a new/bigger hard disk, he still wants to see it working before he purchases it; thus even without an operating system a hard disk adds value from its boot-up/log-on info (it shows that the disk subsystem is at least working).
Consequently, accountants etc. want the machines sold as intact as is possible (it's the conflicting requirements/issues problem so even edicts to remove HDs often go unheeded). In my addendum post '@Vic' I cover some of the problematic issues with data which should be highly secured.
Re hard disk disposal.
Ideally, one should dispose of HDs in the newer specially made hard disk pulverizers, which reduce them to powder in a few seconds, but they're still not that common. Alternatively, as I discovered many years back, a steamroller is an excellent way of dealing with 'used' HDs, but they're very uncommon nowadays (unless there are roadworks going on outside your house).
However, a really good substitute is the ubiquitous forklift truck and just about every organization has one these days. An enterprising forklift driver can reduce a HD to rubble within a few seconds--for example, strategically placing a drive across say a gap in concrete will snap the casting into many pieces (even just running over it on a flat surface will often do much the same). If the disk platters are of the glass variety then all you end up with is tiny little shards (which must be swept up as they're sharp). (The physics of entropy says the data is all still there but I'd place very good money on any forensic type not being able to extract any useful info before he was carried away in a box.)
Again, be careful if using a Dremel or hammer on the newer drives: the platters use a glass material as the substrate rather than the traditional aluminium, and this can shatter somewhat violently (as does any glass when hit with a hammer).
Is that this data should actually be going to a museum.
Bugger the "security" - what? Some alloys? The fact that the Shuttle launched and retrieved a few KH-11 spy sats? The fact that the Hubble was a repurposed KH-11 with deliberately knobbled optics so the hoi polloi wouldn't find out how good the KH-11s were? (For the doubters... http://www.globalsecurity.org/space/systems/kh-11-schem.htm)
The Shuttle program, maligned and overpriced as it might have been - it will be sad to see it go the way of the Apollo/Saturn program, with the documents and tools destroyed, the files erased and only a few poor gutted wrecks left to rot in poorly designed museums, or worse, left outside on poles, or even worse, used for target practice by DARPA. Don't tell me they wouldn't love that!
I find it sad that such a wonderful technological achievement of the human race can be thrown away and forgotten in the name of 'security'.
What a sad frelling race of animals we are.
The head of IT rides the people responsible for wiping the data so hard that they have little or no time to do things properly. Every second and minute is scrutinized to make sure they are working, and they are so overworked that they put this hardware in a room to deal with at a later date. After a few weeks the old computers are thought of as low priority and forgotten about; then something like a bug spray or a facilities move comes along, IT moves to a new building, and the computers get lost, or something like that.
It is not the employees who are supposed to do the wiping that should get fired; it's the management that rides the IT people like a horse that needs to be fired.
YOU MICROMANAGEMENT JERKS!
"I dont understand why they dont use VM desktops and just boot to an IP with an image of a desktop they need. All the data is then stored on a server. No local hard drives to mess with."
Well, it might help you to know that the Shuttle programme started building them around 1974, started flying them around 1981, and has carried on servicing them ever since.
What you're talking about did not *exist* for most of the Shuttle's history (not in the form you're talking about).
If you don't understand the context you'll probably not understand why *this* solution was chosen.
Smiley as we were all young once.
Not so long ago, I bought an SGI Onyx on eBay. The previous owner had never managed to get it going, being unable to boot SU because of a hardware password. I, being more of a hacker, removed the HW password without too much trouble, got it booted... and found it was a server from NASA Goddard Space Flight Center, with a LOT of interesting stuff on it!