
Just tell Google - they will then tell MS and give them 90 days to fix before publishing it.
An ethical hacking student at the University of Northumbria has claimed that the university's ethics board and the Wassenaar Arrangement have forced him to delete some references to exploits from his final year dissertation. Grant Willcox, a BSc student studying Ethical Hacking for Computer Security, claimed in a blog post …
They were removed from my hands with a thank you (and a nice check), the hard drives scrubbed, and the documents put somewhere I no longer had access to - even though I obviously had the clearances.
I can't tell you that these documents might have involved penetration testing of various networks back in the '80s and '90s. I'm sure everything is much more secure now.
"I'm sure everything is much more secure now."
Presumably your papers were buried where the sun doesn't shine, because recent history indicates they haven't even managed to get around to resetting default passwords yet. It's a pity that Joe Public pays the price for the fuck-ups and willful ignorance of the clowns running the circus, and there doesn't seem to be any way to replace them with people from outside the circus community.
"Does this imply that you can't tell foreign software companies about security holes you have found in their products?"
AFAICS, yes. It would also be illegal for any criminal to make use of the same holes should they discover them. Smart, very smart. Aren't we lucky we have such smart people looking after us?
The changes to the WA proposed in December 2013 are now coming into effect in the signatory countries, and they knock security research on the head. I'm sure all British universities have been advised, but one of them had to be first.
So no commenting on the mountain of brain-dead OpenSSL bugs on, say, this forum. The world will be a better place for it, citizen.
"
"it is not possible to release the exploits publicly or even to other researchers outside the UK without an export license"
Does this imply that you can't tell foreign software companies about security holes you have found in their products?"
I suspect you can release the exploits *privately* to the vendor in question. In my experience that doesn't work very well, though: 100% of the vulns I found & reported (all privately) were ignored by the vendor despite being exploited daily.
I suspect that in the vast majority of cases it is the possibility of public disclosure that actually motivates vendors to fix their products; consequently, it will be a massive loss to everyone if public disclosure is criminalized.
So the Wassenaar Arrangement will now discourage people from:
a) Looking for security vulnerabilities and publishing information about them,
b) Informing companies and the general public about these security vulnerabilities so they can take steps to protect themselves.
Correct me if I missed anything.
As any abused child, battered wife, or security researcher will tell you: hiding problems doesn't make them go away. While the university and the researcher will most likely comply with the letter of the law, the person who decides to use vulns for unlawful, unethical purposes will not.
How are we supposed to protect ourselves from the latter party? Will this law help us catch said party when the vuln they are using is classified? Hmmm, maybe... but probably not. If anything it will make the bad guys' lives much simpler. The spread of knowledge and ideas is extremely difficult to legislate against.
We would be better off restricting the export of mentally retarded lawmakers and asinine policies.
Actually what it does is ensure that the government is the only entity which can receive/triage information about vulnerabilities, which is the whole point. It takes self-defense out of the hands of the people and makes it a national security policy issue. Why pay the market rate for vulnerability research when you can make the entire existing market illegal and make yourself the market? Then, when new vulnerabilities are discovered, you get exclusive access to all of them before the general public is aware of them. If your focus is to gain leverage on domestic industry, then it's a smart move. If your focus is national defense, then probably not, because nation states always find a way around export controls.
"Actually what it does is ensure that the government is the only entity which can receive/triage information about vulnerabilities, which is the whole point."
In the somewhat cretinous headspace inhabited by the people who make these laws, this might make some sense.
The bad news is that outlawing vuln research, discovery and disclosure will do no such thing; it just means vulns are less likely to become public knowledge, and manufacturers are less likely to ever fix them.
If it became illegal to report that a certain door lock could be sprung open by a special one-two hand twist, the people who had bought that lock would not know about it. But clever burglars would know, and those homeowners would be defenseless against their knowledge.
Stupid, stupid, stupid does not begin to describe it. In an inter-networked world where knowledge is transmitted at the speed of light, it just won't work. It is time to accept this.
"Correct me if I missed anything."
Open source projects jointly worked on by people all over the world. Finding and fixing OpenSSL vulns is probably the most obvious current example.
For that matter, companies like MS and Google have programming teams all over the world. If an MS bod in Reading, UK finds a bug/vuln in code written in Redmond, they can't inform Redmond; they might be able to fix it and send a patch, but they can't tell them why.
Oh what a tangled web we weave when the law of unintended consequences comes into play.
Looks like it's time to quit being a white hat and become a grey hat. I won't recommend becoming a black hat, as selling 'sploits to card scammers and such is greasy. But I can guarantee that if I found an exploit I wanted to tell people about, you could tell me no all you want and I would do it anyway (unless there's an ACTUAL concrete reason for that "no").
Why not simply not export the research? Print copies, hand electronic copies to the examiners, but do not publish it online. Now you haven't exported it; problem solved. Surely exporting is not a requirement for the research to count towards assessment - or are some of the examiners overseas?
Now, if someone else maliciously (or is this a strict liability law, no mens rea required?) exports it afterwards, I can't see how it's your problem (or the university's, unless the other party was also part of the university). Does the law require this type of research to be kept secret, or just not be exported? Is it an offence to recklessly or negligently reveal munitions research which might then be exported by others?