First, PGP public keys are on the server because they were placed there, usually by the key owners. But they can be uploaded by anyone who possesses them: co-workers, or anyone with whom you have shared the public key.
Cryptographic key servers are in "direct violation" of the EU's General Data Protection Regulation, a software developer has claimed. Michael Drahony (AKA yakamok) has written a program (on GitHub) designed to highlight the potential compliance issues posed by use of PGP as an email encryption utility. "Currently you cannot …
There’s some rather shonky logic in the commentary here, IMHO.
First, I'd have thought that the starting point is to work out who is the controller of the processing which takes place on each key server, on what basis the data are being processed, what rights apply to the data subjects, whether any exceptions apply, whether any exemptions apply, and so on. Without this, it's all a bit nebulous.
Similarly, the reference to "implied consent" sounds like a red herring, since consent requires a "clear affirmative action" by the data subject — it is either "consent" or it is not — and, in any case, (a) consent can be withdrawn at any time (Article 7(3) GDPR), and (b) the right to erasure, under Article 17(1)(b) expressly applies to processing done on the basis of consent, where that consent is withdrawn. So, even if "implied consent" is a thing, you can't argue "implied consent" as the basis of continued processing, in the face of an objection / request for withdrawal of consent.
Lastly, I’m not sure where the concept that “the right of erasure only applies where it is practical” comes from. The right may not apply where the request is manifestly unfounded or excessive (Article 12(5)), but that’s hardly the same as whether the deletion is “practical”.
I suspect we simply have here a situation in which those designing and operating the key servers did so — perhaps entirely reasonably, at the time — without considering this kind of issue.
Took the words out of my mouth. Any time someone says "implied consent", you can immediately disregard everything else they've said because there's no such thing. The same applies for "practicality". He's basically said "Oooh, you know what, I don't think I'll obey the law today because it sounds a bit tricky". That ain't how it works.
And whilst I am no crypto expert, it should not be difficult to enforce expiry and deletion for publicly available keys across all key servers. Yes, keys can be revoked, but are they deleted from all key servers? (Seriously, I dunno.) I could revoke one to find out, or google it; I will after I post.
Once an expiry date is reached one chooses to renew or revoke/delete public key.
Just a thought.. Always up for an education though :-)
They're hardly immutable. Upload a new version of your key and it will be merged with the older version. There has been discussion on the sks-devel list about mechanisms to permit removal. SKS (the most popular key server software) could be modified so that if the user who wants the data removed has control of the key, they could upload some sort of signed, structured document revoking permission, which could be propagated in place of the key, with the merger of the two conducted in such a way that the key is replaced.
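To make that merge-with-tombstone idea concrete, here is a deliberately simplified model of the semantics being proposed — not SKS code, and `verify_removal()` is a hypothetical stand-in for real OpenPGP signature verification:

```python
# Simplified model: a key record accumulates packets on merge, but a
# verified removal tombstone replaces the key material and continues to
# win against later re-uploads propagated from other servers.

def verify_removal(tombstone, fingerprint):
    # Hypothetical stand-in: a real server would verify an OpenPGP
    # self-signature on the removal document against the key itself.
    return tombstone.get("signed_by") == fingerprint

class KeyStore:
    def __init__(self):
        self.records = {}  # fingerprint -> {"packets": set, "removed": bool}

    def merge(self, fingerprint, upload):
        rec = self.records.setdefault(
            fingerprint, {"packets": set(), "removed": False})
        if upload.get("removal"):
            if verify_removal(upload, fingerprint):
                rec["removed"] = True
                rec["packets"] = set()   # tombstone replaces key material
            return
        if rec["removed"]:
            return                       # tombstone wins against re-uploads
        rec["packets"] |= set(upload["packets"])

    def lookup(self, fingerprint):
        rec = self.records.get(fingerprint)
        if rec is None or rec["removed"]:
            return None
        return rec["packets"]
```

The point of the "tombstone wins" rule is that gossip between servers can no longer resurrect a removed key, which is exactly the failure mode of naive deletion.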
That doesn't solve the problem if the user doesn't control the key in the first place. There's nothing stopping me from uploading a key which I generated with someone else's details attached.
... the fact that the key servers contain your e-mail address would be a problem too.
There's a difference between providing your personal information to another party as part of a transaction and publishing the same information. If your doctor leaks your medical history, trouble will ensue. If you write a book about your battle with haemorrhoids you can't change your mind later and demand every copy be pulped or deleted from every e-reader.
The clue in "public key" is the word "public".
I think the crux of the matter here are the questions, "who is the data controller?", and "who is publishing the key?"
If you are publishing your own information for all to see, either there is no data controller, or you are the data controller. Expecting someone else to be responsible for removing the data would be akin to the example above of publishing a book containing personal details. Another analogy would be spray-painting your name and email address on a wall and then expecting the owner of the wall to clean it off for you.
If someone publishes a key containing your information on your behalf, that is a different matter. I think there is an argument to be made that if this was done prior to GDPR, and it is not possible to revoke the key, then tough. If someone does it now GDPR is in force, there is the argument that the person doing so is in contravention of GDPR and is liable for fines and damages.
"If you write a book about your battle with haemorrhoids you can't change your mind later and demand every copy be pulped or deleted from every e-reader."
Not a good analogy. If you use a service that requires some aspect of your personal information, such as an email or IP address, then the GDPR says it must be deleted on request. There are a few exceptions, but in general that's the requirement.
So my email user name is associated with my public key. Well, my real name is associated with my phone number, and that's not something that is going to change, right?
Let's not get distracted here. There's enough to do to secure our private lives already without going off on wild goose chases like this.
If I contact my bank and ask them to delete all references to the loan agreement I signed (and haven't yet repaid), they are not going to oblige, and the GDPR doesn't require them to. If I paid it off more than 6 years ago, or 12 years ago in the case of a mortgage, then that would be a different matter.
The key server exists to prove that documents I signed do bear my valid signature, so presumably that has to be retained for the same reason. I could for example have signed the aforementioned loan agreement using such a signature.
Or, it can be:
For example, at work, I have a personal email address that reaches just me. Then there are various group email addresses that reach me plus other people: for example, there is one for all the women in the company (and another for all the men), then one for everyone in my department, then one for all the managers, and ones for various projects which reach all the people across different departments who work on that project.
"I've seen it used in my own work history in conjunction with messages about the male or female toilets in the building..."
And the "let's discriminate in favour of women because our market rate is less and fewer of us have management positions" type groups.
"Just curious what kind of company needs an email group for all the women and for all the men? I've never heard of such a thing before."
One such possible use is to direct appropriate messages about sexual harassment in the workplace to the appropriate audience.
> consent requires a "clear affirmative action" by the data subject
Uploading your key to a keyserver and requesting it to be published is pretty clear affirmative action.
The problem is when someone else uploads your key without your permission - or worse, a different key which claims to be yours.
That is why I don't use keyservers: anyone can upload any random key with any random label. There's no assurance it's the right one, unless (a) I got the fingerprint from a trusted source (in which case I could have got the key from that source too); or (b) the key happens to be signed by someone in my web-of-trust, which is pretty small.
Therefore, in general I get keys directly from whoever I'm corresponding with: it's much easier to make a value judgement over whether it's the right key or not.
Back to GDPR: there is an assumption baked into PGP that public keys are, well, public. Simple answer: get rid of keyservers. These days you can publish OpenPGP keys securely in DNS/DANE instead.
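For illustration, publishing a key via DANE means serving an OPENPGPKEY record (RFC 7929) from your own DNS zone; the owner label is derived from a truncated SHA-256 hash of the address's local part. A hypothetical fragment for alice@example.com, with placeholders rather than real values:

```
; Illustrative only: RFC 7929 OPENPGPKEY record for alice@example.com.
; <hash> is the SHA-256 hash of the local part "alice", truncated to
; 28 octets and hex-encoded; <base64-key> is the binary OpenPGP key.
<hash>._openpgpkey.example.com. IN OPENPGPKEY <base64-key>

; Clients can then fetch the key with, e.g.:
;   gpg --auto-key-locate clear,dane --locate-keys alice@example.com
```

Since you control the zone, removing the key is just removing the record — no third-party key server holds your data at all.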
One other thing: can anyone give me a good reason why a keyserver should *not* remove a key on request, if the request is signed by that key?
In which case, see standard process for someone who releases something into the public domain without your consent; liability lies with them, not the key server. Once in the public domain, there's not a lot that can be done.
I could have a local cache of your key recorded from the key server, which I did on the understanding you'd consented to typical use of the pgp system. Is the onus on me to monitor the key server for status changes?
"Uploading your key to a keyserver and requesting it to be published is pretty clear affirmative action.
The problem is when someone else uploads your key without your permission - or worse, a different key which claims to be yours."
There's another problem: as per the GDPR, consent can be removed later on.
"One other thing: can anyone give me a good reason why a keyserver should *not* remove a key on request, if the request is signed by that key?"
I really agree with that, and it definitely sounds like something that could be automated easily.
There's the murkier case of when the private key is lost, but that should be much rarer, at the very least.
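A minimal sketch of that automation, using an HMAC as a deliberately simplified stand-in for a real OpenPGP self-signature (a real server would verify the request against the stored public key with a library such as GnuPG; the control flow is the point here):

```python
import hmac
import hashlib

# Toy key server: fingerprint -> key material plus a verification secret.
# The HMAC below only models "this request was signed by the key being
# deleted"; it is not how OpenPGP signatures actually work.
KEYS = {}

def upload(fingerprint, key_material, secret):
    KEYS[fingerprint] = {"key": key_material, "secret": secret}

def request_deletion(fingerprint, signature):
    """Delete the key only if the request is 'signed' by that key."""
    rec = KEYS.get(fingerprint)
    if rec is None:
        return False
    expected = hmac.new(rec["secret"], b"DELETE " + fingerprint.encode(),
                        hashlib.sha256).hexdigest()
    if hmac.compare_digest(expected, signature):
        del KEYS[fingerprint]
        return True
    return False
```

Anyone without the private key cannot produce a valid deletion request, so the server can honour these automatically without a human in the loop.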
It would be easy enough for the person running a key server to implement removal of an e-mail address (and all data linked to that e-mail address) by anyone who can read e-mail sent to that address: enter your e-mail address here, then click on the link in the message that you will receive shortly. There's a risk that someone could get the data deleted by impersonating the owner of the e-mail address, but how often might that happen and how bad would it be when it does happen?
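That double-opt-in removal flow can be sketched in a few lines; everything here (the store, the URL, the names) is hypothetical, and a real server would email the link rather than return it:

```python
import secrets

PENDING = {}                                      # token -> email awaiting confirmation
DIRECTORY = {"alice@example.com": "pubkey-data"}  # hypothetical key store

def request_removal(email):
    """Step 1: generate a one-time token and 'send' a confirmation link."""
    token = secrets.token_urlsafe(16)
    PENDING[token] = email
    # In a real server this link would be emailed to the address.
    return f"https://keyserver.example/confirm?token={token}"

def confirm_removal(token):
    """Step 2: clicking the link proves control of the mailbox."""
    email = PENDING.pop(token, None)
    if email is None:
        return False            # unknown or already-used token
    DIRECTORY.pop(email, None)  # delete the key and all linked data
    return True
```

The single-use token means only someone who can read mail at that address can trigger deletion, which is the impersonation risk weighed above.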
Is there also a problem of servers propagating data to each other so that data can never be completely removed? There are other reasons besides GDPR why systems should not operate in that way. You've also got defamation, copyright, child abuse images, official secrets and various other things to worry about.
Bearing in mind it's been 18 years (holy sh!t I'm getting old!) since I last used PGP in anger, way back then some keyservers would allow you to delete your key, but it was an experimental feature, not supported by the main ones (MIT?) and there was a risk that replication with another server would result in it coming in again. Thus turning it into a one-sided whack-a-mole contest.
As another poster has said, part of the point of PGP is non-repudiation. If I can erase all record of my key being used to sign something, I can then deny I signed it, which means what's the point of signatures?
In practical terms, what's the percentage of signed email these days? I think I've had maybe 2 signed emails in the last 10 years and they were S/MIME. It's almost as if people don't care about privacy at all.
> As another poster has said, part of the point of PGP is non-repudiation
The non-repudiation aspect of PGP does not depend on the existence of keyservers.
The signature itself is inherently bound to a public key.
In order to verify the signature, then you need a trusted copy of the correct public key. Some random public key you find on a keyserver is not worthy of any trust, unless it has been signed by someone you in turn trust as an introducer.
The keyID is worthless. It's something which *may* look like an E-mail address (it does not have to), but has not in any way been confirmed to correspond to that E-mail address.
Therefore: if it is important to you whether a specific signature is valid or not, then it's up to you to possess the correct public key which corresponds with the private signing key.
Unless it's signed by someone you trust, you should go to some trouble to ensure that the public key is the right one (e.g. by having a phonecall and exchanging fingerprints). Then you can sign the key yourself to remind yourself in future that you verified it.
"The non-repudiation aspect of PGP does not depend on the existence of keyservers.",
This is a really good point: since the key servers cannot provide non-repudiation, they provide nothing more than a form of basic storage. So if a subject withdraws consent to store their data, they don't affect any necessary processing, and the request should be honored.
If somebody posts personal data about John Johnson on a blockchain, say Bitcoin's, how can John Johnson get it deleted? This is also not possible.
I heard rumours that somebody who dislikes Bitcoin at some point uploaded a child pornography image to the blockchain, basically making the blockchain illegal material in most countries. Could have been a government actor.
We had slightly mixed messages on that.
On one hand you've got "required to function" granting an exception, and a clause (somewhere, can't find it right now) phrased as "…taking account of available technology and the cost of implementation, shall take reasonable steps, including technical measures".
On the other hand, you've got the ICO publishing things (I think it was clarification statement about backups) that say "technical difficulties doing it aren't an excuse".
That's not even an exception: "processing is necessary for the performance of a contract [with you]" is explicitly one of the six ways of lawfulness. Consent is only needed if you don't have a proper reason to have the data in the first place, and where no consent is needed, it cannot be withdrawn.
It would be interesting to take a look at termination clauses of keyserver T&Cs; and it might be possible that a contract you cannot get out of under any circumstances is illegal (under general consumer laws, for example), but that wouldn't be specific to GDPR.
An email address is unique to a person. "It's still personal data even if you can't find out who is the person behind it," Grooten added.
This is bollocks. I suspect the quotee is getting confused by the notion of a pseudo-anonymous key, such as recording a user's identity with a serial number. If the organisation can de-anonymise it by looking up Sir Arthur Streeb-Greebling of 23 Acacia Gardens in another table or database, then the record with the serial number is considered PII. If the ID number is entirely random and there's no way to look up the number and find the human identity it relates to, it's not PII for the purposes of GDPR.
" If the ID number is entirely random and there's no way to lookup the number and find the human identity it relates to, it's not PII for the purposes of GDPR."
This isn't entirely correct either. GDPR cares about whether an individual is identifiable, rather than whether the key can be looked up. Done properly and in the right contexts, pseudonymisation is a critical component in most data privacy designs. However we all know that even without a concrete identifier, aggregation of sufficient data attached to the otherwise pseudonymous key can make retrieval of the actual identity trivial. This is particularly so given that GDPR makes it clear that it's whether *anyone* could reasonably reconstruct the individual's identity, not just you as the data processor at that time.
There are two major forces being pushed in information regulation:
1) Unlimited transparency into online activity for government agencies.
2) Enhanced privacy for individuals as far as the general public is concerned.
The only way these two coincide is if the governments start either holding or mass-caching/backing-up much, much more of the data that's online. There are some fairly dystopian outcomes to be considered in those directions.
Hi, I am the author of the PoC program for uploading data to key servers; thanks to the people responding to the article. For those interested, I have created a document on the same GitHub as advertised in the article, highlighting the issues with key servers and the GDPR.
I would like to open-source this discussion on GitHub:
If you have researched the GDPR and have valid points about it you would like added, please make a pull request or reply here and I will add it for you.
You can also direct message me on Twitter with the sections of the GDPR you think are relevant to this case; I will credit each section added to the person who submitted it, if requested.
""Currently you cannot remove data from the key servers on request..."
This is, of course, fundamental to #GDPR. And about 99.999...% of businesses are currently in non-compliance.
Basically there should be a user driven process that allows them to select "delete all data" and a confirmation is then sent that all personally identifying data has been deleted.
It's not difficult, so what's their problem?
Seems a bit odd though to start with PGP as the whipping boy.
Implied consent is far too broad a term to make an assumption with. Yes, an assumption, because for a case like this, implied consent has never been adjudicated in the courts.
..and just because you place something into the public domain, doesn't automatically presume implied consent. Anyone who has taken a high school law class can get this question correct.
You park your car on public streets; therefore, implied consent says anyone can take it after you leave.
Sounds good, right? But obviously this isn't the case.
You'd think a professor could take 10 minutes to think this out and realize how wrong they are.
Another example... you put your garbage on the street, so now anyone can go through it and grab any old documents and other personal items you tossed out for themselves.
Again, sounds good, but in most countries... doing so is still considered stealing. Tossing something out doesn't give 'implied consent' that anyone can take it and use it.
So, once again... another so called security professional at a university who went into education because they couldn't actually perform well on the job. If you can't do, might as well teach.
People are forgetting consent is only one of several legal bases for processing personal information. I'd argue consent isn't the basis for this processing (sharing it across data controllers); legitimate interest is. Then the argument about the GDPR requiring controllers to handle requests where consent is withdrawn is entirely moot.
There's still an issue with requests for deletion, but the law doesn't grant an absolute right to have your data deleted, just the right to request that your data be deleted. The way this system works makes deleting completely impractical when you have thousands of different controllers sharing the data, so this should be easy to justify not deleting.
I think The Reg could have covered this a bit better!
What is consent? Much like cyber-law and even jurisprudence-lite AI law the only thing that is defined is the invoice for billable time to say we don't know. IANAL but you never hear the corollary. A packet, third-party proxying, bulk data collection, port-mirroring, cryptography all on very shaky ground if you have the tools and the stack expertise. If you read the full GDPR, its 99% about money and 1% about people. Silly. #SpyingisIllegalontheGeneralPublic
Implied consent is no longer a thing.
It was under the DPA 1998 (95/46/EC), but under the GDPR (Regulation (EU) 2016/679), which supersedes it, it is not.
Article 4(11) states “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.
The right to erasure only applies where consent is the legal basis for processing.
In this case it could be argued that the basis is Art 6(1)(b): "processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract",
where the contract is between the submitter and the key server. But it does raise the issue where someone does not have the right to submit the information.
IANAL, and I think this needs to be settled by them, and some case law built up. Anyone fancy being Max Schrems for this one?
But it's enrichment data that can be used to identify someone. firstname.lastname@example.org may not be a direct identifier, nor is the IP address of smallcompany.com, but if you reverse-look-up the IP to the small company, then ring and ask to speak to the sales manager, you have just personally identified someone from two pieces of enrichment data. And this happens with most pre-GDPR lead management systems.
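The join described above can be shown in a toy example — neither dataset alone names an individual, but combining them does (all data here is made up):

```python
# Two pieces of "enrichment" data: a reverse-IP lookup table and a
# company org chart. Each is pseudonymous on its own; joined, they
# identify a person.
ip_to_company = {"203.0.113.7": "Small Company Ltd"}
company_roles = {("Small Company Ltd", "sales manager"): "Jo Bloggs"}

def identify(ip, role):
    """Join the two datasets: IP -> company -> named person in a role."""
    company = ip_to_company.get(ip)
    return company_roles.get((company, role))
```

This is the sense in which the GDPR treats such data as personal: identifiability depends on what *anyone* could reasonably combine, not on any single field.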