A British tribunal yesterday ruled US selfie-scraper Clearview AI would not have to pay a £7.5 million ($9 million) privacy fine. The tribunal held that the Information Commissioner's Office (ICO), the UK's data regulator, didn't have the authority to fine Clearview, which scours the public web to collect images upon which it …
The original GDPR, from which the UK-GDPR derives, applies to businesses wherever they are based, provided they handle data from data subjects resident in the EU.
CPRA applies to businesses wherever they are dealing with data from data subjects resident in California.
The First-tier Tribunal (Information Rights) has just confirmed that if you take someone's data, store it outside the UK, and it ends up leaking or commercialised without your consent, nothing's going to happen.
Just in time for Palantir to slurp up NHS data and do what they like with it.
You're missing a few points:
If the data is obtained in the UK but stored outside of the UK, the entity doing so has to comply with GDPR for keeping the data secure and for its usage (same as with EU data).
If the data was obtained and stored in the UK, it can only be transferred outside of the UK under the same conditions: Must be GDPR compliant.
If the data was obtained from outside the UK - then UK law doesn't apply.
For the latter, I do mean where people have volunteered their data to sites outside the UK. That seems to be what the tribunal referred to when mentioning there was a legitimate reason for non-UK organisations to hold data on UK citizens, and that this would not be protected by UK law. An example: data collected by US border control on UK citizens visiting the US. But NHS data collected in the UK is very much protected by GDPR.
The law applies because the biometric data belongs to "data subjects" resident in the UK. In the original GDPR:
Article 3, 2) This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union; or b) the monitoring of their behaviour as far as their behaviour takes place within the Union.
The UK-GDPR version reads like this:
Article 3, 2) This Regulation applies to the relevant processing of personal data of data subjects who are in the United Kingdom by a controller or processor not established in the United Kingdom where the processing activities are related to: a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the United Kingdom; or b) the monitoring of their behaviour as far as their behaviour takes place within the United Kingdom.
Similar. However the UK has this as well:
2A. In paragraph 2, “relevant processing of personal data” means processing to which this Regulation applies, other than processing described in Article 2(1)(a) or (b) or (1A).
If we look at Article 2(1), we find it clarifies what happens before and after Brexit day, and Article 2(1A) seems to clarify that public bodies using manual filing systems fall within the scope of UK-GDPR.
So clearly it doesn't matter where this data was collected, it matters where the data subjects are resident. Presumably originally to stop big tech arguing that data collection happens outside the EU.
NHS data is within UK-GDPR, but this judgement has just affirmed that if that data is processed by a company outside the UK, then the UK's jurisdiction doesn't apply. However, reading Article 3(2), it seems pretty clear that the ICO's jurisdiction does apply, and there are ways for laws to be applied abroad through international agreements; one of my posts further down mentions one way this can be done.
I don't believe your final comment is correct. As I read the opinion of the judges, if the data is being used by non-UK governments and/or for the purposes of law enforcement, it does not fall under the remit of GDPR. Only if it's trawled by a commercial company is it subject to GDPR.
Once the data has been used for "law enforcement" or "governmental" purposes it is not clear whether it's free rein time and the data can be passed to all and sundry as the database presumably belongs to the governmental organisation that legally trawled it.
The FBI, CIA, NSA and Russian, Israeli and Chinese equivalents are all smiling at the wiggly contents of an open can ...
AC: "All laws are written in ways that benefit the rich and powerful."
Not entirely true, one of Tony Blair's biggest regrets (no I'm not discussing the invasion of Iraq) was the Freedom of Information Act, which allowed us plebs, but mostly annoying journalists, to obtain government information and then have the temerity to ask questions about why things were not as 'the people' had been told.
Every now and then a piece of legislation which actually benefits 'the people' gets through. That Magna Carta* set a dangerous precedent about diluting the absolute power of the Monarchy here in the UK, but don't hold your breath for the next one.
* JOKE ALERT: "Did she die in vain?" Hancock's Half Hour, BBC radio comedy
Odd then, that France has repeatedly fined them over the same issue.
Also, Palantir could cite the same defence and sell DNA data from their NHS slurp to US law/intelligence agencies and all that's going to happen is a bunch of judges in a tribunal in deepest darkest London will just shrug.
I've just had a read of the article on techcrunch. This bit
-- “The CNIL is questioning the way in which personal data is collected by the company, i.e. without any legal basis, by sucking up publicly accessible photographs on the Internet in order to feed its tool,” the spokesperson added. --
I find interesting. If it's publicly accessible on the internet, doesn't that mean that they can be accessing them from the US without being in the EU at all? Also, whilst IANAL, doesn't the fact that they are publicly accessible mean that anyone can access them? Not sure about being allowed to process them, but the amount of junk mail (before the advent of spam) I used to get at my publicly accessible address indicates someone, somewhere, was processing somehow.
To my IANAL eyes it looks like France thinking it can get some easy money.
I find interesting. If it's publicly accessible on the internet, doesn't that mean that they can be accessing them from the US without being in the EU at all?
It doesn't matter, they're processing biometric data of people resident in France and selling it on, and not allowing subject access rights.
To my IANAL eyes it looks like France thinking it can get some easy money.
As well as Italy, Greece, the UK (before this nonsense judgement), Australia, Canada...?
France has fined Clearview, and Clearview has not paid, as France has no way to enforce the fine until Clearview either has an entity within French territory or a French entity starts working with them.
I would argue this is the difference between how the French and English feel laws should be interpreted rather than a failing of data protection laws when there is no jurisdiction.
The belief that "they're from abroad therefore we have no jurisdiction" is absolutely wrong. There are tried and tested ways to cooperate internationally, including bringing a case to the federal courts in the US and as there is the almost-similar CCPA now, there is little reason to reject a case just because the GDPR is from abroad.
Sorry for late response.
What you say is true in theory. It is unlikely to be tested in reality, as arguing for extradition would likely amount to "no laws were broken in the defendant's home country, and the defendant has never entered the country where they will be prosecuted".
The counter-examples would be attempted extortion by corrupt countries. I'm not suggesting that France is corrupt, just noting the ease with which foreign countries' laws can be ignored.
Let's see if France ever gets any money from Clearview but I expect the UK and French responses are practically identical.
"Did nobody here read the report?"
Apparently not. I'm not particularly familiar with the GDPR, but I have to say the finding is a bit disappointing. I would have assumed (probably without justification) that any law enforcement exemption would have applied in a much more focused manner - say, to suspects, witnesses, victims and other persons of interest to a specific investigation. I would not have thought that it permitted a random company to scoop up vast swathes of personal data belonging to people who have no connection to investigations, on the basis that it might be useful one day to their law enforcement clients. As a reason for processing / retaining personal data, that seems ridiculously broad and permissive.
Still, there doesn't seem to be much we can do about it for the moment, short of wearing masks at all times. It's time someone came up with a way of preventing facial recognition from working on any given image, though I'm buggered if I know how that could be done.
"are they scraping from the US, the UK or where the physical host is?"
Hmmm. I would guess that the location of the scraped material is irrelevant for these purposes. The bot, wherever it lives, is controlled by the US company - scraping doesn't happen until the company sets it running. Therefore, scraping happens in the US, for the duration of the bot's activity.
Imagine that the company created a website, hosting it in, say, Australia. The content of the site is illegal in the US, but not in Australia. Will US law enforcement bods come calling at the company? You bet they will! Of course, that could get you into complex arguments about when and where publication occurs - is it when the company uploads the content, or when someone views it? - but the company is going to be in hot water.
With respect to the GDPR matter, it's not so much the scraping that bothers me - it's public information, after all - as the subsequent retention and processing. Grabbing and storing personal information on the off-chance that it might be useful to their client one day doesn't seem proportionate to me. Presumably, their law-enforcing clients will be hoping to identify villains by searching the retained data, yet there is every chance that most of the data subjects will never commit a crime, and if they do it won't be within the jurisdiction of the client, so most of that retained data is being kept on very flimsy grounds.
>” is it when the company uploads the content, or when someone views it?”
There doesn’t seem to be a single solution.
My view would be: if the website is registered as UK-owned or domiciled, according to either its domain name (.uk) or its domain registration (a .com etc. operated by a UK entity), then accessing it is effectively the online equivalent of walking down the street and looking into someone's real-world, UK-located property.
We read it. Maybe the reason we don't entirely buy that is that Clearview AI is not law enforcement. It may sell some stuff to law enforcement, but what it is is a commercial company that uses the data it collects (illegally) for profit. That's commercial use, no matter what end the product is used for. For the same reason, I can't open a camera factory and claim that, because the police used my camera when they needed to surveil a criminal, that I'm now exempt from laws because I'm law enforcement. Every other law would agree; my camera factory wouldn't get to avoid paying tax because I'm part of the government. My camera factory would not get to ignore local regulations because I work with a higher level of government. Clearview isn't granted any special treatment by the United States government, so to be granted that from the UK is stupid.
Agree. The real issue is whether Clearview offering a service to, say, Oz law enforcement (so Clearview is both the service operator and the data owner) is the same as Oz law enforcement running the same system "on prem", wholly within their control. I suggest not, and thus the tribunal has got it wrong.
How does the firm invoke the Law Enforcement exemption? They’re not processing data as agents of foreign law enforcement agencies. They are processing data as a private company, that then (for now) offers services only to foreign (and maybe domestic) law enforcement agencies.
The database doesn’t “belong” to those foreign law enforcement agencies.
Did the ICO have especially poor lawyers on this one?