
Re: Data Protection in photos?
An image of a face - perhaps a passport photo, mugshot, or a snap of someone throwing a bottle captured by CCTV - (IMG1) is analysed by software to create a (hopefully unique) identifier. If another image of a face (IMG2) is captured by a different camera and analysed by the same software (or one using the same method of analysis), an attempt can be made to associate IMG1 and IMG2 and, if the identifiers seem the same, assert "This is the same face, and it belongs to this person, who is wanted for [terrorism/kiddiefiddling/vandalism/taxevasion/beingblackinabuiltuparea...]".
That process can be automated - it doesn't rely on human eyes or interpretation.
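To make that concrete, here's a minimal sketch of what the "identifier" step looks like in code, using the open-source face_recognition library. The file names and the 0.6 threshold are illustrative assumptions, not anyone's actual deployment:

```python
# Rough sketch of the IMG1/IMG2 matching step described above.
# File names and the 0.6 tolerance are made up for illustration.
import face_recognition

img1 = face_recognition.load_image_file("cctv_frame.jpg")      # IMG1
img2 = face_recognition.load_image_file("passport_photo.jpg")  # IMG2

# Each face is reduced to a 128-number "identifier" (an embedding).
enc1 = face_recognition.face_encodings(img1)
enc2 = face_recognition.face_encodings(img2)

if enc1 and enc2:
    # Distance between the two embeddings; below the tolerance, the
    # software asserts "same face" - with no human in the loop.
    distance = face_recognition.face_distance([enc1[0]], enc2[0])[0]
    print("Same person?", distance < 0.6, f"(distance={distance:.3f})")
```

Note that the "same person" verdict is entirely down to where that tolerance is set: loosen it and you get more false matches, tighten it and you miss genuine ones.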
Unfortunately, in its current state this technology is not accurate. It's wrong more often than right. It also seems to perform worse on faces from certain racial groups.
So when a nice police officer or three rushes up to you at a football match, and starts asking questions about crimes you know nothing about, or expressing suspicion, or wanting to search you - that may be the result of automated decision-making based on your personal information - which is at the heart of data protection. Isn't it?
If it starts to happen again and again - not only could it get tiresome very quickly, but it might be because "they all look the same" to the AI based on its training data.