Police face-recog tech use in Welsh capital of Cardiff was unlawful – Court of Appeal

In a shock ruling today, the UK Court of Appeal has declared that South Wales Police broke the law with an indiscriminate deployment of automated facial-recognition technology in Cardiff city centre. "The Respondent's use of Live Automated Facial Recognition technology on 21 December 2017 and 27 March 2018 and on an ongoing …

  1. Greybearded old scrote Silver badge

    Hold on

They had PC Plod giving evidence on the performance of highly technical software? For those not familiar with UK ranking, that's out of training, but not yet made sergeant. Only a trainee is lower.

How about getting a statistician in for the job?

    1. Ben Tasker

      Re: Hold on

      Presumably anyone with a bit more experience under their belt refused to put their name to such claims, recognising that they didn't really have the means to back up that data.

      1. RayG

        Re: Hold on

        Certain malodorous substances are famous for travelling downwards. I have to have a little sympathy for anyone who gets shoved into the limelight because their superiors won't take the responsibilities they ought to.

    2. Mr Dogshit

      Re: Hold on

      Your assertion that someone at the rank of constable is somehow an inexperienced noob is entirely misguided.

      A police officer may spend their entire career at the rank of constable - it doesn't mean they're thick or unambitious. There is a very wide range of roles available at that rank. Perhaps an individual doesn't want to become sergeant. Perhaps they’re happy doing what they’re doing or don’t want the additional responsibility. Ranks in the UK police cannot be compared to the British Army. PC is not equivalent to Lance Corporal.

  2. gnasher729 Silver badge

Interesting, I thought it was well known that image-based facial recognition worked less well for people of colour, due to darker faces and less contrast. But shouldn't that mean they are recognised less often than white people? So how does this facial recognition work? Does it "recognise" them more often and get it wrong more often?

    1. Charlie Clark Silver badge

It's less accurate, which means both false positives and false negatives: some innocents will be fingered for things they didn't do and some baddies will get away. Note that it's not just the training data at fault, though that's the main issue; cameras may also have to be optimised to make sure images have sufficient contrast.
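The reason false positives dominate in a deployment like this is base-rate arithmetic. A quick sketch with made-up numbers (none of these are figures from the SWP trials):

```python
# Illustrative only: assumed numbers, not from the SWP deployments.
# Shows why a system that sounds "99% accurate" still drowns operators
# in false alerts when the watchlist covers a tiny slice of passers-by.

def expected_alerts(crowd, base_rate, tpr, fpr):
    """Expected true and false alerts when scanning a crowd.

    crowd     -- number of faces scanned
    base_rate -- fraction of the crowd actually on the watchlist
    tpr       -- true positive rate (sensitivity)
    fpr       -- false positive rate
    """
    wanted = crowd * base_rate
    innocent = crowd - wanted
    true_alerts = wanted * tpr
    false_alerts = innocent * fpr
    return true_alerts, false_alerts

# 100,000 faces scanned, 1 in 10,000 on the list,
# 99% sensitivity, 1% false-positive rate:
tp, fp = expected_alerts(100_000, 0.0001, 0.99, 0.01)
print(tp, fp)                       # ~9.9 true alerts vs ~999.9 false ones
print(round(tp / (tp + fp), 3))     # ~0.01: roughly 99 in 100 alerts are wrong
```

So even before any racial or gender skew, almost every alert shown to an operator is a mistake; skewed error rates just concentrate those mistakes on particular groups.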

      1. NoneSuch Silver badge
        Thumb Down

        "cameras may have to be optimised"

        Sir, the cameras do not need adjustment, they need to be destroyed. A false positive can result in you being detained in custody for up to two weeks without charges. All for the egregious crime of popping down to the shops. And if it happens once, it will happen repeatedly.

        Fun and games until you are the one being held behind bars.


        How long you can be held in custody

        The police can hold you for up to 24 hours before they have to charge you with a crime or release you.

        They can apply to hold you for up to 36 or 96 hours if you’re suspected of a serious crime, eg murder.

You can be held without charge for up to 14 days if you’re arrested under the Terrorism Act.

        (End Quote)


        1. Charlie Clark Silver badge

          I'm not arguing for automatic face recognition systems like this to be used, just answering a question over their fallibility.

    2. Gonzo wizard

      Well now...

The fact that they don't even know the degree of racial and/or gender bias is bad enough. Added to that, they've had insufficient signage, and essentially harassed people who put up a face covering when walking through a 'trial area', and well...

It's really not a good look, is it.

      1. Roland6 Silver badge

        Re: Well now...

        >The fact that they don't even know the degree of racial and/or gender bias is bad enough.

        However, to obtain this data they would have to retain all images captured and manually assess them...

        1. big_D Silver badge

          Re: Well now...

          Yes, that is what testing does, before it is put into public use. You know, making sure the damned thing works reliably, before putting it anywhere where it can get images of the general public and before it is actually used to "identify" people against mugshots in a live situation.

          1. Roland6 Silver badge

            Re: Well now...

            >Yes, that is what testing does, before it is put into public use. You know, making sure the damned thing works reliably

            However, showing from test data that "the damned thing works reliably" doesn't necessarily require any assessment of "racial and/or gender bias".

            Also from a project decades back, we found that the only reliable test data set was real-world data (think records on 40M+ individuals) as this contained many quirks which permitted assessment of omissions and bias. Fortunately, the client was an organisation with access to real-world data sets of that size, without that access the system would have gone live having only been tested against a few hundred records, many of which were manufactured specifically for testing...

      2. Mark192

        Re: Well now...

        With just 0.6%[1] of the population in Wales identifying as black, the correct answer to 'does it discriminate?' is probably 'In the field? We can't tell yet'.

        [1] Source: 2011 UK census
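The sample-size arithmetic behind that "can't tell yet" is easy to sketch (all numbers here are assumptions for illustration, not figures from the trials):

```python
# Back-of-the-envelope: why a small subgroup makes per-group bias hard
# to measure in the field. The 0.6% share is the census figure quoted
# above; the sample target is an assumption.

def scans_for_subgroup(subgroup_share: float, samples_wanted: int) -> int:
    """Total scans needed before a subgroup yields `samples_wanted` faces."""
    return int(samples_wanted / subgroup_share)

# Say we want 1,000 subgroup faces for a halfway-stable
# per-group false-positive estimate:
total = scans_for_subgroup(0.006, 1_000)
print(total)   # 166666 -> you'd need to scan roughly 167k people
```

At a few thousand faces per deployment, that is a lot of city-centre trials before any per-group error rate means anything.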

        1. Anonymous Coward
          Anonymous Coward

          Re: Well now...

But in Cardiff the 'black' population is much higher, with a sizeable fraction of Somali heritage (and some others) in Tiger Bay.

  3. Greybearded old scrote Silver badge
    Big Brother


    Not convinced it's a good idea to focus on the race angle so much. Moral fashions come and go. Auntie Beeb can't even report the use of that word this year, but in a few more that taboo will be replaced by another one.

    Or PC Plod will give evidence that the new version is accurate for all racial groups. (Whatever the reality.)

    We really need this case to succeed on the broadest argument possible, to protect the whole society from building the enabling technology for a repressive state.

    1. RM Myers
      Thumb Up

      Re: Hmm

      Exactly. What if the technology gets better, or at least the accuracy/inaccuracy is the same regardless of race or sex? Do we then say this technology can be used whenever and wherever the government wants? Using the race/sex issue as your main argument could come back to bite you later. The real issue is whether this type of intrusive surveillance is consistent with the privacy expectations of a democratic society.

    2. Blazde Silver badge

      Re: Hmm

      Most of the electorate aren't very absolutist when it comes to privacy. They want a repressive state for terrorists and other serious criminals, just not for the rest of us. So while the race/gender issue will probably narrow as the technology improves, misidentification in a general sense should still be a powerful part of the argument I think. In particular the idea the technology will inevitably improve to perfection needs to be defeated by those of us who understand the algorithms.

      In the extreme misidentification leads to Jean Charles de Menezes type scenarios, and it's going to happen constantly if the technology is widely deployed.

      1. hoola Silver badge

        Re: Hmm

Simple solution, do away with Facial Recognition altogether and make it mandatory to have a QR code, barcode or RFID tag somewhere that is easily read.

I jest, but before long we are going to be in a situation where some form of easily read ID card is required, if people continue to take the piss with contact details for Track & Trace.

I don't like these automated "recognition" systems being used as a broad "let's try to catch people" tool. ANPR is bad enough, but it does actually serve a useful purpose, particularly with the reductions in police numbers.

Once this becomes acceptable you cannot undo it. It needs to be sorted out so that facial recognition cannot be used in this generic way by the police or private companies. How you actually enforce that I don't know, particularly in the private sector. Maybe a solution for the police is to require something like a warrant to run existing footage through the system once a crime has been committed, or to set things up in advance if it is surveillance.

    3. 96percentchimp

      Re: Hmm

      Auntie Beeb can't even report the use of that word this year, but in a few more that taboo will be replaced by another one.

      Auntie Beeb CAN report the use of the word. They can't use the word, repeatedly, in the report. FTFY

  4. Kevin Johnston

    Chief Constable says the judgement...

    "points to a limited number of policy areas that require this attention"

That's fair - if there are 200 policies and you do not know which are affected, you cannot say the number is limited. If there are 200 policies and ALL are affected, then it is just one block, containing 100% of the policies.

  5. Warm Braw

    Not convinced it's a good idea to focus on the race angle


    to protect the whole society from building the enabling technology for a repressive state

The whole point of repressive states is to repress some people more than others. If the technology is effectively choosing who gets targeted, that's actually a different problem from state actors making the choice (though it may stem from the same cause and lead to the same outcome) and needs to be fixed by different people.

    1. Greybearded old scrote Silver badge

No, I'm talking about the sort of state that watches everybody. (See that Big Brother icon?) East Germany was a good example; they would think they were in heaven these days.

      Protect the currently profiled groups by protecting all of us, and it still works when the sons of many fathers change target.

      1. big_D Silver badge

        And one of the reasons why Germany has very strict rules on video surveillance.

Doorbell cameras are a good example: they can't be used if they can see the street, see the pathway to the house or are put in communal areas of multi-residence housing (i.e. they can't be used if they can see the hallway outside a flat that non-visitors to the specific flat have to pass).

        The same for surveillance cameras, they cannot record the pavement or road or public areas of the property (E.g. driveway), but you can use them in the back garden, where you would not expect members of the general public to be passing.

        Drones are similarly restricted. They can't be flown over housing or business areas. They can't be flown in any built-up areas really. They can be flown at model aircraft aerodromes, open fields or woods. But you cannot film anyone without their permission.

        If you want to use a drone in a built-up area (E.g. promotional film for a company, documentary or a feature film), you can apply for a filming permit, but you still need a release from anyone captured on film, before you can upload it to the Internet or otherwise distribute it.

        1. big_D Silver badge

          As an aside, one of our neighbours is a pest. She spends all day hanging out of the window shouting abuse at passers-by and the people who live in the street.

          One of her neighbours put up a "camera" which pointed at her window. The police came the next day to have a word with them. They pointed out it was just a dummy, not a real camera, but the police still said they had to either change the direction it faced, so that it was pointing at their property or they would have to take it down.

        2. john 103

However, it's worth pointing out that the German forests are riddled with game cameras put up by hunters. (Source: Peter Wohlleben, and he should know.)

There was a famous case of a Bavarian politician caught in flagrante ...

That's despite these game cameras being illegal under German law!

  6. mark l 2 Silver badge

Although the win for Liberty in this case is good for everyone's privacy, I fear that if it requires legislation to enable police forces to deploy facial recognition in public, this current majority Tory government will push through a new law to make it legal, on the usual grounds of protecting the public against terrorists and serious criminals.

  7. Anonymous Coward
    Anonymous Coward

    Big Boyo

    is watching you.

  8. IGotOut Silver badge

    weasel words

    "a full review of the legislative landscape that governs the use of overt surveillance."


Putting up a sign saying "We are recording" after they have already got your image, or "We are going to be recording at xyz venue", is not overt. It restricts your freedom to move around without being recorded, with no opt-out.

    1. Roland6 Silver badge

      Re: weasel words

      >Putting a sign saying "We are recording"

      Well there are two issues here.

Firstly, the use of CCTV, which is typically recorded, and secondly facial recognition image processing, which isn't - unless you are wanting to train the system or gain metrics on its accuracy.

It is thus debatable whether the facial recognition system is actually 'recording' if the video feed is being processed in real-time.

We should also remember that to arrest someone, the results of the facial recognition processing have to be presented to a human. The question arises as to when the human compares the output to the actual target list and so is able to rule out many false positives before someone is actually stopped.

      1. 96percentchimp

        Re: weasel words

        "We should also remember that to arrest someone, the results of the facial recognition processing have to be presented to a human, the question arises as to when the human compares the output to the actual target list and so is able to rule out many false positives before someone is actually stopped."

Unfortunately, humans are notorious for accepting the infallible authority of machines - AKA the "computer says no" fallacy - even when their decisions are obviously wrong.

        1. genghis_uk

          Re: weasel words

          This is my ongoing concern about the Police being more and more reliant on technology to the detriment of real policing.

          In a free society there is the assumption of innocence unless proven guilty (note: not 'until' as that assumes the person will be found guilty). However, we have a situation where people put undue trust in the machines so if the computer says this person matches, then the assumption is that they are guilty.

Now the onus shifts to the innocent, who have to prove it because a computer tagged them. This is not only an imposition on someone who is entirely innocent, but it goes against a basic ethos of law - [that] it is better that ten guilty persons escape than that one innocent suffer (William Blackstone). Unfortunately, a succession of authoritarian governments have pretty much inverted that one!

  9. Anonymous Coward
    Anonymous Coward

    "Porter has been something of a thorn in the side of government figures through his insistence that the public sector obeys its own laws and regulations in a transparent and accountable manner when deploying CCTV and related technologies."

    Well it's no surprise that this fascist government are squeezing him out.

  10. Twanky

    Great timing!

The courts may be seeking to apply the law on use of face-recognition technology - but the timing sucks. What a wonderful time to be working in this area, when the world just became more interested in how many layers of mask one should wear.

    Citizen! Over here! Straighten your mask. We could not read the QR code on it.
