UK lawmakers say live facial recognition lacks a legal basis

A UK committee in its upper house has written to Home Secretary James Cleverly to warn of the lack of legal basis for the use of live facial recognition by police. The House of Lords' Justice and Home Affairs Committee told the Conservative member of parliament that Live Facial Recognition technology (LFR) — which compares a …

  1. Anonymous Coward
    Anonymous Coward

    "Does the use of LFR have a basis in law?"

    Maybe not yet, but that's easy to fix without making waves or incurring costs. After all, Rwanda will soon legally be a safe country for refugees regardless of any facts.

    1. codejunky Silver badge

      Re: "Does the use of LFR have a basis in law?"

      @AC

      "After all, Rwanda will soon legally be a safe country for refugees regardless of any facts."

      Economic migrants

      1. codejunky Silver badge

        Re: "Does the use of LFR have a basis in law?"

        FYI, not a downvoter. I don't see anything particularly wrong in your comment, bar that one fix.

      2. Anonymous Coward
        Anonymous Coward

        Re: "Does the use of LFR have a basis in law?"

        Expats.

        1. codejunky Silver badge

          Re: "Does the use of LFR have a basis in law?"

          @AC

          "Expats."

          They could consider themselves vacationing, but they are still criminals for breaking the law.

          1. Anonymous Coward
            Anonymous Coward

            Re: "Does the use of LFR have a basis in law?"

            Expats are notorious for breaking the law. Especially tax, car import and residency registration laws. The crims.

    2. gandalfcn Silver badge

      Re: "Does the use of LFR have a basis in law?"

      Fascists love fascisting.

  2. BebopWeBop
    Facepalm

    I can be confident

    That any recommendations will be ignored by police forces in England and the government. Other findings of unlawful behaviour in this area have been.

    1. cyberdemon Silver badge
      Holmes

      Ignored not just by police forces

      Private companies have been building (and sharing) LFR databases for years. And in those cases it's not just those on predefined watch lists who are tracked, it's anyone and everyone.

      1. John Brown (no body) Silver badge

        Re: Ignored not just by police forces

        And schools. Don't forget schools. I first saw this in action a good few years ago. The CCTV system was live tracking, drawing green and red boxes over detected faces. I didn't ask, but my assumption was that green boxes were faces it had either properly identified as faces, or possibly matched to a specific person, while red was for anything it either wasn't sure was a face or hadn't identified. I'm not sure how advanced it was, but I couldn't think of an obvious reason for it to box faces in both red and green unless it was identifying specific people, or looking for people not in its database.
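
        For illustration only, here's a minimal sketch of that speculated green/red logic, using the open-source face_recognition and OpenCV libraries. The reference image name, the webcam feed, and the matching rule are all my own assumptions, not a description of whatever the school was actually running:

        # Sketch: green box = face matched to a small "known faces" list, red box = everyone else.
        # Requires the face_recognition and opencv-python packages; file names are hypothetical.
        import cv2
        import face_recognition

        # Hypothetical reference photo of one known person.
        known_image = face_recognition.load_image_file("known_person.jpg")
        known_encoding = face_recognition.face_encodings(known_image)[0]

        video = cv2.VideoCapture(0)  # default webcam standing in for the "CCTV" feed
        while True:
            ok, frame = video.read()
            if not ok:
                break
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            locations = face_recognition.face_locations(rgb)
            encodings = face_recognition.face_encodings(rgb, locations)
            for (top, right, bottom, left), encoding in zip(locations, encodings):
                matched = face_recognition.compare_faces([known_encoding], encoding)[0]
                colour = (0, 255, 0) if matched else (0, 0, 255)  # BGR: green if matched, red if not
                cv2.rectangle(frame, (left, top), (right, bottom), colour, 2)
            cv2.imshow("LFR sketch", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
        video.release()
        cv2.destroyAllWindows()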

  3. Jimmy2Cows Silver badge
    Stop

    Ban it 'til it works

    LFR use anywhere should be banned until it is 6-sigma accurate, doesn't have any biases (racial, gender, etc.), and is not affected by variations in lighting and weather. Can't do that? Get lost.

    Cops and spooks don't care about any of this, however. They'll happily ignore any such mandates and carry on as usual.

    1. elsergiovolador Silver badge

      Re: Ban it 'til it works

      The rich can ensure their faces are excluded from any training data. Given that they typically eat well, have good routines and an order of magnitude lower levels of stress, their faces differ significantly from the rest. You can easily tell when looking at someone that they are wealthy (with some exceptions).

      So it is a given that such a system will have a bias in that area.

      Rich people pay good money so that none of their photos are ever available on the internet and if something pops up their agents are on it in an instant.

      1. Graham Cobb Silver badge

        Re: Ban it 'til it works

        The big problem isn't that your pic appears in the data. It is that the pic of someone who is not you but looks a bit like you is in the data.

        In that case, however innocent (or rich) you are, you will be stopped at every street corner. Forever.

        And one day you will lose the dice throw and will be imprisoned for someone else's crime. Unless you are in the USA in which case you will just be shot by mistake, or while resisting arrest by trying to explain you are the wrong guy.

        1. elsergiovolador Silver badge

          Re: Ban it 'til it works

          That is my point. If it was not trained on the rich, then a rich person's photo given to the network to find in the crowd will more likely match a face it has been trained on, or give some unpredictable result. It will have a lower chance of finding the rich person.

          So someone will definitely be stopped at every street corner.

          The AI will be blind to rich people.

  4. Joe-Thunks

    As if it matters

    Britain is an exceptionally poor example of a democracy. It has always been so. The great unwashed, ruled by the people who always seem to be in power.

  5. elsergiovolador Silver badge

    Lawless

    You need to remember that laws in the UK are for little people.

    Discussion is seemingly designed to distract from the main point:

    Who won the tender, if there was one, and why has this even been commissioned without any legal basis?

    Given that the country is run on brown envelopes, we probably know the answer.

  6. Andy The Hat Silver badge

    What are people worried about?

    '... "a resilient and highly accurate system" to search all databases of images the police can access.'

    The great bastion of democracy, China, has been doing this successfully for ages with no issues so why shouldn't we? I'm sure Hikvision have a few camera systems they can sell off cheaply.

    1. elsergiovolador Silver badge

      Re: What are people worried about?

      reporter> How do you find all the mass surveillance?

      Chinese person> I can't complain, actually!

  7. tyrfing

    The problems of comparing a live feed against a watch list have been discussed for a long time. Schneier has been talking about it for decades.

    If the list is small compared to the population in the feed, it's pretty easy to show that false positives will outnumber true positives - unless the accuracy rate is much better than anything we know how to do.
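
    A quick back-of-the-envelope sketch of that base-rate arithmetic (the numbers below are illustrative assumptions, not figures from any real deployment):

        # Illustrative base-rate arithmetic for a live facial recognition checkpoint.
        # Assumed numbers: 50,000 faces scanned per day, 10 watch-listed people actually
        # in the crowd, and a system that is 99% accurate in both directions.
        scanned_per_day = 50_000
        listed_people_present = 10           # genuine matches walking past
        true_positive_rate = 0.99            # chance a listed face is flagged
        false_positive_rate = 0.01           # chance an unlisted face is flagged

        true_positives = listed_people_present * true_positive_rate
        false_positives = (scanned_per_day - listed_people_present) * false_positive_rate

        print(f"True positives per day:  {true_positives:.0f}")    # ~10
        print(f"False positives per day: {false_positives:.0f}")   # ~500
        # Even at 99% accuracy, wrong flags outnumber genuine ones by roughly 50 to 1.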

    The only thing it would be useful for is as a first-pass scan. It's basically an automated version of a policeman comparing everyone who passes a checkpoint against a set of photos, and flagging anyone who looks like one of the photos.

    If we had humans doing this, we would be somewhat more reasonable about its value, since we know what humans are like.

    But "technology" (particularly "AI" these days) makes things seem artificially reliable.

    Combine it with jobsworth bureaucrats, and yeah, there probably will be people bunged up by mistake.

  8. Tubz Silver badge

    Don't worry, there is no chance of LFR wrongly identifying a person who ends up in prison for half their life and then is refused a pardon or compensation.

    1. elsergiovolador Silver badge

      If you start writing letters to ITV and convince them to create a TV drama about your ordeal, then maybe you'll at least get someone to resign over it (just make sure it airs during an election year).

    2. John Brown (no body) Silver badge

      "Don't worry, there is no chance of LFR wrongly identifying a person who ends up in prison for half there lives and then is refused a pardon or compensation."

      ...and being charged for "board and lodgings"

  9. ldo

    Or They Could Just Pull Another Rwanda

    Sunak’s government could just pass a law declaring that facial recognition is in fact very reliable and works fine.

    1. John Brown (no body) Silver badge

      Re: Or They Could Just Pull Another Rwanda

      ...might as well go the whole hog and make Pi = 3 to make those calculations easier when building the new Whitehall revolving doors.

      1. Dr_N

        Re: Or They Could Just Pull Another Rwanda

        Isn't 22/7 already Pi-Lite?

  10. martinusher Silver badge

    Just as good as a human --- which isn't saying anything

    One of the takeaways from the Horizon scandal is that, for some bizarre reason, "the law" is inclined to believe computers just because they're computers. While this might be true for trivial situations -- simple adding up, like a calculator, for example -- there's ample evidence and precedent to know that a computer's opinion is, well, just an opinion, and without corroboration or other proof of correctness it's just as likely to get things wrong as a human (or, more accurately, its human handler/programmer).

    So fretting about facial recognition is pointless. It's just the machine expressing an opinion -- it thinks, to some statistical accuracy, that the face it sees belongs to a particular person. Without proof it's just an opinion. What needs pushback is the idea that just because it's a computer it's flawless and has to be believed without question. That's wrong in so many ways -- especially with complex, fuzzy-logic tasks like this -- that we shouldn't even be discussing it.

  11. cantankerous swineherd

    this is bollocks, anpr doesn't have a legal basis either, it's not prohibited so it's allowed. no frenchies here, by god.

    don't get me wrong, I dislike anpr and faecal recognition alike, but the law isn't your friend on this issue.

  12. MrGreen

    Face Coverings

    How does it work with face coverings?

    Most CCTV of ferals shows they have their faces covered.

  13. 0laf Silver badge
    Big Brother

    Old hat

    Facial recognition is bad enough but it's biomechanical tracking that worries me more.

    Facial recognition can be foiled by a hat or a scarf; you can't do that if you are being tracked by how you walk.

    It's already in use in lots of places.

    The paranoid cynic in me wonders if the facial recognition drama is being talked up to slip biomechanical tracking in through the side door.

  14. Charles Ghose

    Live Facial Recognition Ethics

    In my opinion, with the number of prison spaces in the United Kingdom severely limited and the British prison system under pressure due to overcrowding, using live facial recognition to capture even more criminals doesn't make sense. How accurate is this technology? Witnesses sometimes make mistakes in police line-ups, and there have been cases where facial recognition has erred or struggled when presented with individuals from minority communities. Furthermore, could this live facial recognition technology be implemented across the vast United Kingdom CCTV network to track known offenders living freely in the community?

    Live Facial Recognition is based on a list of criminals who have already been caught, not on identifying new ones, which, in my opinion, contradicts efforts to rehabilitate these individuals. Should these ex-criminals always be marked as criminals for the rest of their lives? Will their images be on live facial recognition databases until the age of 100, similar to the lifespan of a criminal record in the United Kingdom?

    Considering the strain on the prison infrastructure and the potential inaccuracies and ethical implications of live facial recognition, it's clear that alternative solutions should be explored. It's crucial to prioritize rehabilitation and reintegration of individuals into society while ensuring that law enforcement methods are accurate, fair, and respectful of individuals' rights.

    1. Killfalcon

      Re: Live Facial Recognition Ethics

      That's a good point on how long it'll be stored: have you ever looked at old photos and been surprised by how much a random great aunt looked like your sister does now?

      Might end up taking "the sins of the father" to some risky new levels.

    2. John Brown (no body) Silver badge

      Re: Live Facial Recognition Ethics

      "Live Facial Recognition is based on a list of criminals who have already been caught, not on identifying new ones, which, in my opinion, contradicts efforts to rehabilitate these individuals. Should these ex-criminals always be marked as criminals for the rest of their lives? Will their images be on live facial recognition databases until the age of 100, similar to the lifespan of a criminal record in the United Kingdom?"

      Good point, and why we need some sort of legal basis, oversight regulator and proper, standardised training. At the very least, faces of "known criminals" should be expunged from the database after all convictions are "spent", ie they have stayed on the straight and narrow (or at least not been caught since!)
