Metropolitan Police's facial recognition tech not only crap, but also of dubious legality – report

Facial recognition technology trialled by the Metropolitan Police is highly inaccurate and its deployment is likely to be found "unlawful" if challenged in court, an excoriating independent report has found. Researchers from the Human Rights, Big Data & Technology Project, based at the University of Essex Human Rights Centre, …

  1. Aladdin Sane
    Mushroom

    Repeat after me: Orwell's 1984 was a warning, not a how to guide.

    1. Sgt_Oddball Silver badge
      Big Brother

      Unfortunately....

We're swaying more towards Terry Gilliam's Brazil...

      At least we don't have to sign for arrests in triplicate yet.

      1. macjules Silver badge

        Re: Unfortunately....

All we need to do now is to appoint several hundred Metropolitan Police officers to be able to pick a random letter out of a Scrabble bag.

        Unfortunately I think that we might only get, "What do you get if you multiply eight by seven?" as a result.

      2. Danny 2 Silver badge

        Re: Unfortunately....

        Katherine Helmond died this year (star of Brazil and Soap).

Brazil was a brilliant movie that lost a lot of power when Jean Charles de Menezes was shot dead for the crime of looking Arabic, aka brown.

        When real life becomes worse than satire then satire is deflated. Accidental Death of a Brazilian.

        [Edit: Oh, and I posted this before I read the thread and saw other people name checking him. Poor guy, could have been any of us]

  2. batfink Silver badge

    Help with "Innovative Solutions"

I'm sure the Chinese Government could help the Met Police with a few "Innovative Solutions for making London safer", based on their own experience in some of their Regions.

    Whether those living in London might actually be happy with the "Solutions" is another question entirely.

    1. Yet Another Anonymous coward Silver badge

      Re: Help with "Innovative Solutions"

      You can't just go around interning people in some rebellious province based purely on their religion.

That's the trouble with China, just going around copying other countries' innovations.

    2. Anonymous Coward
      Anonymous Coward

      Re: Help with "Innovative Solutions"

I always feel safe in China. I'm a middle-aged white man with a stubbly beard;

to their facial recognition software I probably don't exist. Their models will be trained on data sets severely lacking white guys like myself.

      1. RogerT

        Re: Help with "Innovative Solutions"

        I've always said I'm happy for the Chinese to spy on me... providing they don't sell the information so that it reaches a UK Government.

        1. doublelayer Silver badge

          Re: Help with "Innovative Solutions"

          You may be happy for the Chinese to use your data, but maybe you'll change your mind when you figure out that they can use your data to help improve the technology they use to commit massive human rights abuses on someone else. Consider this (audio), for example. That's what they can use data for, and it can come here once they've perfected it and on the way used it to imprison and kill thousands and eventually millions of innocent people. Are you still fine with it?

    3. katrinab Silver badge

      Re: Help with "Innovative Solutions"

      The UK are the world leaders in CCTV deployment by a very wide margin.

      1. phuzz Silver badge

        Re: Help with "Innovative Solutions"

Assuming you count all the corner shops with a twenty-year-old black-and-white camera tucked away behind the crisps, recording onto a 30-minute VHS tape that snapped two years ago but no one has noticed yet.

        The UK government wishes they had as many working, networked, CCTV cameras as they said they did.

        1. Yet Another Anonymous coward Silver badge

          Re: Help with "Innovative Solutions"

          >The UK government wishes they had as many working, networked, CCTV cameras as they said they did.

Well, they do all break down every time there is a protest march in London.

Occasionally retroactively, if there are any accusations against the police.

          1. Fred Dibnah

            Re: Help with "Innovative Solutions"

            The cameras also stop working when there's an innocent Brazilian electrician they want to kill.

            1. Nick Kew

              Re: Help with "Innovative Solutions"

              An innocent Brazilian in a world where police didn't have facial recognition. Surely he of all people would've stood to benefit from any alternative technologies they might have had, that could've caused them to act differently (no matter *what* difference) on that day.

              Surely what matters with such cameras is what they do with the information. Isn't the most likely usage (for the foreseeable future) to alert a human to look at such-and-such?

              1. BristolBachelor Gold badge
                Meh

                Re: Help with "Innovative Solutions"

                "Surely he of all people would've stood to benefit from any alternative technologies they might have had"

                My reading is that with this technology, they would've shot 42 people instead of only 1, but only 8 of the shot people would've actually been of interest. (Actually shooting anyone is another topic)

                1. Anonymous Coward
                  Anonymous Coward

                  Re: Help with "Innovative Solutions"

But terrorists are tricky; many of them aren't black, so they can slip by the police undetected.

                2. veti Silver badge

                  Re: Help with "Innovative Solutions"

                  How many of these 42 people were shot, exactly? How many were arrested?

                  How many even knew that they'd been "identified"?

                  1. Rich 11 Silver badge

                    Re: Help with "Innovative Solutions"

                    How many even knew that they'd been "identified"?

                    Presumably they knew they'd been identified by some means when a copper came up to them and said, "Mr X? I'd like to speak to you about your outstanding fine / latest heist / Great Escape." But for whatever reason they still weren't worth arresting.

                    1. Prst. V.Jeltz Silver badge

                      Re: Help with "Innovative Solutions"

                      My reading is that with this technology, they would've shot 42 people instead of only 1, but only 8 of the shot people would've actually been of interest. (Actually shooting anyone is another topic)

No, no they wouldn't be shot. Everybody gets up in arms about this because they think the police are going to release ED-209 into a crowd to gun down whatever it wants. I would suggest the system shows a copper two photos with the caption "I think this guy is this guy", and then, if the copper agrees, they take some action. If the system scanned 10,000 faces at Notting Hill and made 42 suggestions of which 8 were correct, that's pretty fucking good going. I don't think a copper stood watching the crowd on his own would get 8 results.

                      Presumably they knew they'd been identified by some means when a copper came up to them and said, "Mr X? I'd like to speak to you about your outstanding fine / latest heist / Great Escape." But for whatever reason they still weren't worth arresting.

Again, no. They wouldn't know they'd been scanned, because the copper would have vetoed the machine if he didn't think the two pictures were of the same guy.

If he did, and it wasn't, then fair enough: an innocent guy looks near-identical to a wanted crim. What can you do? You need to see his ID.
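For scale, those hypothetical Notting Hill numbers work out like this; a quick sketch of the base-rate arithmetic, using the commenter's figures rather than any official Met statistics:

```python
scanned = 10_000   # faces scanned in the crowd
flagged = 42       # matches the system suggested
correct = 8        # suggestions that really were the wanted person

precision = correct / flagged          # chance a flagged person is actually wanted
false_positives = flagged - correct    # innocent people pulled aside for an ID check
fp_rate = false_positives / scanned    # fraction of the whole crowd wrongly flagged

print(f"precision: {precision:.1%}")            # 19.0%
print(f"false positives: {false_positives}")    # 34
print(f"false-positive rate: {fp_rate:.2%}")    # 0.34%
```

Even at that generous hit rate, 34 innocent people in the crowd get stopped, and four out of five stops are of the wrong person.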

                      1. Prst. V.Jeltz Silver badge

                        Re: Help with "Innovative Solutions"

I expect many downvotes...

                      2. Anonymous Coward
                        Anonymous Coward

                        Re: Help with "Innovative Solutions"

"If he did, and it wasn't, then fair enough: an innocent guy looks near-identical to a wanted crim. What can you do? You need to see his ID."

                        What is this I.D of which you speak? I am not aware that I am required to have any or, indeed, to carry it with me.

                      3. SloppyJesse

                        Re: Help with "Innovative Solutions"

"If the system scanned 10,000 faces at Notting Hill and made 42 suggestions of which 8 were correct, that's pretty fucking good going. I don't think a copper stood watching the crowd on his own would get 8 results."

                        And there is the exact reason this is not the way to test the effectiveness of this technology.

                        We do not know how many valid targets there were in the population checked.

                        What they should be doing is recruiting a bunch of volunteers, putting them (and only them) into the system and then sending them into a crowd. Then we'd be getting sensible information to judge effectiveness.
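A controlled trial of that sort would yield both precision and recall, since for once the number of true targets in the crowd is known. A minimal sketch, with all figures hypothetical:

```python
# All numbers invented: a controlled trial with known targets seeded into the crowd
volunteers_in_crowd = 20   # faces enrolled in the system and sent into the crowd
alerts = 15                # matches the system flagged
true_hits = 12             # alerts that really were volunteers

precision = true_hits / alerts               # how often an alert is right
recall = true_hits / volunteers_in_crowd     # how many known targets it actually found

print(f"precision: {precision:.0%}, recall: {recall:.0%}")  # precision: 80%, recall: 60%
```

With volunteers seeded into the crowd, recall becomes measurable, which a live deployment can never tell you: in the wild you have no idea how many wanted faces walked past unflagged.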

                        1. katrinab Silver badge

                          Re: Help with "Innovative Solutions"

                          My understanding is that it made 42 suggestions of which 0 were correct.

                          The facial recognition system has only once in recorded history correctly identified someone on the database, and that person shouldn't have been on the database as they were no longer of interest to the police.

                      4. ibmalone Silver badge

                        Re: Help with "Innovative Solutions"

                        Or you have the case of Steve Talley https://web.archive.org/web/20190518102457/www.copblock.org/152823/denver-police-fck-up-again/ https://denver.cbslocal.com/2016/09/15/former-financial-advisor-wrongly-accused-of-bank-robbery-fights-to-win-life-back/

                        In short, he was arrested twice, once with extreme force by a SWAT team, and spent months in prison. But the computer said it was him, so tough luck.

In the case of Jean Charles de Menezes, the police seem to have been operating under such a state of hysteria that you can quite easily see a dodgy facial recognition match leading to a shooting; it's not at all far off what actually happened.

                        1. Woodnag

                          Re: Help with "Innovative Solutions"

                          It's not just that Mr Menezes was murdered that's the problem, but the lies about the circumstances to make the guy appear suspicious that were instantly shovelled out and regurgitated by the press.

The Met really doesn't suffer from much accountability. The UK lost "S and Marper v United Kingdom" 11 years ago and still hasn't deleted that illegal DNA database.

                      5. Loyal Commenter Silver badge

                        Re: Help with "Innovative Solutions"

Everybody gets up in arms about this because they think the police are going to release ED-209 into a crowd to gun down whatever it wants.

I think you may have a bit of a straw-man argument there. The actual impact of falsely identifying someone as a suspect, especially if it happens repeatedly, should be obvious. How would you like it if, on your way home from work, you were flagged up as a suspect and arrested? Sure, you'd get released again. It's not exactly convenient for this to happen though, is it?

                        For a case in point, why not ask the guy who made the news in Bristol, who was tasered outside his home, because some particularly overzealous plods from Avon and Somerset Constabulary thought he was a wanted suspect. Particularly embarrassing for them, since the guy in question was actually a community police liaison bod for the black community in St Pauls, which kind of begs the question, "did you just see a black guy with dreadlocks and think, 'We're looking for a black guy with dreads', and nab him?"

And that's without having a computer make the wrong decision for you.

                  2. Nick Kew

                    Re: Help with "Innovative Solutions"

                    How many even knew that they'd been "identified"?

                    Judging by the comments here, about 318276 of the 42.

              2. really_adf

                Re: Help with "Innovative Solutions"

                "Surely what matters with such cameras is what they do with the information."

                Absolutely, but in general people seem to trust what computers say more than I think they should.

                Yes, facial recognition may have prevented the tragedy in Stockwell, but the concern due to the above is how to ensure it doesn't end up causing more such tragedies because "computer says he's armed and dangerous".

                Unfortunately, I fear the answer will come too late for some, but research like that reported here offers some hope that fear will not be realised.

      2. Allan George Dyer Silver badge

        Re: Help with "Innovative Solutions"

        "The UK are the world leaders in CCTV deployment by a very wide margin."

        China is intent on surpassing the UK very soon. On a trip last year I saw truckloads of new CCTV cameras being installed.

      3. Fruit and Nutcase Silver badge

        Re: Help with "Innovative Solutions"

        The UK are the world leaders in CCTV deployment by a very wide margin

The introduction of "Smart Motorways", where they repurpose the "hard shoulder" (emergency lane) as another traffic lane and monitor the road with continuous CCTV coverage for stranded vehicles, means that for long stretches of road (12 miles, spanning several junctions in the case of the M3) you are under constant surveillance. Couple that with ANPR and real-time mobile network mast metadata...

  3. Anonymous Coward
    Anonymous Coward

    It's in its infancy, but it will improve

    The knuckle-dragging Luddites always get in a froth when new technology is applied.

    * You have a personal tracing device in your pocket RIGHT NOW (your phone).

    * You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)

    * You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)

    * You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)

    * You are happy to be tracked RIGHT NOW (advertising)

If you are happy with all of those (and it seems you are, given the up-take), then why are you getting your gusset in such a twist over the Met applying technology to public safety? Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on.

    If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies as the AI system won't have the inherent biases of the prejudicial police officers.

The technology still needs to advance, but once it has it will be a MASSIVE BOON to society by helping us to identify risk individuals before they would have become known by traditional means. More importantly, it will help prevent the police from wasting their time on innocent people who happen to be "the wrong colour".

    Get your heads out of your collective bum.

    1. Paul Hargreaves

      Re: Get burned?

      > 30 seconds out of your day to provide ID and carry on.

      Just to note, assuming you're carrying ID. What if you're not?

    2. Pontius

      Re: It's in its infancy, but it will improve

      Troll, Useful Innocent or Useful Idiot? So many words, so little sense.

      1. Androgynous Cupboard Silver badge

        Re: It's in its infancy, but it will improve

5 bullet points, 4 clearly wrong and one (phone tracking) subjective. Repetition, overuse of capitals and, inevitably and somewhat ironically, anonymous. I considered troll, but on reflection I am leaning towards idiot.

        1. Jimmy2Cows Silver badge

          Re: It's in its infancy, but it will improve

          Trolliot? Iditroll?

          1. TimMaher Silver badge

            Re: It's in its infancy, but it will improve

            Maybe Idolitroll?

            A: Somebody who idolises trolls.

    3. smudge
      Big Brother

      Re: It's in its infancy, but it will improve

      Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on.

      I doubt if the family of Jean Charles de Menezes will share your confidence.

      But even if you don't get shot, you could certainly get into trouble that would take a lot longer than 30 seconds to get out of.

      If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies

      It will make eff all difference to stop and search. You don't have to be on a watchlist to be stopped and searched. The police officer merely has to have "reasonable grounds" to suspect that you are carrying something dodgy. AI facial recognition will not affect that in the slightest.

      Icon of BB to give you a thrill.

      1. Scroticus Canis
        Meh

        Re: Jean Charles de Menezes wasn't a victim of facial recognition cameras.

The failing there was the good old Mark One Eyeball of the current Met Commissioner, who was the "Gold Commander" on that botched op, and her underlings. Sweet FA to do with automatic facial recognition.

        So your point mentioning him was?

        1. Richard 12 Silver badge
          Facepalm

          Re: Jean Charles de Menezes wasn't a victim of facial recognition cameras.

          Better hope the computer doesn't say you're it, or you're dead too.

          People have a tendency to believe what the computer says, even (or perhaps especially) when it's blatantly wrong.

          1. veti Silver badge

            Re: Jean Charles de Menezes wasn't a victim of facial recognition cameras.

            Menezes' death was a tragedy, but that was fourteen years ago. There are people old enough to vote today, who are too young even to remember that story. If you can't come up with some more contemporary examples than that, you should consider the possibility that perhaps you really are making a lot of fuss about nothing.

            In a bad year (such as 2016 and 2017), police in the whole of the UK may kill as many as six people, including deaths in custody. That's far fewer, per population, than France, Germany, Italy, Australia or Canada, and don't even ask about the US. I know it may not feel like it, but the facts speak for themselves - the UK (still) has one of the most civilised police cultures in the world.

            1. Robert Carnegie Silver badge

              Re: Jean Charles de Menezes wasn't a victim of facial recognition cameras.

British police have indeed killed lots of people since Menezes, usually by trying to. Usually, either they put across the story that killing the suspect was unavoidable, or the victims were black or on drugs or mentally ill, and so there isn't much of a fuss.

https://www.inquest.org.uk/deaths-in-police-custody, if I'm counting right, shows about one death in British police custody per week since 1990. That evidently does include the Westminster Bridge terrorists, who it's difficult to dispute had it coming, but I think it's also roughly the rate of deaths at the hands of an abusive partner or a mentally ill person, which are considered to be undesirably many. Mind you, if your partner is a mentally ill police officer and does you in, then you'll be counted as all of those.

            2. Anonymous Coward
              Anonymous Coward

              At Veti

              Some of us are old enough to no longer be able to remember it

          2. Scroticus Canis
            Facepalm

            Wooosh - you miss the point - there was no automatic facial recognition in use ...

            ... so why is this killing relevant to the the use of said equipment?

            "At around 9:30am, officers carrying out surveillance saw Menezes emerge from the communal entrance of the block..." according to Wikipedia.

The eyeballing operative had CCTV prints of the suspects and thought he might be of interest. However, he had his dick in his hands at the time and thus could not film de Menezes to send images to another Dick who was the Gold Commander of the operation. Thus it's the dicks which caused the evolving cock-up to go lethal.

            So why link it to a technology which wasn't even in use at the time? Or do you think ancient CCTV is automatic facial recognition?

    4. Mr Dogshit
      FAIL

      Re: It's in its infancy, but it will improve

      * You have a personal tracing device in your pocket RIGHT NOW (your phone)

      No doubt my Doro can be traced to the nearest mast, but it doesn't phone home to Google every 10 seconds with GPS co-ordinates.

      * You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)

      No I don't.

      * You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)

      No I don't.

      * You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)

      No I'm not.

      * You are happy to be tracked RIGHT NOW (advertising)

      No I'm not.

      1. Lee D

        Re: It's in its infancy, but it will improve

        Apart from the smartphone, same.

        (No, Google, I don't want to use your "enhanced accuracy" location either, thanks. I'm quite happy with "to the nearest metre or so" plain old GPS, thanks.)

        1. Dagg

          Re: It's in its infancy, but it will improve

          Apart from the smartphone, same.

I have a smartphone; GPS and wifi are always off until I actually need to use them. Saves battery as well.

          1. TheMeerkat Bronze badge

            Re: It's in its infancy, but it will improve

            You are in an extremely tiny minority.

            1. illuminatus

              Re: It's in its infancy, but it will improve

              Ah, the wisdom of crowds...

          2. Richtea

            Re: It's in its infancy, but it will improve

> I have a smartphone; GPS and wifi are always off until I actually need to use them.

            You do realise that it's not beyond the wit of Google to enable location without your permission, right?

            https://crisisresponse.google/emergencylocationservice/how-it-works/

It fires an SMS with your location to the emergency services. You won't find the location SMS in your outbox; it's suppressed. Nice and silent.

In this case it's definitely for 'the greater good', Sergeant Angel, but there's no opt-in or opt-out.

The feature is driven by on-device logic built into Play Services, but it would be very little effort to target an individual device with one more flag: 'track this user on any interaction and send location SMS to emergency service 5'.

            1. Anonymous Coward
              Anonymous Coward

              Re: It's in its infancy, but it will improve

How are they gonna do that on my non-Android phone?

          3. Drew Scriver

            Re: It's in its infancy, but it will improve

Unless you periodically test what your phone is tracking and sending, all you have is the companies' and the developers' word that it's indeed not turned on.

Even old VHS cameras had a feature to disable the red recording indicator...

          4. Anonymous Coward
            Anonymous Coward

            Re: It's in its infancy, but it will improve

            "GPS and wifi is always off until I actually need to use it. Saves battery as well."

            (ignoring the fact that they can track you from the cellphone masts)

BBC Weather is starting to piss me off, as it has just started to prompt for 'track your location' despite having worked perfectly well for years (and across 5 devices) with location turned off and just 'Heathrow' as a favourite location.

        2. doublelayer Silver badge

          Re: It's in its infancy, but it will improve

          * You have a personal tracing device in your pocket RIGHT NOW (your phone).

          With as much tracking turned off as I can, and if I was worried that people were actively tracking me with it, I'd leave it at home.

          * You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)

          None of those. A few things have microphones and internet connections but I've set them up and know what they're doing. If I was worried that people were actively tracking me with them, I'd disconnect either the microphone or the connection.

          * You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)

          None of those at the moment, but I once had an activity tracker that I gave away because I didn't use it. It monitored my heart rate during exercise, and could send it to my phone but I never enabled that. So it was a tracker whose tracking data only went to me, and it lacked the technical ability to report on me. If I was worried that people were actively tracking me with it, somehow circumventing the limitations of the device making this impossible, I'd leave it behind.

          * You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)

          None of those. I prefer passwords to log into my computer, and no Facebook account. If I did use a facial recognition system, I'd do so in such a way that the recognition was done using local processing on local data only.

          * You are happy to be tracked RIGHT NOW (advertising)

          I am not happy. That's why I have ad blockers, tracker blockers, and a DNS filter. Even that is tracking for economic purposes, not complete surveillance, so is not as bad an abuse as what has been considered (and done already) by governments.

      2. Neil Barnes Silver badge
        Big Brother

        Re: It's in its infancy, but it will improve

        What Mr Dogshit said, in spades.

        And the secondary points: as a matter of law, should the police be allowed to keep my images, movements, fingerprints, DNA etc. if they don't arrest, charge, and find me guilty? I don't think so; somewhere in the deep and dusty corners of UK law there is the presumption of innocent until proven guilty.

        This kind of thing is basically saying "Hey Mr Citizen, you're a criminal. We're just waiting to find out what the crime is." My sympathies are entirely with that chap who was arrested for covering his face; I would have done the same.

        I'm not sure which I dread more: a facial recognition system with a massive error rate, or one that's a hundred percent accurate...

        1. jmch Silver badge

          Re: It's in its infancy, but it will improve

          "as a matter of law, should the police be allowed to keep my images, movements, fingerprints, DNA etc. if they don't arrest, charge, and find me guilty?"

          No, they are not allowed.

          Yes, they still do it anyway with no consequence

        2. John Brown (no body) Silver badge

          Re: It's in its infancy, but it will improve

          "somewhere in the deep and dusty corners of UK law there is the presumption of innocent until proven guilty."

I'm sure you know this and it was just a brain-fart, but the operative word is "unless", not "until". "Until" presumes guilt; they just don't know what of, yet.

          1. Neil Barnes Silver badge

            Re: It's in its infancy, but it will improve

            Indeed. My typo. I was trying so hard not to write 'until'...

            1. TimMaher Silver badge

              Re: It's in its infancy, but it will improve

              Unfortunately, if things keep going this way, “until” was absolutely right.

      3. DiViDeD

        Re: It's in its infancy, but it will improve

"* You have a personal tracing device in your pocket RIGHT NOW (your phone)"

No, no you don't. The GPS receiver in your phone knows where you are (the clue's in the word 'receiver'). It doesn't tell anyone else where you are unless you've foolishly set it up to broadcast your position. Why do you think trans-Pacific flights disappear from ground station view even though those on board know exactly where they are?

        Your mobile service will know you are within the range of mobile tower X, if anyone bothers to go check the logs, but nobody knows where you are to the metre apart from you. Unless, as mentioned, you've decided to broadcast your position to all and sundry.

        EDIT: Could someone please tell El Reg's spillchucker that that is, indeed, how you spell 'metre'?

        1. Portent

          Re: It's in its infancy, but it will improve

          Yes you do have a tracking device in your pocket. Google has previously been found to track you based on mapping all the wifi routers in your area. Based on the strength of each signal it is able to position you surprisingly accurately. That's why, whenever you turn on GPS on an Android phone, it asks if it can track wifi to make it 'more accurate'.
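Wi-Fi positioning of this kind is typically some variant of a weighted centroid over known router locations, using received signal strength as the weight. A toy illustration of the idea; the coordinates and RSSI figures here are invented, and Google's actual algorithm is proprietary and far more sophisticated:

```python
# (x, y) positions of known routers, and received signal strength in dBm
observations = [
    ((0.0, 0.0), -40),    # strong signal: the phone is close to this router
    ((10.0, 0.0), -70),   # weaker signals from routers further away
    ((0.0, 10.0), -75),
]

# Convert dBm to a linear weight: stronger signal -> much larger weight
weights = [10 ** (rssi / 10) for _, rssi in observations]
total = sum(weights)

# Weighted centroid of the router positions
x = sum(w * pos[0] for (pos, _), w in zip(observations, weights)) / total
y = sum(w * pos[1] for (pos, _), w in zip(observations, weights)) / total

print(f"estimated position: ({x:.2f}, {y:.2f})")  # lands very near the strongest router
```

Because the mapping from dBm to linear power is exponential, the strongest router dominates the estimate, which is why a dense map of routers gives surprisingly tight positioning.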

          1. Jimmy2Cows Silver badge

            Re: It's in its infancy, but it will improve

            I never understand why people leave their wifi on all the time. Battery hog, security risk and tracking risk. Just turn the thing off until you need it.

        2. Loyal Commenter Silver badge

          Re: It's in its infancy, but it will improve

          EDIT: Could someone please tell El Reg's spillchucker that that is, indeed, how you spell 'metre'?

          You do know that spielchucker is in your browser, don't you?

          1. BrownishMonstr

            Re: It's in its infancy, but it will improve

            Absolute lies, every website has to implement their own spell checker.

      4. Phil Kingston

        Re: It's in its infancy, but it will improve

        Even trying to actively avoid all that stuff you leave enough of a trail for someone with the right access to get a good handle on you.

    5. mr-slappy

      Re: It's in its infancy, but it will improve

      "If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies as the AI system won't have the inherent biases of the prejudicial police officers"

      Um, how is the AI going to be trained?

      1. Lee D

        Re: It's in its infancy, but it will improve

The "AI" (pfft) has proven itself to be far more biased, and has much more trouble picking out features on less-contrasting skin tones (i.e. darker ones with no lighter features, as opposed to lighter ones which universally have darker features in places).

    6. David 18

      Re: It's in its infancy, but it will improve

      Ignore it, it's obviously a poorly programmed Russian or Chinese Troll-Bot.

    7. Chris G Silver badge

      Re: It's in its infancy, but it will improve

      Matt! Is that you?

      1. Sir Runcible Spoon

        Re: It's in its infancy, but it will improve

        Impossible, the lack of the word 'sheeple' is ample proof.

        Besides which I don't believe even Matt could be that asinine.

    8. Cynic_999

      Re: It's in its infancy, but it will improve

I am aware of the way that commercial organizations use technology to track private citizens, and I am far from happy with the way that has developed; I doubt that anyone else who understands what's going on is happy either. But the threat from commercial exploitation, where the motive is profit, is nothing compared to the damage a state actor can do to people through its use or misuse of the technology. The government's chief motive, no matter what rhetoric it uses to justify what it's doing, is to control the population to make us collectively behave in a way that is beneficial to those in power.

      I recall when CCTVs were first being introduced. There were many articles about how it would make us all safer etc. The citizens of one large village thought it sounded like a great idea and petitioned to get CCTV installed ASAP to stop the small but annoying amount of graffiti and vandalism. Within days of the CCTV being installed it was successfully cutting down on crime - the local pub landlord was successfully prosecuted for allowing the locals to stay too long after closing time, and the local parking wardens were increasing the number of fines 10 fold by using the CCTV to look for illegally parked vehicles. Not exactly the sort of crime reduction the people had in mind. And the graffiti and vandalism? They remained unchanged - the police said the CCTV was ineffective as the culprits covered their faces and didn't stay around long enough to be caught in the act.

    9. Anonymous Coward
      Anonymous Coward

      Re: It's in its infancy, but it will improve

      Not all of us are tracked in the way you suggest. My smartphone has data and location services turned off most of the time, and the cell phone tower data is not available to the police without a warrant.

      I don't use Facebook, Facetime, or any video messaging service (although I have a mostly unconfigured Facebook login)

      I don't have an Amazon, Google, Smart TV or any other voice assistant device in my home, and I resist having IoT devices as well.

      I can't do much about advertising tracking, I admit, but that is fairly minor, especially when I say "No" to sharing location information when I visit web sites. I regularly clear my cookie cache.

      If the facial recognition were a "compute hash, check hash against watchlist, delete hash if not on watchlist" system, then I would reluctantly support this technology. But it won't be. As we've seen from several reports about fingerprint and DNA data (and I would also expect ANPR info and congestion/ULEZ info), the police are reluctant to discard data even when they are legally obliged to do so, "just in case it proves useful later", and I suspect they may want to positively identify everybody that comes into the field of view regardless of whether they are on any watchlist.
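The "compute, check, delete" flow described there could be sketched roughly like this (illustrative Python only; the embedding format, similarity function and watchlist are hypothetical placeholders, not anything any real deployment is known to run):

```python
# Illustrative sketch of a privacy-preserving watchlist check: a face
# embedding is compared against the watchlist and then immediately
# discarded if there is no match. All names here are hypothetical.

def check_against_watchlist(face_embedding, watchlist, threshold=0.6):
    """Return the matching watchlist entry ID, or None.

    Crucially, the embedding is never stored: if there is no match,
    the caller holds no record that this face was ever seen.
    """
    best_id, best_score = None, 0.0
    for entry_id, entry_embedding in watchlist.items():
        score = similarity(entry_embedding, face_embedding)
        if score > best_score:
            best_id, best_score = entry_id, score
    return best_id if best_score >= threshold else None


def similarity(a, b):
    # Placeholder metric: cosine similarity between two embeddings.
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0
```

The point of the commenter's objection is that the "delete if no match" step is exactly the part a data-hungry force has every incentive to skip.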

      I'm dreading the town centre and in-shop CCTV footage being automatically scanned by machine, because it will seriously undermine our rights, and it's not so far-fetched with some of the cloud services available off-the-shelf now.

      You only have to listen to the many, many broadcast interviews with members of the police to realize that the police regard everybody as suspects.

      1. Anonymous Coward
        Anonymous Coward

        Re: It's in its infancy, but it will improve

        The old line (quoted from police spokespeople here multiple times in the past): "There are those staying just on the right side of the law that we are very keen to do something about and that <insert group> are demanding action over"

        Essentially: we don't like those who challenge our authority. AC because Police Scotland smashed in the door of the last person who challenged their authority in a seven-man armed raid.

      2. Richtea

        Re: It's in its infancy, but it will improve

        > My smartphone has data and location services turned off most of the time, and the cell phone tower data is not available to the police without a warrant.

        Not true in the UK, in 'special' cases. You ring on Android, they know your location - and you didn't opt in:

        https://crisisresponse.google/emergencylocationservice/how-it-works/

        1. Anonymous Coward
          Anonymous Coward

          Re: It's in its infancy, but it will improve

          People keep talking about Android like anyone who gives a shit about security is using an android phone. Pfft.

    10. Captain Hogwash
      Holmes

      Re: It's in its infancy, but it will improve

      The technology still needs to advance, but once it have [sic] it will be a MASSIVE BOON to society the party by helping us to identify risk dissident individuals before they have would have become known by traditional means. More importantly, it will help prevent the police from wasting their time and innocent people who happen to be no longer voting for "the wrong colour".

    11. jmch Silver badge

      Re: It's in its infancy, but it will improve

      Dear AC.

      I score 1/5 on your list (just the phone). Location tracking and history is turned off (although I am aware that Google could be saying it's off and collecting it anyway). I am also equally aware that my mobile service provider tracks me from mast to mast, and would still do so if I had a dumb phone rather than a smart phone. That data is secured and timebarred by legal provisions backed up by massive possible fines for misuse or loss due to botched security. Nothing's 100% secure, but it's secure enough for me to balance against the convenience of being able to make and receive calls at any time and have a super-powerful portable nanocomputer always available.

      Face recognition by the Met fails on so many counts, starting with "trawl vs index search". If they are looking for a single individual and search their archives for matches of that individual, that's an index search. If they're trying to match every feed they have against every known person in their database, it's a trawling expedition. The first MIGHT be OK if they have a warrant; the second is a big no-no. That's before arriving at the actual legal basis to scan real-time and/or store this data (of which there is none), and the technical capability of the system to accurately match people (which seems to be equivalent to that of Mr Magoo).

    12. Anonymous Coward
      Anonymous Coward

      Re: It's in its infancy, but it will improve

      "* You have a personal tracing device in your pocket RIGHT NOW (your phone)."

      Running Symbian and off, so ... technically yes, but to all intents and purposes, no.

      "* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)"

      No (well, I have a smart TV, but it can't talk to the internet, due to not being connected to anything)

      "* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)"

      I do not

      "* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)"

      Swing-and-a-miss

      "* You are happy to be tracked RIGHT NOW (advertising)"

      uBlock Matrix + NoScript says probably not.

      "Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on."

      Or, more likely, "computer says you're a criminal, the ID's probably fake, come with us while we get it checked" (see how they handle photographers in London*). That will appear on the enhanced disclosure in the DBS process.

      If you don't wish to search, try:

      https://www.theguardian.com/uk/2010/may/10/stop-search-photographer-grant-smith

      https://www.theguardian.com/uk/2009/dec/08/police-search-photographer-terrorism-powers

      "If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies as the AI system won't have the inherent biases of the prejudicial police officers."

      What do you think an AI trained on an imperfect, imbalanced data set will behave like? If you need a clue, have a look at other articles on here ...

      "Get your heads out of your collective bum."

      Maybe someone needs to take their own advice ...

    13. Anonymous Coward
      Anonymous Coward

      Re: It's in its infancy, but it will improve

      The thing is... I'm not happy about the tracking. Unfortunately it's becoming harder and harder to avoid. Just because it's all-pervasive doesn't mean willingness or acceptance, and sooner or later this will all come to a head.

    14. Anonymous Coward
      Anonymous Coward

      Re: It's in its infancy, but it will improve

      Oh look, it's a Met police worker, or just one of the usual "useful idiots".

    15. Doctor Syntax Silver badge

      Re: It's in its infancy, but it will improve

      * You have a personal tracing device in your pocket RIGHT NOW (your phone).

      Only when I remember to take it with me. And I minimise what's on it.

      * You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)

      You might have them. I don't and won't. I can't imagine anything they'd be useful for. TV smarts are provided by MythTV and Kodi. I control them, not the other way around.

      * You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)

      Nope. Again, can't imagine having a use for them. As to internet connected fridge - ROFLMAO.

      * You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)

      Facebook? Absolutely not. Windows, Apple? No, Linux.

      * You are happy to be tracked RIGHT NOW (advertising)

      Poor little A/C. Never heard of ad blockers, NoScript and all the rest of the armour the security minded use.

      Apart from any other consideration has it not occurred to you that one of the requirements to live freely under the law is that the police should follow the law themselves? When it's considered likely that a challenge to their legality would likely be successful then we really should be concerned.

    16. Anonymous Coward
      Anonymous Coward

      Re: It's in its infancy, but it will improve

      "Get your heads out of your collective bum."

      I think that is where your head has been if you are totally unaware of the flaws with the current systems and the inability of the system to deal with the 'Wrong Colour' problem by simply matching *anyone* of Colour with *any* random 'person of colour' on file. (Not sure that is an improvement on the 'Stop & Search' we have now !!!)

      It is not good enough to say 'when the Technology advances' it will solve all our problems, as so many of the great Technological leaps never happen because the Technology is not as good as thought and the problem is a 'little' bit more difficult to solve than admitted !!!

      It will be all right on the night does not work with technology ........ ever !!!

    17. Cpt Blue Bear

      Re: It's in its infancy, but it will improve

      I fear, Mr Coward, you have completely missed your own point. You may have made that point accidentally and be completely unaware, mind.

      In my experience the people who tick your boxes are unaware of your points. Those of us who are aware avoid or mitigate their effects.

      Those points have been dealt with by other posters.

      What they haven't addressed is your "MASSIVE BOON" (initially mistyped as BOOB - make of that what you will). You seem to be a fan of arresting people based on "risk" rather than their actions. Welcome to the world of thought crime, guilt by association and arbitrary arrest. When you start arresting people for what they might do rather than what they have done, you have well and truly left any notion of justice far behind.

      I also note you say "us". Clearly, you don't ever envisage being on the receiving end of this. That is telling and makes me wonder exactly who should be removing their head from their posterior...

    18. Jimmy2Cows Silver badge
      FAIL

      Re: It's in its infancy, but it will improve

      Hmm let's go through your list...

      * You have a personal tracing device in your pocket RIGHT NOW (your phone).

      Wifi off. Data off. GPS off. Sure it's connected to a nearby mast but so is every mobile phone, being, you know, a basic requirement to work as a phone.

      Try again...

      * You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)

      My TV doesn't have a microphone. Don't see the point of digital 'assistants'. Games consoles, yeah, but they're off, and logging into their network services is my choice. I don't have to.

      Try again...

      * You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)

      Wrong on all counts. Try again...

      * You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)

      Let's see... don't use Facebook or any of that social media nonsense. Don't have any Apple products. PC camera is off unless needed for video conference. Plus the fact that facial *detection* is not the same as facial *recognition*.

      Try again...

      * You are happy to be tracked RIGHT NOW (advertising)

      I'm certainly not happy about it, however it's impossible to do my job without using a PC, and impossible to buy anything (physically or online) without accepting that I'm probably being tracked. So I tolerate it as an inescapable evil. I don't have to like it.

      Zero out of five.

      As to getting in a froth when new technology is applied: it's not that it is being applied, it's *how* it's being applied - questionable training sets, questionable accuracy, questionable retention policies, questionable legality.

    19. illuminatus

      Re: It's in its infancy, but it will improve

      "30 seconds out of your day to provide ID and carry on"

      Then multiply by all the times you will be asked to do it, by every person and their dog, because it's "30 seconds out of your day". Feature creep's a bitch.

    20. Anonymous Coward
      Anonymous Coward

      Re: It's in its infancy, but it will improve

      I'm sorry, apart from carrying a phone I have/do none of those things and I get by just fine.

    21. Toni the terrible
      Mushroom

      Re: It's in its infancy, but it will improve

      No,

      My mobile phone is often left at home.

      I have only the one smart TV, and it doesn't follow me around.

      I don't believe I have any behaviour monitoring devices, except for the PC. No home automation, as it is unneeded.

      I have nothing that uses facial recognition, and keep away from social media, except this site.

      I block all advertising, so I am not happy to be tracked by it.

      As another guy said, what's the requirement for me to carry 'papers' to be ID'd, and which ones will the Plod accept? Though I do carry some anyway - photo driving licence etc.

      Whether the systems will be non-discriminatory depends on the bias in the data set; as it is, if you are black you will be picked up by the system more often than if you are white (AI Constable Slaughter lives).

      So, until they get it right - or at least better than we do - we have legitimate concerns, so stick that up your lower orifice.

    22. Loyal Commenter Silver badge

      Re: It's in its infancy, but it will improve

      * You have a personal tracing device in your pocket RIGHT NOW (your phone).

      Actually, it's on the desk, and location services are turned off except for the apps that I allow.

      * You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)

      Nope, nope, and turned off except for when in use, when it's unlikely to be recording much of interest, as it's used mainly for streaming services, which my household members tend to shut up during.

      * You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)

      Okay, so I have an activity tracker, but do you know what? I have consented to what they do with my data there, and it's constrained heavily by GDPR. It's not like I don't have a choice. As for internet connected fridge - are you actually serious? I'd sooner have Talkie Toaster in my house.

      * You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)

      IIRC, on FB you have to turn facial recognition on, which I certainly would not do. I also cripple a lot of FB's tracking, etc. by the use of plug-ins (FB Purity for example) and ad-blocking. I don't believe Windows uses any sort of facial recognition - and good luck to it, since I only plug a camera into the thing when I need to (which is round about half past never). As for Apple? Well, just no.

      * You are happy to be tracked RIGHT NOW (advertising)

      NoScript, and AdBlock are prerequisites. Needless to say, not only am I not happy to be tracked by advertising cookies, I actively take measures to avoid that, as well as actively taking measures to not see the fucking things in the first place.

    23. Loyal Commenter Silver badge

      Re: It's in its infancy, but it will improve

      30 seconds out of your day to provide ID and carry on.

      Ihre Papiere, bitte!

    24. keith_w

      Re: It's in its infancy, but it will improve

      * You have a personal tracing device in your pocket RIGHT NOW (your phone).

      Do.

      * You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)

      Do Not

      * You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)

      Do Not

      * You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)

      Do Not

      * You are happy to be tracked RIGHT NOW (advertising)

      Am Not

      I do need the cell phone, so I put up with that. I do not need any of the other things, so I do not and will not have them in the house. And I am not happy to be tracked for advertising. Please do not assign to me your attitudes towards any of this stuff or anything else for that matter.

    25. Anomalous Custard

      Re: It's in its infancy, but it will improve

      >The knuckle-dragging Luddites always get in a froth when new technology is applied.

      Seems an odd insult to lob at a tech audience, but whatever.

      >* You have a personal tracing device in your pocket RIGHT NOW (your phone).

      Actually it's on my desk ;) And I have as much of the tracking stuff as I can turned off most of the time

      >* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)

      Nope. The TV is dumb, I have a phobia about digital assistants, and I don't use or enable voice on my consoles

      >* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)

      My activity trackers are tracking the activity of the drawers they're languishing in. My fridge cannot connect to the internet (unless it's gained sentience and can now walk to a computer). I do have "smart" bulbs (not my idea) - which will tell anyone tracking that we turn the lights on when it gets dark and turn them off around the same point every night.

      >* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)

      I have facial recognition on my Windows tablet. Which actually only seems to recognise one pair of glasses and not my actual face. Or any of my other glasses.

      >* You are happy to be tracked RIGHT NOW (advertising)

      My ad blocker etc usage would suggest otherwise.

      My tracking bingo card tells me you forgot to mention travel cards such as Oyster, bank cards, loyalty cards and online shopping.

      >If you are happy with all of those (and it seems you are given the up-take), then why are you getting your gusset in such a twist over the Met applying technology to public safety? Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on.

      I don't live in a country where carrying ID is mandatory, and as I have no need to carry it on a regular basis I don't.

      >If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies as the AI system won't have the inherent biases of the prejudicial police officers.

      Hahahahanope. As others have mentioned it has difficulties telling darker skinned faces apart(*) , and add to that the biases of those who will be training it, then it's really not going to make things better.

      >The technology still needs to advance, but once it have it will be a MASSIVE BOON to society by helping us to identify risk individuals before they have would have become known by traditional means.

      We've had incidents where people have been deemed to be at risk of causing violence by those closest to them, who have reported those people to the authorities, who've not taken action. This is people being reported by those who know them well, and who are in a good position to judge change in character, behaviours etc. What on earth makes you think facial recognition will improve this? How does "this face looks vaguely like this other face" improve on "this person has become more extreme in their views and I have reason to believe they will carry out their threats of violence"?

      >More importantly, it will help prevent the police from wasting their time and innocent people who happen to be "the wrong colour".

      Only if it improves enough to be able to detect differences in all skin tones equally. Only if accuracy improves so police aren't wasting their time chasing down people who look a bit like someone else. Only if you think the police fail to apprehend the "correct" people because they're wasting time chasing after innocent people. Only if you think human biases won't affect how human police officers interpret the results of the AI.

      >Get your heads out of your collective bum.

      I think it's you who has your head in yours.

    26. martinusher Silver badge

      Re: It's in its infancy, but it will improve

      Pushing back against facial recognition is a bit of a waste of time. Facial recognition is what cops do, so denying them the use of a machine that helps them do it is just not going to work. Sure, facial recognition is inaccurate, but it's probably no worse than being identified by a witness (something that's notoriously inaccurate - but nobody tries to ban eyewitnesses).

      Where you need to concentrate the fight is on things like spurious criminal charges arising from concealing your face. Sure, it's inconvenient for law enforcement that we're all neither barcoded nor microchipped like a pet, but that's all part of their job - nobody mandated that criminals and public alike were obliged to make their job easy.

      1. doublelayer Silver badge

        Re: It's in its infancy, but it will improve

        "Pushing back against facial recognition is a bit of a waste of time. [...] Where you need to concentrate the fight is things like generating spurious criminal charges arising from concealing your face. [...]"

        I'm not sure whether to upvote you for your last point, downvote you for your first point, or just boggle at how your last point almost directly contradicts your first point. Facial recognition equipment is in the same category as charging people for not letting them use their facial recognition equipment on you. They're two sides of the same coin, yin and yang. Since we both agree that charging people for hiding their faces is wrong, let's look at the first point. Having that equipment allows them to do the same kind of tracking. It makes it impossible for citizens to have privacy unless they specifically try to, in which case they will be charged. It is not a thing we should just accept, because in addition to it actually being illegal according to current laws, it is so unpalatable to those who like human rights that it should be made even more illegal through additional legislation.

        Your comment that "Facial recognition is what cops do so denying them the use of a machine that will help do this is just not going to work" is rubbish for two primary reasons. First, there are plenty of things that cops do, and we accept, but we don't want to extend their abilities. Cops search suspects' houses for incriminating information, when they have a warrant. We could extend this by not requiring a warrant, but we don't because we don't want the police to have that power. We only want them to search places when they have a warrant to do so. Second, facial recognition is not the primary job of a police officer. Even those officers who work directly in public and not, say, investigating existing crimes aren't there to look at everyone's face and determine if they have seen it on a list. They're there to identify crimes and safety risks and deal with them. In almost all cases, they have not seen the perpetrator before, but they still go after them. If the police said they were going to throw away this system and instead employ a bunch of officers whose job it was to go to everyone and stare at their face to identify whether it's on a list, I wouldn't be any happier.

    27. Kiwi
      FAIL

      Re: It's in its infancy, but it will improve

      * You have a personal tracing device in your pocket RIGHT NOW (your phone).

      Nope, often leave the phone at home, or in the car, or...

      * You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)

      No game console, no TV, no "digital assistant".

      * You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)

      Basic fridge, any "activity tracker" would die of boredom/lack of exercise, and I don't do enough to warrant "home automation".

      * You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)

      Linux, no social media (except El Reg), basic dumbphone

      * You are happy to be tracked RIGHT NOW (advertising)

      Adblockers, privacy tools and noscript.

      Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on.

      Nope, doesn't work like that. You're working on the assumption that the target is identified - what if you fit the description of a 'suspect' but there is no identification, i.e. a person fitting your description was involved in a crime somewhere in your local area? That ain't gonna be 30 seconds. Suppose the crime is serious enough and you cannot prove with absolute certainty where you were at the time (and bear in mind the pigs can change when the crime was committed on a whim, just to make sure they get you if they've taken a dislike to you: "yes, the CCTV timestamp says it was at 11:05 and the recorder's time is correct now, but since he can prove he was elsewhere at that time, maybe he himself hacked it, changed it, then changed it back"). You're accused of a crime; think the jury will believe your claims that you couldn't possibly have made such a hack? You're already guilty.

      So if you're accused of a crime you didn't commit and cannot prove instantly that you didn't do it - and your only chance of that is if they get someone who looks enough like you and confesses - then you might find yourself spending a few days or even months in prison awaiting trial. All coz the computer said you matched the description.

      But even if they're after an identified person, until the police test your fingerprints and perhaps DNA, your ID - assuming you're carrying it at the time - isn't proof. I've got a decent scanner and a laminator; I reckon I could probably whip up a passable fake ID fairly quickly (I've never tried it, so maybe not). Maybe it's bloody hard to do and I'd need equipment far beyond even what Bill Gates can buy, but the coppers are going to believe that fake ID is trivial and arrest you on the chance that it's fake. Again, you're at least spending the night in pokey till they can verify your ID, or get the person they're after. Sucks to be you if you work with stuff that destroys fingerprints, as even a small amount of damage means you'll be suspected of trying to hide your prints, so they'll make extra sure they have the right person before releasing you.

      But never mind. So what if an innocent person spends months in jail over a computer error? That'll make the rest of the world safer. I just hope you're the next innocent person to have to spend time in prison, you might think otherwise.

      If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies as the AI system won't have the inherent biases of the prejudicial police officers.

      Have you considered who the system is being tested/designed by?

      More importantly, it will help prevent the police from wasting their time and innocent people who happen to be "the wrong colour".

      And yet the evidence from the world over says otherwise.

  4. Abdul-Alhazred

    "Dubious Legality"???

    So by implication you guys still have rule of law?

    That's a relief. :)

    1. Yet Another Anonymous coward Silver badge

      Re: "Dubious Legality"???

      > you guys still have rule of law?

      Yes, it just doesn't apply to the police or government.

  5. adam payne

    "We are extremely disappointed with the negative and unbalanced tone of this report."

    We are extremely disappointed because it didn't sing our praises.

    the deployments have been successful in identifying wanted offenders.

    How many successfully identified over all the trials?

    1. David 18

      Exactly - with the sort of accuracy it seems to have, they would probably get a higher percentage of wanted offenders vs false positives if they just had a simple randomising program telling them when to stop a passer-by.

    2. Anonymous Coward
      Anonymous Coward

      "How many successfully identified over all the trials?"

      Thousands. The MPS just don't know what crimes have been committed by those individuals yet.

    3. veti Silver badge

      The fact is, every time we see these statistics we only ever see one side, usually the false positive rate.

      What was the false negative rate? Without knowing that, we don't know whether it's a good deal or not.

      If you're screening a million people, and you get 42 alerts, of which 8 turn out to be correct - that means you've checked out 42 people, instead of a million, to identify your 8 targets. That's a pretty good deal.

      If the original sample included another 1000 people who should have triggered matches, then - no, it's not good. But if it only included half a dozen or so, that's not bad.
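      The arithmetic in that comment can be made explicit with a quick sketch (the figures are the illustrative ones from the comment above, not real Met numbers):

```python
# Sketch of the trade-off described above: the same 42 alerts with
# 8 true hits look very different depending on how many real targets
# were actually in the crowd (the false-negative side we never see).

def screening_summary(alerts, true_hits, targets_present):
    false_positives = alerts - true_hits
    false_negatives = targets_present - true_hits
    precision = true_hits / alerts           # fraction of alerts that were right
    recall = true_hits / targets_present     # fraction of real targets caught
    return precision, recall, false_positives, false_negatives

# 42 alerts, 8 correct, and only 14 real targets present:
# 34 needless stops, but over half the targets caught.
good_case = screening_summary(42, 8, 14)

# Same 42 alerts, but 1008 real targets present: the system caught
# under 1% of them, so the 34 needless stops bought almost nothing.
bad_case = screening_summary(42, 8, 1008)
```

Without the targets_present figure (the false-negative side), precision alone cannot tell you which of these two worlds you are in, which is exactly the commenter's point.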

      1. doublelayer Silver badge

        I'm not a downvoter, but your question is unanswerable and missing the point. Nobody knows how many people were present, as that wasn't tested. Also, most of us here, myself included, are not that happy with an 80% chance of someone innocent being taken in for questioning on the back of a system that violates citizens' rights.

  6. Anonymous Coward
    Anonymous Coward

    Why do they keep saying it doesn't work?

    https://www.cognitec.com/independent-vendor-tests.html

    false non-match rates lower than 0.01, at a false match rate of 0.0001

    That took me a minute to find and it's not like they don't spend the cash on it.

    1. Anonymous Coward
      Anonymous Coward

      Re: Why do they keep saying it doesn't work?

      "false non-match rates lower than 0.01, at a false match rate of 0.0001"

      I'd be wary of those values; they come from a summary line that makes them look good, but quoted without understanding the underlying methods (or any reference to what they actually mean) they tell you nothing of value.

      Does that mean that 1 in 100 criminals won't be spotted?

      Or that, looking at 100 images of which one was a criminal, he will be missed?

      And the 0.0001 - does that mean that, looking at 1,000,000 images, 100 of those people will be mistakenly arrested?

      Understanding what it means in practice is very important, especially with dangerous 1984 shit like this.
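      One plausible reading of those headline figures - treating them as per-person-screened rates, which is an assumption, since the vendor summary doesn't say - can be sketched as:

```python
# Sketch of ONE reading of the vendor's quoted figures, treating FNMR
# as "fraction of watchlisted people missed" and FMR as "fraction of
# non-watchlisted people falsely flagged". Whether that matches the
# vendor's actual test methodology is exactly the open question above.

FNMR = 0.01    # false non-match rate (vendor figure)
FMR = 0.0001   # false match rate (vendor figure)

def expected_outcomes(people_screened, on_watchlist, fnmr=FNMR, fmr=FMR):
    """Expected misses and false alerts under the per-person reading."""
    not_on_watchlist = people_screened - on_watchlist
    missed = on_watchlist * fnmr             # wanted people not spotted
    false_alerts = not_on_watchlist * fmr    # innocents wrongly flagged
    return missed, false_alerts

# Screening a million people, 100 of whom are genuinely wanted
# (the watchlist size is an illustrative assumption):
missed, false_alerts = expected_outcomes(1_000_000, 100)
# Under this reading: roughly 1 wanted person slips through, and
# roughly 100 innocent people get flagged for a stop.
```

Note that because innocents vastly outnumber the wanted, even a tiny false match rate produces more false alerts than true ones - the base-rate problem behind the whole thread.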

      1. Androgynous Cupboard Silver badge

        Re: Why do they keep saying it doesn't work?

        Sizable upvote for Mr Coward and his correct understanding of the numbers

        I can only add some context: roughly two million people attend the Notting Hill Carnival each year. At the false match rate quoted, that is 200 innocent people considered for arrest, based on nothing more than an algorithm, over one weekend in just one small part of the city.

        1. John Brown (no body) Silver badge
          Thumb Up

          Re: Why do they keep saying it doesn't work?

          ...and even that is a dangerous assertion as it assumes that there were enough cameras in enough places to get a usable image of every attendee.

        2. Anonymous Coward
          Anonymous Coward

          Re: Why do they keep saying it doesn't work?

          Thank you.

          Also important is the officers' understanding of, and belief in, the accuracy.

          For example, on the Jeremy Kyle shit show, people were told that polygraphs ("lie detectors", which they are not) are 100% (or 90%+) accurate, a stupidly high figure. The reality is more like 60%, which is pretty much useless.

          But because of the lie that it was accurate, JK and his audiences berated someone to the point of suicide. The poor guy didn't know how crap the true rate was, and therefore could not defend himself or understand why he had failed.

          If police have the same misplaced confidence, then no matter how much a suspect flagged by a false positive tries to argue his innocence, the officer will just treat him worse, thinking he's a liar.

          That is a very dangerous situation: imagine a false match of a terrorist to an innocent person; guns get drawn, everyone gets twitchy, and bang...

          1. John Brown (no body) Silver badge

            Re: Why do they keep saying it doesn't work?

            "That is a very dangerous situation, imagine a false match on a terrorist to an innocent person, guns get drawn, everyone gets twitchy and bang..."

            They can even manage that without the use of expensive facial recog. Especially if you are Brazilian.

        3. Charlie Clark Silver badge

          Re: Why do they keep saying it doesn't work?

          I assume the new system is called Savage.

      2. Electronics'R'Us Silver badge
        Thumb Up

        Re: Why do they keep saying it doesn't work?

        Well said.

        I am reminded of a line popularised by Mark Twain (he attributed it to Disraeli, referring to politicians): "There are lies, damned lies, and statistics".

        1. Dagg

          Re: Why do they keep saying it doesn't work?

          My preferred version is

          "There are lies, Damned lies, Statistics, Advertising, political promises and religion"

      3. Anonymous Coward
        Anonymous Coward

        Re: Why do they keep saying it doesn't work?

        I'll give you that, I didn't read it properly. However, I still don't believe the Met and its assessment of accuracy, because if it were 100% accurate maybe people would sit up and take notice.

    2. Anonymous Coward
      Anonymous Coward

      Re: Why do they keep saying it doesn't work?

      https://www.theregister.co.uk/2018/05/15/met_police_slammed_inaccurate_facial_recognition/

      https://www.independent.co.uk/news/uk/home-news/met-police-facial-recognition-success-south-wales-trial-home-office-false-positive-a8345036.html

      "I have told both police forces that I consider such trials are only acceptable to fill gaps in knowledge and if the results of the trials are published and externally peer-reviewed. We ought to wait for the final report, but I am not surprised to hear that accuracy rates so far have been low as clearly the technology is not yet fit for use" - Professor Paul Wiles, UK Biometrics Commissioner.

  7. Electronics'R'Us Silver badge
    Big Brother

    Legal Basis?

    "The MPS maintains we have a legal basis for this pilot period and have taken legal advice throughout."

    As you are a public body and such legal basis has no national security implications, we would love to see that actual legal basis and transcripts of your legal advice (and we definitely want to know who you got it from).

    The reality appears to be that you sneaked something through without paying attention to all the relevant legislation, some parts of which may render all your expensive advice worthless.

    IANAL, but I do know technology (and probably better than the entire Met leadership team put together). AI / ML / NN is just the latest fad / buzzword (I really must get this on the grids of my bullshit bingo cards) that will not be properly ready for a long time yet.

    1. Warm Braw Silver badge

      Re: Legal Basis?

      The Annual Report of the Biometrics Commissioner is quite interesting in this respect. His assessment seems to be that there is no settled legal basis, because current legislation refers only to DNA and fingerprints (and his report is fairly damning about police handling of those). He specifically says that, while there has been some legislative interest in Scotland, there has been none in Westminster because of the Brexit stasis, and that the legality will be determined for the foreseeable future by case law:

      Two civil liberty groups, Liberty and Big Brother Watch, have sought judicial review against South Wales Police, the Metropolitan Police and Home Office, challenging the legality of the police action. Their concern is that the mass scanning and processing of the images of people in this way in public places is not proportionate as it constitutes a significant interference with the Article 8 rights of those affected and that such interference is “not necessary in a democratic society” or “in accordance with the law” under the European Convention on Human Rights (ECHR). We shall have to await the court judgments, but these cases are probably only the first challenges to the police use of new biometric technologies in trials. Actual deployment of new biometric technologies may lead to more legal challenges unless Parliament provides a clear, specific legal framework for the police use of new biometrics as they did in the case of DNA and fingerprints.

  8. Anonymous Coward
    Anonymous Coward

    "We want to ensure the public have complete confidence in the way we police London"

    Hmm. Are you absolutely sure you're going about these trials in the right way, then?

  9. Jim Birch

    Of course, the article should say that the current version of the technology has this score. We know it will improve incrementally over time.

  10. Fruit and Nutcase Silver badge

    How to improve development

    As part of testing the algorithms, they need a pool of people comprising the various stakeholders of the project, right up to the Home Secretary: senior officers from the Met, designers and coders of the algorithms, etc.

    Now, during testing (invite representatives from the community to act as test subjects), each time the system gets it wrong, the next one from the pool of stakeholders has to endure a full blast from a taser. That should make good prime time TV and the BBC could transmit it...

  11. simonb_london

    I sympathise

    It's about the same success rate I have when I think I recognise someone, say hello, and then........."Sorry, thought you were someone else"!

    1. Nick Kew

      Re: I sympathise

      So how well does that chat-up line work?

  12. TheMeerkat Bronze badge

    Whether we like it or not, this technology will be used, and not just by the police but also by private individuals. The only question is whether we will use technology developed in the West or buy it from China.

  13. Anonymous Coward
    Anonymous Coward

    Metropolitan Police's facial recognition tech not only crap, but also of dubious legality

    which means, it will be rolled out, regardless :(

  14. Anonymous Coward
    Anonymous Coward

    We fully expect the use of this technology to be rigorously scrutinised

    In plain English:

    We fully expect the public to be rigorously scrutinised.

  15. Roj Blake

    I'm Happy for the Police to do this...

    Just as long as I get a live video feed of all police control rooms and the ability to track the movements of all police officers whether or not they're on duty.

    1. Anonymous Coward
      Anonymous Coward

      Re: I'm Happy for the Police to do this...

      "the ability to track the movements of all police officers whether or not they're on duty."

      And politicians!

      And government 'officials'!

      Never forget to track them at all times, as well.

  16. Wincerind

    Hmm, I know it's a way off yet, but it's all starting to feel a bit "Minority Report".

  17. TimMaher Silver badge
    Alien

    Face masks

    By this point I would have thought that experienced commentards would have brought up the subject of pixelated face masks. Widely available on t’web.

    Looks like it’s down to me then.

    Sigh.

    1. Intractable Potsherd

      Re: Face masks

      The trouble is, they will probably be regarded as attempts to avoid facial recognition, and end up with an arrest, like the bloke down south who was lifted for trying to hide his face (okay, technically they got him for swearing, but the precipitating event was him hiding his face).

      If crime were very prevalent in British society, I might have a slightly different perspective on widely deployed facial recognition (though probably not very different; it is just plain wrong), but it isn't, and so I don't. The balance of rights versus responsibilities doesn't favour applying hugely privacy-infringing technology for the small benefit it would give.

  18. This post has been deleted by its author

  19. PapaD

    Optical masking

    Turns up in films and some sci-fi books.

    The idea is that you wear something that essentially makes it impossible for the camera to clearly see your face, using either infrared LEDs (though most decent cameras do have IR filtering) or some other form of non-visible light.

    Surely it can't be that long until someone comes up with something. I've seen at least one video of people using an image on a card that prevents at least one type of facial recognition software from detecting that you're even there. No idea how viable that would be unless you knew for sure the weaknesses of the system you would be recorded by.

  20. Anonymous Coward
    Anonymous Coward

    Doubtful legality...

    Like the researchers who installed a gait-recognition-enabled CCTV system they were creating in a university building without informing their dean, the relevant watchdog, campus security, or even checking with the university's ethics committee. They were mightily pissed off when a union rep called security, who, as the only authorised CCTV operators on site, made them remove the kit.

Biting the hand that feeds IT © 1998–2022