Cops love facial recognition, and withholding info on its use from the courts

Police around the United States are routinely using facial recognition technology to help identify suspects, but those departments rarely disclose they've done so - even to suspects and their lawyers. Documents concerning the use and disclosure of facial recognition technology were provided to the Washington Post as part of …

  1. IGotOut Silver badge

    The US justice system...

    ...where the more people in prison, the better for the gravy train.

    Innocent? Guilty? Who cares. Gotta make a profit.

    1. Helcat Silver badge

      Re: The US justice system...

      Less about profit and more about meeting targets: When conviction rates are used to mark success, does it matter if the person convicted was the right person, or even if a crime had occurred?

      If the investigation isn't sound, then why expose the poor quality of the evidence? Just push ahead, get that important conviction, and let the person contest it if they feel it's worth their while. It's the same approach as pressuring the accused into confessing by pointing out that they'll be locked up waiting for their court appearance for longer than they'd serve in prison for the offence itself. This is why so many of the UK 'rioters' were convicted so quickly: they were given the choice of waiting months for their court appearance, or admitting it and being free before they'd even have appeared in court. If the conviction doesn't carry a CRB mark, it's not worth waiting, so admit the crime, right? That it does leave a record, and that the record can be used in a later prosecution, isn't mentioned - and that gives the police more leverage to get a confession later for a different case.

      That's why I'm sceptical about confessions: it's not always clear whether the confession was genuine or coerced.

      1. MachDiamond Silver badge

        Re: The US justice system...

        "That's why I'm sceptical about confessions: It's not always certain that the confession was genuine or coerced."

        These days the deals being offered can be so good that it's better to confess. If you have been naughty and can get a felony cut down to a $500 fine as part of a plea, bonus. The cost of your attorney will be a lot more than that, but it's better than years behind bars playing Bubba's new special friend.

  2. Sora2566 Silver badge

    At this point, I'm not sure the US police would obey if the technology was banned outright.

    1. Yet Another Anonymous coward Silver badge

      ED-209, trained only on uniformed cops' ID card photos, sent onto the streets to find a white suspect disguised as a police officer

  3. martinusher Silver badge

    Facial Recognition Alone....

    ....is not viable. There has to be other information to confirm the hunch.

    I can't see much difference between a computer and a person -- especially one of those 'super recognizers' -- identifying a person. The big danger is relying exclusively on the computer: just because it fingers you doesn't mean you've been positively identified.

    1. Neil Barnes Silver badge
      Holmes

      Re: Facial Recognition Alone....

      It does sound as if the FR software is given a huge pile of pictures to choose from, and picks _one_. If it suggested a couple of dozen at least, with variable levels of confidence, then that might be acceptable as a basis to start investigating... but 'computer says you' is no way to identify a suspect. Human witnesses are notoriously vague about identification and it sounds as if robot witnesses are no better - but they're being treated as if they are.
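
      Something like this sketch is what I mean - all the names here are invented, and it assumes an embedding-based matcher, so treat it as an illustration rather than how any real product works:

        import numpy as np

        def shortlist(probe, gallery, k=24, floor=0.5):
            # Cosine similarity between one probe face embedding (d,)
            # and a gallery of N face embeddings (N, d).
            sims = gallery @ probe / (
                np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe))
            best = np.argsort(sims)[::-1][:k]  # k most similar entries first
            # Keep only candidates above a confidence floor - possibly none.
            return [(int(i), float(sims[i])) for i in best if sims[i] >= floor]

      An investigator then gets a couple of dozen leads with honest scores to rule out, instead of a single 'match'.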

    2. Anonymous Coward
      Anonymous Coward

      Re: Facial Recognition Alone....

      A human can complain when blamed for an error, and has rights.

  4. Gene Cash Silver badge

    Seen it before

    Remember when the cops hid the use of stingray devices as long as they could?

    I remember they'd even drop cases rather than reveal the stingray tech.

    1. Yet Another Anonymous coward Silver badge

      Re: Seen it before

      But that was because they were only used against tourists, er, terrorists

  5. Anonymous Coward
    Anonymous Coward

    Multiple Dangers

    It's corrupt to obscure (i.e. lie about) the use.

    I bet they are collecting general data from non-suspects - what gets deleted, and who checks?

    It makes privacy laws a stupid joke, one that only gets enforced against poor individuals or small businesses.

    It will be used to justify mistakes - it wasn't us, it was the AI.

    Who trains and audits the AI?

    Next step is Minority Report - his body language told the AI he was about to commit a crime.

    Final step: your chip implant told us you had a bad thought. We're almost there in Europe with people jailed for social media posts.

    1. Mungo Spanner

      Re: Multiple Dangers

      "We're almost there in Europe with people jailed for social media posts"

      Would those be the posts telling people to set fire to a hotel full of asylum seekers? 'Cos if so, then good.

      And, thanks I suspect to your vote, we are not in Europe - so don't blame them.

      1. Anonymous Coward
        Anonymous Coward

        Re: Multiple Dangers

        "And, thanks I suspect to your vote, we are not in Europe - so don't blame them."

        When was the UK removed from Europe?

        I think you meant the EU lol

    2. Phil Koenig Bronze badge

      Re: Multiple Dangers

      AC:

      It makes privacy laws a stupid joke...

      That would definitely be one of the potential problems - but first you would need some privacy laws to make a joke of.

      We are talking about America here, where privacy laws are so 1990s. Today we worship the Gods of Surveillance Capitalism, and they pay the politicians very well...

  6. Christoph

    They accused someone from a different state. So they are searching across the entire USA for a similar face. That's pretty well guaranteed to find a match somewhere!

    How long before they are extraditing people from the UK because they look vaguely like a criminal in California?

  7. Guy de Loimbard Silver badge
    Meh

    Funding issue or peddling tech?

    Is there such an uptick in crime that Facial Recognition is being used as a tool to reduce investigative time, or is it that most PDs are underfunded and under pressure to perform?

    I don't know about funding for Police Departments in the USA, but in the UK you'd be hard pushed to see a Police Officer on the street anymore, and I don't know if that's bureaucracy or underfunding.

  8. Eclectic Man Silver badge
    Boffin

    False positives

    Coincidentally, I am reading David Spiegelhalter's new book 'The Art of Uncertainty' (ISBN 978-0-241-65862-8)*, where he considers a hypothetical facial recognition system with a 0.1% false positive rate (pp 197-200). The problem is that so many faces are scanned, and so few people are on the database of 'wanted criminals', that false positives are often more numerous than correctly identified suspects. I doubt that enough Police Officers are trained appropriately to understand that, although the system has identified someone with 99.9% certainty, the fact that it sifted through images of 20,000 people at the 'scene' means it is more likely they have arrested an innocent person than a guilty one.

    *Very interesting, but takes a bit of concentration to understand the mathematics.
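
    For anyone who wants to check the arithmetic, here is a back-of-the-envelope version - the number of genuinely wanted people passing the camera and the system's hit rate are invented for illustration:

      scanned = 20_000   # faces scanned at the 'scene'
      wanted = 10        # invented: genuinely wanted people among them
      fpr = 0.001        # the 0.1% false positive rate from the book
      tpr = 0.90         # invented: system spots 90% of wanted faces

      false_alarms = (scanned - wanted) * fpr   # innocent people flagged (~20)
      true_hits = wanted * tpr                  # wanted people flagged (9)

      # P(actually wanted | flagged) - roughly 31% with these numbers,
      # so a flagged person is more likely innocent than guilty.
      precision = true_hits / (true_hits + false_alarms)
      print(f"{false_alarms:.0f} false alarms vs {true_hits:.0f} real hits "
            f"-> only {precision:.0%} of flags are genuine")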

    1. Yet Another Anonymous coward Silver badge

      Re: False positives

      Also add in that the training images are of arrested suspects while the scans are of the general public.

      If, for some unaccountable reason, the proportion of black faces in the 'arrest' set is higher than in the general population, it will bias the results.

      Train a model on 50% horses 50% zebras and send it out into a field with one zebra and it's going to pick your 'suspect'
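
      A toy simulation makes the point - every number here is made up and the 'faces' are just random vectors, but it shows how chance matches inherit the gallery's mix rather than the street's:

        import numpy as np

        rng = np.random.default_rng(42)
        dim, n_gallery, n_probes = 32, 2000, 500

        # Hypothetical arrest-photo gallery over-representing group A
        # (40% here) relative to the street population (say 13%).
        groups = rng.choice(["A", "B"], size=n_gallery, p=[0.4, 0.6])
        gallery = rng.normal(size=(n_gallery, dim))

        # Innocent passers-by, unrelated to anyone in the gallery.
        probes = rng.normal(size=(n_probes, dim))

        # Squared distance from every probe to every gallery 'face',
        # taking each probe's nearest gallery entry as its 'match'.
        d2 = ((probes**2).sum(1)[:, None] + (gallery**2).sum(1)[None, :]
              - 2 * probes @ gallery.T)
        nearest = groups[d2.argmin(axis=1)]

        # Tracks the gallery's 40%, not the street's 13%.
        print("False matches landing on group A:", (nearest == "A").mean())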

      1. MachDiamond Silver badge

        Re: False positives

        "Train a model on 50% horses 50% zebras and send it out into a field with one zebra and it's going to pick your 'suspect'"

        And this is why you can't replace a proper investigation with a computer. If FR lets you go from zero persons of interest to 3-4, that can be highly useful as a starting point. It's then appropriate to start ruling people out. That's not to say that's what they are doing. There was a story where a woman accused of theft was ID'd by FR and arrested. The problem was that it wasn't her: she was very pregnant, while the person on the CCTV obviously wasn't. Whoever was doing the investigation should be dismissed for that one. Computers can't cure stupid. The only mitigation is to keep stupid people away from anything important.

    2. martinusher Silver badge

      Re: False positives

      The bad news is that you don't train facial recognition on pictures of suspects or whatever; you let it loose on the interweb and all those millions of pictures people have been uploading for decades now. Apparently this is a very cheap, easy and accurate way to pick a face out of the crowd.

      The news hasn't hit El Reg yet, but it's on Ars Technica and The Verge. It was just a hack done by some college students... but it's been done before with just ordinary pictures, originally by some Russian startup if I recall correctly.

      The technology works. Pretending that it doesn't, or that you can make it go away by fiat, is wishful thinking. We have to learn to live with it by knowing its limits of use.

  9. spacecadet66 Bronze badge

    Well, nobody could have seen this coming.

    Apart from anyone with a passing familiarity with American police, anyway.

    1. MachDiamond Silver badge

      Re: Well, nobody could have seen this coming.

      "Apart from anyone with a passing familiarity with American police, anyway."

      It has nothing to do with police or being American. It applies more to politicians and "decision makers". Some company says they have a new tool that will .... blah blah blah. Since that tool will be sold to government, whether the tool works or not is moot as long as they get registered to sell to government; it also helps if they can be classed as woman/minority-owned or some other class that gives them extra points. Government buys the tool and deploys it, relying on the brochure that says how good it is.

      I used to see this with young engineers who had learned some FEA or CFD software but hadn't learned the underlying concepts well enough to critique the results. I remember one time when I told one that they should be up for a Nobel based on their work: they'd cracked faster-than-light information transmission. Turns out they hadn't, but they also hadn't spotted the error.

  10. Grinning Bandicoot

    Cameras

    There are a lot of assumptions built into any camera-derived, machine-generated ID. 1) The camera lens is clean and does not have any flaws. 2) The algorithm used adjusts for non-optimal views or distortions in its view - an excellent example is Google Earth's handling of overpasses; you know, those funny-shaped dips where two highways cross. 3) Laziness on the part of the low-bid agency doing the interpretation: look at enough pictures long enough and they begin to lose their distinctiveness, to coin a neologism. Start with one million suspects, winnow down to a dozen, and the last half blend together.

    The problem of lens distortion is such that even license plates, which are a lot more distinctive than a face, can be misread. Personal experience here: the City of Los Angeles sent me a Notice of Violation for a certain location and time while I had a receipt from San Diego 10 minutes prior to the noted time. Seems that there was something deposited on the lens that was 'read' as a character.

    But any biologic is going to use the path of least resistance, and the only way to correct errors is to raise the resistance such that other paths are more viable because they are easier!
