Detroit Police make second wrongful facial-recog arrest when another man is misidentified by software

A man was charged after he was mistakenly identified as a thief by facial-recognition software used by the Detroit Police Department. Michael Oliver, a resident of the Motor City, was arrested last year for a crime he didn't commit. He was accused of reaching into a car window to snatch and destroy a cellphone. But the images …

  1. Nameless Dread
    Facepalm

    Fish? Scale?

    1. John Brown (no body) Silver badge

      That might be a red herring!

  2. Anonymous Coward
    Anonymous Coward

    It seems that

    only China is good at reliable facial recognition, so why don't we ask them how to do it properly?

    1. WolfFan Silver badge

      Re: It seems that

      Their software puts everyone into one of two categories: Han Chinese and foreign devils. Foreign devils are automatically guilty. Han Chinese are further put into two categories: Party members and imperialist stooges. Imperialist stooges are automatically guilty. Party members are graded by rank in the party and their connections. The lower the rank or the worse the connections, the more likely to be guilty the target is.

      And no, it doesn’t matter what the alleged offense might be.

      1. Anonymous Coward
        Anonymous Coward

        Re: It seems that

        Isn't there a third category called Uyghurs that gets shipped to "re-education" camps as soon as they're detected anywhere?

        Sorry, can't keep up. That bastion of "freedom" called the US is presently making enough of a mess to keep me busy :(.

  3. Cynic_999

    Confirmation bias

    The police have latched onto this because they desperately *want* it to work. It is a wet dream for law enforcement - mass surveillance without needing any manpower at all. So confirmation bias means that they will ignore all evidence that suggests that it might not do everything that it claims to be able to do.

    Same reason the military bought thousands of cheap fake mine detectors (though in that case a false negative has consequences that are a tad more dire).

    I suspect the next wundermaschine will be a device that can detect guilt or innocence, allowing us to dispense with costly and time-consuming trials.

    At which point I will invent a device that causes a person to believe that they have spent the last X years in prison - allowing instant punishment without the expense of any physical prisons.

    Then random people can be pulled in on the say-so of fake facial recognition, given a fake instant "guilty" verdict, given a fake punishment and then released all in the same day. The justice department can declare that we have a 100% clear-up rate for all crimes while saving huge amounts of money. Everyone's satisfied.

    1. Doctor Syntax Silver badge

      Re: Confirmation bias

      "The police have latched onto this because they desperately *want* it to work."

      It's more universal than that. It just rides on the back of the bewitching effect of anything that displays numbers. People just believe the numbers without giving any thought to how the numbers get there.

      I first realised this effect when digital balances arrived. I had a couple of OU students who questioned the use of the mechanical balance in the original (S100) Science Foundation Course kit; why bother with that when they had nice digital balances in the school where they taught. My reaction to getting a digital balance in the lab was to get some known good weights to check it.

      1. Tom 7

        Re: Confirmation bias

        I remember (mid 60s?) when a geology prof friend of my Dad's came round with a new toy. It was a gravimeter. It could tell the difference in gravity between a chair seat and the floor! I recently saw a set of scales that could do the same and it wasn't that expensive. I wonder if your digital balance would argue with the weights and not be wrong.

        1. Doctor Syntax Silver badge

          Re: Confirmation bias

          "I recently saw a set of scales that could do the same and it wasn't that expensive."

          We had a lab balance that could tell when you were leaning over it. But that might have been more to do with being on the first floor of the nasty pre-fab building.

          But you reinforce my point. A balance should tell you about mass. The mass doesn't change when it moves from the bench to the floor even if the force on it does. A real balance compares one mass with a standard mass so that gravitational field effect doesn't matter; it's the same for both masses.

          BTW our building was next door to the local weights and measures dept. so I borrowed their standard weights.

    2. KBeee

      Re: Confirmation bias

      I blame it on TV. Shows such as CSI, NCIS and Criminal Minds present magic as if it's real technology, sowing the seeds of the belief that infallible tech can solve any problem.

  4. cornetman Silver badge
    Facepalm

    This is a pretty idiotic case.

    Regardless of the fact that one had tattoos and the other didn't, just a cursory glance at the two photos makes it obvious that they are two different people.

    1. Why Not?

      Agreed. I'm not a face-recognition expert, but from the linked stories' photos the forehead, cheekbones, lips and most definitely the chin are, to me, visibly different.

      The system's output, the "Investigative lead report", names a digital image examiner who should be trained in recognising such discrepancies and should have withheld the report until they could be certain it was a 99% match, or at least explained why the match was made if the faces were so different.

      It may be that the photos are of the arrested person as a younger chap without tattoos, but this should be part of the scoring.

  5. Anonymous Coward
    Anonymous Coward

    Easier to make everyone guilty until proven innocent. That's where we are heading...

    1. Youngone Silver badge

      Who is this we? It sounds like parts of the US might be, but not where I live.

      1. Doctor Syntax Silver badge

        I don't think US govts have sole possession of that objective.

  6. Anonymous Coward
    Big Brother

    The real news

    The real news would be the times facial recognition actually worked correctly.

    That would be something the police and the software company would trumpet.

    The silence tells me it has never happened.

    1. doublelayer Silver badge

      Re: The real news

      That's not true. It happened. It really did. It was nine months ago in a private test somewhere in Europe. And they detected this guy entirely correctly. Well, he wasn't the guy they were looking for, but he was an identical twin with that guy, almost. I mean we put this guy in a lineup, brought in some people, and asked them to look at a picture and point out which of the people in the line was that guy. Everyone pointed at him except for a few of them, but those people didn't select anybody so they don't count.

  7. sanmigueelbeer
    Thumb Down

    A judge dismissed the case after prosecutors were convinced that there had been a misidentification

    Wait, what? It took no less than a judge to dismiss the case, vs the police dropping the case and letting the poor fella go?

    Can someone please explain why this wasted so many hours/days? I mean, look at Exhibit A (picture of the culprit). Look at Exhibit B (picture of the poor fella). Big difference. Not the guy you are looking for. Let him go with an "apology".

    1. martinusher Silver badge

      Fella?

      To give the police some credit the alleged offense took place a year ago and so there was plenty of time to get tats and the like.

      On the other hand, the police know how unreliable eye-witness identification is, so they shouldn't expect any better from software. You just can't bust someone on a single witness statement (although it won't stop people from trying); you have to have other evidence (and 'being black' isn't evidence.....). To cap it all they obviously used the threat of a felony conviction to try to force the fellow to plead guilty to a misdemeanor, which pretty much tells us that they had a weak to non-existent case. (Depending on where he lived he'd have to stump up money to a bail bondsman to stay out of jail while his case is being processed -- that's tantamount to jailing or fining a person up front.)

      Somewhere in the DA's office a head needs to roll.....

      1. Joe Montana

        Re: Fella?

        It's not so much the software that's at fault, what's at fault is officers trusting its results blindly. The software is a tool, and all it can do is reduce the number of photos that you need to check manually. You still need to do the actual detective work.

        This guy needs to sue for wrongful arrest. If the costs start stacking up they will have an incentive to improve officer training and deal with incompetent/lazy officers. If you don't hit them in the budget, nothing will change.

    2. eldakka

      Wait, what? It took no less than a judge to dismiss the case, vs the police dropping the case and letting the poor fella go?

      That was pretty much my reaction too.

      I can sorta (though I don't think it's acceptable still) imagine an arrest warrant being drawn up and auto-filled as it were.

      But surely between the actual arrest and before charges were formally filed, someone would have thrown a human eye over the evidence and released the wrongfully accused suspect without charge?

  8. Winkypop Silver badge
    Big Brother

    Arrest everyone

    Only let them go if nothing will stick

  9. Scott Broukell

    You have been found guilty of causing our shiny new facial recognition system to malfunction by means of your own deliberate actions posing as another individual and causing a misidentification event to occur with the result that a potential miscarriage of justice may, or will, have occurred and valuable police time and court facilities have been wasted. It is therefore the courts final decision that you will need to be terminated forthwith in order to prevent further such errors occurring in the future. Next ....

    1. onemark03

      Second wrongful facial-recog arrest

      Americans have an implicit faith in technology - apparently whether it works or not.

      Look at the intelligence failures of the CIA due to a failure of technology and their failure to rely on "humint".

      1. Uncle Slacky Silver badge

        Re: Second wrongful facial-recog arrest

        They also seem to think that lie detector machines actually work.

  10. TheRealRon

    I am a fan of The Register's excellent reporting on these issues. But I don't understand why this report did not mention that Michael Oliver is a black man.

    The Register has done great work calling out the misuse of technology that promotes racist behaviour; I don't understand why in this case they have not included the key details needed for a reader to understand that this story relates to racism. Failing to call out racism has the same result as promoting racism.

    1. Joe Montana

      No, because racism is not the issue here.

      Faulty software and incompetent/lazy cops are the issue at hand.

      Trying to blame racism when there is no evidence of that is creating unnecessary divisions in society, and diverting attention away from the actual issue being raised.

      1. Intractable Potsherd

        All good points, @Joe, but there is a bias in facial recognition systems, widely reported here, that works to disadvantage people with darker skin. I'm reluctant to call that racism, but skin colour is relevant to the story here.

        1. Doctor Syntax Silver badge

          "but skin colour is relevant to the story here"

          Only in so far as it affects poor software.

          1. Intractable Potsherd

            That's my point.

  11. Danny 2

    I worked in Burr Brown's Digital Signal Processing dept in the '80s when they thought they'd cracked facial recognition. It wasn't a product, just a test of the hardware, but they were chuffed. They'd only tested it on themselves though - a scrawny wee white guy with a beard, a scrawny wee white guy without a beard, a big blond viking guy with a beard, a James Bond type, a couple of similarly disparate guys.

    They tried it on a blonde female employee and it identified her as the viking guy. I was glad it failed.

    Apparently the latest Chinese system can identify people wearing facemasks. And all Chinese people look alike to me. (Not racist. I knew a teacher who was one of two white teachers in an all-black Caribbean school. She was short, dumpy and blonde; the other white teacher was a tall, skinny brunette, but the kids always mistook them for each other. We aren't as observant as we think we are.)

  12. teebie

    "It released statistics that showed average scores were actually higher compared to last year's, and that the distribution of grades was similar to last year's results too."

    That's what would happen if that's what the software was trained to do. It doesn't mean the right grades were given to the right people.

  13. Anonymous Coward
    Anonymous Coward

    > IB said the software takes into account test scores from previous exams, but has provided little transparency into how it actually works. It released statistics that showed average scores were actually higher compared to last year's, and that the distribution of grades was similar to last year's results too.

    This kind of system cries out for open-source development. The implementation needs to be open and transparent, so that everybody can see and understand how it works. This precludes any kind of so-called AI, because that by definition is a black box of uncertainty.

    1. Doctor Syntax Silver badge

      "that by definition is a black box of uncertainty"

      Are you allowed to say that these days?

  14. Scroticus Canis
    Unhappy

    So next time the tuna will be guilty and the Detroit guy ends up in the sushi?

    Sigh! I used to like this planet. Not so much lately.
