Detroit Police make second wrongful facial-recog arrest when another man is misidentified by software
A man was charged after he was mistakenly identified as a thief by facial-recognition software used by the Detroit Police Department. Michael Oliver, a resident of the Motor City, was arrested last year for a crime he didn't commit. He was accused of reaching into a car window to snatch and destroy a cellphone. But the images …
COMMENTS
-
Monday 13th July 2020 15:59 GMT WolfFan
Re: It seems that
Their software puts everyone into one of two categories: Han Chinese and foreign devils. Foreign devils are automatically guilty. Han Chinese are further put into two categories: Party members and imperialist stooges. Imperialist stooges are automatically guilty. Party members are graded by rank in the Party and their connections. The lower the rank or the worse the connections, the more likely the target is to be guilty.
And no, it doesn’t matter what the alleged offense might be.
-
Monday 13th July 2020 16:27 GMT Cynic_999
Confirmation bias
The police have latched onto this because they desperately *want* it to work. It is a wet dream for law enforcement - mass surveillance without needing any manpower at all. So confirmation bias means they will ignore any evidence suggesting it might not do everything it claims to be able to do.
Same reason the military bought thousands of cheap fake mine detectors (though in that case a false negative has consequences that are a tad more dire).
I suspect the next wundermaschine will be a device that can detect guilt or innocence, allowing us to dispense with costly and time-consuming trials.
At which point I will invent a device that causes a person to believe that they have spent the last X years in prison - allowing instant punishment without the expense of any physical prisons.
Then random people can be pulled in on the say-so of fake facial recognition, given a fake instant "guilty" verdict, given a fake punishment and then released all in the same day. The justice department can declare that we have a 100% clear-up rate for all crimes while saving huge amounts of money. Everyone's satisfied.
-
Monday 13th July 2020 17:38 GMT Doctor Syntax
Re: Confirmation bias
"The police have latched onto this because they desperately *want* it to work."
It's more universal than that. It just rides on the back of the bewitching effect of anything that displays numbers. People just believe the numbers without giving any thought to how the numbers get there.
I first realised this effect when digital balances arrived. I had a couple of OU students who questioned the use of the mechanical balance in the original (S100) Science Foundation Course kit: why bother with that when they had nice digital balances in the school where they taught? My reaction to getting a digital balance in the lab was to get some known good weights to check it.
-
Monday 13th July 2020 18:05 GMT Tom 7
Re: Confirmation bias
I remember (mid '60s?) when a geology prof friend of my Dad's came round with a new toy. It was a gravimeter. It could tell the difference in gravity between a chair seat and the floor! I recently saw a set of scales that could do the same and it wasn't that expensive. I wonder if your digital balance would argue with the weights and not be wrong.
-
Tuesday 14th July 2020 16:28 GMT Doctor Syntax
Re: Confirmation bias
"I recently saw a set of scales that could do the same and it wasn't that expensive."
We had a lab balance that could tell when you were leaning over it. But that might have been more to do with being on the first floor of the nasty pre-fab building.
But you reinforce my point. A balance should tell you about mass. The mass doesn't change when moved from the bench to the floor even if the force on it does. A real balance compares one mass with a standard mass, so the gravitational field effect doesn't matter; it's the same for both masses.
BTW our building was next door to the local weights and measures dept. so I borrowed their standard weights.
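That cancellation is easy to demonstrate numerically. A minimal sketch (Python, with made-up gravity values) contrasting a spring scale, which reads force and so shifts with local g, against a beam balance, which compares two masses so that g drops out:

# A spring scale reads force (weight), so its reading depends on local g.
# A beam balance compares the sample against a standard mass, so g cancels.

G_GROUND_FLOOR = 9.81000  # illustrative local gravity, m/s^2
G_FIRST_FLOOR = 9.80998   # very slightly weaker a few metres up

sample_kg = 0.100    # the mass being weighed
standard_kg = 0.100  # a known-good standard weight

# Spring scale: the reading is proportional to m * g, so it changes with floor.
for g in (G_GROUND_FLOOR, G_FIRST_FLOOR):
    print(f"scale reads {sample_kg * g:.7f} N at g = {g}")

# Beam balance: it balances when sample_kg * g == standard_kg * g,
# i.e. when the masses are equal -- the same g multiplies both sides.
for g in (G_GROUND_FLOOR, G_FIRST_FLOOR):
    print(f"balance agrees with standard at g = {g}: {sample_kg * g == standard_kg * g}")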
-
Tuesday 14th July 2020 10:57 GMT Why Not?
Agree. I'm not a face-recognition expert, but from the photos in the linked stories the forehead, cheekbones, lips and most definitely the chin are, to me, visibly different.
The output from the system, the "Investigative lead report", names a digital image examiner, who should be trained in recognising such discrepancies and should have withheld the report until they could be certain it was a 99% match, or at least explained why the match was made if the faces were so different.
It may be that the photos show the arrested person as a younger chap without tattoos, but this should be part of the scoring.
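As a sketch of the kind of gate being described (the field names, sign-off rule and 99% threshold are illustrative assumptions here, not how the actual DataWorks system operates):

# Hypothetical review gate: a lead is only released when the match score
# clears a threshold AND a named examiner has signed off, or when the
# examiner has explained a weaker match. Everything here is illustrative.

from dataclasses import dataclass
from typing import Optional

MATCH_THRESHOLD = 0.99  # the "99% match" bar suggested above

@dataclass
class Lead:
    suspect_id: str
    match_score: float              # similarity score from the recognition system
    examiner: Optional[str] = None  # digital image examiner who reviewed it
    notes: str = ""                 # required explanation for borderline matches

def release_report(lead: Lead) -> bool:
    """Withhold the lead unless it is reviewed and either strong or explained."""
    if lead.examiner is None:
        return False  # no human review, no report
    if lead.match_score >= MATCH_THRESHOLD:
        return True
    return bool(lead.notes)  # weak match: only release with an explanation

print(release_report(Lead("A123", 0.72)))              # False: never reviewed
print(release_report(Lead("A123", 0.72, "J. Smith")))  # False: weak and unexplained
print(release_report(Lead("A123", 0.72, "J. Smith",
                          "older photo; tattoos post-date image")))  # True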
-
Monday 13th July 2020 23:01 GMT doublelayer
Re: The real news
That's not true. It happened. It really did. It was nine months ago in a private test somewhere in Europe. And they detected this guy entirely correctly. Well, he wasn't the guy they were looking for, but he was an identical twin with that guy, almost. I mean we put this guy in a lineup, brought in some people, and asked them to look at a picture and point out which of the people in the line was that guy. Everyone pointed at him except for a few of them, but those people didn't select anybody so they don't count.
-
Tuesday 14th July 2020 01:00 GMT sanmigueelbeer
A judge dismissed the case after prosecutors were convinced that there had been a misidentification
Wait, what? It took no less than a judge to dismiss the case, vs the police dropping the case and letting the poor fella go?
Can someone please explain why it wasted so many hours/days? I mean, look at Exhibit A (picture of the culprit). Look at Exhibit B (picture of the poor fella). Big difference. Not the guy you are looking for. Let him go with an "apology".
-
Tuesday 14th July 2020 03:04 GMT martinusher
Fella?
To give the police some credit, the alleged offense took place a year ago, so there was plenty of time to get tats and the like.
On the other hand, the police know how unreliable eye-witness identification is, so they shouldn't expect any better from software. You just can't bust someone on a single witness statement (although it won't stop people from trying); you have to have other evidence (and 'being black' isn't evidence.....). To cap it all, they obviously used the threat of a felony conviction to try to force the fellow to plead guilty to a misdemeanor, which pretty much tells us that they had a weak to non-existent case. (Depending on where he lived he'd have to stump up money to a bail bondsman to stay out of jail while his case is being processed -- that's tantamount to jailing or fining a person up front.)
Somewhere in the DA's office a head needs to roll.....
-
Tuesday 14th July 2020 09:37 GMT Joe Montana
Re: Fella?
It's not so much the software that's at fault as officers trusting its results blindly. The software is a tool, and all it can do is reduce the number of photos you need to check manually. You still need to do the actual detective work.
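That division of labour is the whole point. A minimal sketch (Python, with a toy stand-in for the real similarity model and made-up names) of recognition used correctly, as a shortlist generator rather than a verdict:

# Sketch of facial recognition used as intended: the system narrows a large
# gallery to a ranked shortlist of *candidates* for a human investigator to
# check. similarity() is a toy stand-in for a real face-embedding model.

def similarity(probe: str, candidate: str) -> float:
    """Toy proxy returning 0..1; a real system would compare face embeddings."""
    shared = set(probe) & set(candidate)
    return len(shared) / max(len(set(probe) | set(candidate)), 1)

def shortlist(probe, gallery, top_n=5):
    scored = sorted(((p, similarity(probe, p)) for p in gallery),
                    key=lambda pair: pair[1], reverse=True)
    return scored[:top_n]  # leads to investigate, NOT a determination of guilt

gallery = ["michael_o", "marcus_b", "michelle_k", "martin_h", "joe_m"]
for person, score in shortlist("michael_x", gallery, top_n=3):
    print(f"candidate for human review: {person} (score {score:.2f})")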
This guy needs to sue for wrongful arrest. If the costs start stacking up they will have an incentive to improve officer training and deal with incompetent/lazy officers. If you don't hit them in the budget, nothing will change.
-
Tuesday 14th July 2020 05:23 GMT eldakka
Wait, what? It took no less than a judge to dismiss the case, vs the police dropping the case and letting the poor fella go?
That was pretty much my reaction too.
I can sorta (though I still don't think it's acceptable) imagine an arrest warrant being drawn up and auto-filled, as it were.
But surely between the actual arrest and the charges being formally filed, someone would have cast a human eye over the evidence and released the wrongfully accused suspect without charge?
-
Tuesday 14th July 2020 08:18 GMT Scott Broukell
You have been found guilty of causing our shiny new facial recognition system to malfunction by means of your own deliberate actions, posing as another individual and causing a misidentification event to occur, with the result that a potential miscarriage of justice may, or will, have occurred and valuable police time and court facilities have been wasted. It is therefore the court's final decision that you will need to be terminated forthwith in order to prevent further such errors occurring in the future. Next ....
-
Tuesday 14th July 2020 09:08 GMT TheRealRon
I am a fan of The Register's excellent reporting on these issues. But I don't understand why this report did not mention that Michael Oliver is a black man.
The Register has done great work to call out the mis-use of technology that promotes racist behaviour; I don't understand why in this case they have not included the key details needed for a reader to understand that this story relates to racism. Failing to call out racism has the same result as promoting racism.
-
Tuesday 14th July 2020 09:56 GMT Danny 2
I worked in Burr-Brown's Digital Signal Processing dept in the '80s when they thought they'd cracked facial recognition. It wasn't a product, just a test of the hardware, but they were chuffed. They'd only tested it on themselves though - a scrawny wee white guy with a beard, a scrawny wee white guy without a beard, a big blond viking guy with a beard, a James Bond type, a couple of similarly disparate guys.
They tried it on a blonde female employee and it identified her as the viking guy. I was glad it failed.
Apparently the latest Chinese system can identify people wearing facemasks. And all Chinese people look alike to me. (Not racist. I knew a teacher who was one of two white teachers in an all-black Caribbean school. She was short, dumpy and blonde; the other white teacher was a tall, skinny brunette, but the kids always mistook them for each other. We aren't as observant as we think we are.)
-
Tuesday 14th July 2020 10:35 GMT teebie
"It released statistics that showed average scores were actually higher compared to last year's, and that the distribution of grades was similar to last year's results too."
That's what would happen if that's what the software was trained to do. It doesn't mean the right grades were given to the right people.
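The point can be made concrete with a toy example (made-up students and grades): two gradings can have identical distributions while every individual grade is wrong.

from collections import Counter

# The grades these (made-up) students actually deserved.
true_grades = {"ann": 7, "bob": 6, "cho": 5, "dee": 4}

# Grades the model handed out: the same multiset, rotated one student along.
model_grades = {"ann": 6, "bob": 5, "cho": 4, "dee": 7}

# The distributions match exactly...
print(Counter(true_grades.values()) == Counter(model_grades.values()))  # True

# ...yet not a single student received the right grade.
correct = sum(model_grades[s] == true_grades[s] for s in true_grades)
print(f"{correct} of {len(true_grades)} individual grades correct")  # 0 of 4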
-
Tuesday 14th July 2020 14:08 GMT Anonymous Coward
> IB said the software takes into account test scores from previous exams, but has provided little transparency into how it actually works. It released statistics that showed average scores were actually higher compared to last year's, and that the distribution of grades was similar to last year's results too.
This kind of system cries out for open-source development. The implementation needs to be open and transparent, so that everybody can see and understand how it works. This precludes any kind of so-called AI, because that by definition is a black box of uncertainty.