So who are the lucky 10,000 who're going to get collared by the false positives?
Smile! UK cops reckon they've ironed out gremlins with real-time facial recog
Police in the UK are preparing to reintroduce real-time facial recognition technology after a report found the latest versions of software used by law enforcement have improved accuracy and produce fewer false positives. The report [PDF] from the National Physical Laboratory found that when face-match thresholds in NeoFace were …
COMMENTS
Thursday 6th April 2023 17:20 GMT fidodogbreath
1 in 6000?
the chance of a false match is just 1 in 6,000 people who pass the camera
That's actually a pretty bad false match rate. A busy street in a retail district can average 5000 pedestrians per hour (or more) passing dozens of cameras. Seems like a lot of people can expect to have their day ruined by the polizei.
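A quick back-of-envelope on those figures (the 1-in-6,000 rate is the one quoted from the report; the camera count and operating hours below are illustrative assumptions, not anything reported):

```python
# Back-of-envelope: expected false matches from the figures in the comment above.
# The 1-in-6,000 rate and 5,000 pedestrians/hour come from the thread; the number
# of cameras and hours of operation are illustrative assumptions, not reported figures.

FALSE_MATCH_RATE = 1 / 6000      # per person passing a camera (per the article quote)
PEDESTRIANS_PER_HOUR = 5000      # busy retail street, per the comment
CAMERAS = 12                     # "dozens of cameras" -- assume a dozen for illustration
HOURS_PER_DAY = 10               # assumed operating window

passes_per_day = PEDESTRIANS_PER_HOUR * CAMERAS * HOURS_PER_DAY
expected_false_matches = passes_per_day * FALSE_MATCH_RATE

print(f"Camera passes per day:      {passes_per_day:,}")
print(f"Expected false matches/day: {expected_false_matches:.0f}")
# With these assumptions: 600,000 passes and roughly 100 false matches a day.
```

On those assumptions a busy district could generate alerts on the order of a hundred a day, which is the scale the replies below are arguing about.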
Thursday 6th April 2023 23:21 GMT Simian Surprise
Re: 1 in 6000?
OK, but think of it from the opposite perspective: you've got dozens of potential matches to known criminals a day (and I think you're low-balling it, even). There aren't going to be anywhere near enough cops to deal with all those reports. So now they have to triage whom to go after; they send an officer after an innocent look-alike, the "bad guy" (yes, I'm being very generous and assuming arguendo that this is to catch criminals) walks by 10 minutes later, and bang! All we've done is waste police time.
I struggle to think of a way in which this can go well, even from the cops' perspective.
Friday 7th April 2023 15:16 GMT Persona
Re: 1 in 6000?
so now they start having to triage whom to go after, they send an officer after an innocent look-alike
No. Triage means they look at the "matched" picture and say "don't think so" for 99 out of 100, spending about two seconds on each. For the lucky 1%, they get another officer who is good at recognizing faces to see if they agree. Only if they do does that turn into a chat with the officer.
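A sketch of what that triage workload might amount to, using the numbers floated in this thread (alert volume, dismissal rate and review times are all illustrative, not from the NPL report):

```python
# Rough triage workload, using the numbers floated in this thread:
# ~100 alerts a day, 99% dismissed in about 2 seconds each, the remaining 1%
# escalated to a second officer for a longer look. All figures are illustrative.

ALERTS_PER_DAY = 100        # "dozens of potential matches... a day" -- rounded up
FIRST_PASS_SECONDS = 2      # per the comment above
DISMISS_RATE = 0.99
SECOND_LOOK_SECONDS = 60    # assumed time for the "good at recognizing faces" officer

first_pass_time = ALERTS_PER_DAY * FIRST_PASS_SECONDS
escalated = ALERTS_PER_DAY * (1 - DISMISS_RATE)
second_pass_time = escalated * SECOND_LOOK_SECONDS

print(f"First-pass screening: {first_pass_time / 60:.1f} minutes/day")
print(f"Escalated alerts:     {escalated:.0f}/day")
print(f"Second-pass review:   {second_pass_time / 60:.1f} minutes/day")
# On these assumptions the desk-side triage takes minutes per day, not hours;
# the contested cost in this thread is the street stops that follow, not the screening.
```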
Friday 7th April 2023 02:11 GMT Falmari
Re: 1 in 6000?
What does 1 in 6000 mean? Is it 6000 different people who pass the camera, or just 6000 passes of the camera, meaning some of those will be multiple passes by the same people?
Because if it is 1 in 6000 passes, does that not mean there is a 1 in 6000 chance per pass that an individual will be falsely matched? Pass enough cameras and you are going to be flagged.
Also, what was the size of the image data set being searched? I have a sneaking suspicion that the larger the data set being searched, the higher the false positive rate will be.
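If the rate really is per pass, the chance of any one person being falsely flagged at least once compounds with the number of passes they make. A small illustration, treating the pass counts as arbitrary:

```python
# If the 1-in-6,000 figure is per camera pass, the chance that any one person is
# falsely flagged at least once grows with the number of passes they make.
# Pass counts below are arbitrary; the rate is the one quoted in the article.

FALSE_MATCH_RATE = 1 / 6000

for passes in (1, 100, 1000, 6000):
    p_flagged = 1 - (1 - FALSE_MATCH_RATE) ** passes
    print(f"{passes:>5} passes -> {p_flagged:.2%} chance of at least one false flag")
# e.g. a commuter passing a couple of cameras twice a day clocks up roughly 1,000
# passes a year, giving about a 15% chance of a false flag that year on this reading.
```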
Thursday 6th April 2023 20:49 GMT TheMaskedMan
1 in 6000 is appalling. Clearly Blackstone's ratio has no meaning these days.
One would think that the bobbies would be busy enough without adding dozens of false arrests every day - after all, there are doughnuts to eat, and twitter doesn't police itself, especially these days.
It would be interesting to know how many correct matches the thing picks up, on average, in a day, and how many faces it tries to identify.
It would also be interesting to know how plod responds to a face being recognised. Presumably they would send an officer to the suspect's location. Do they then request identification, or just nick em on the spot? And how would that work in relation to PACE?
Friday 7th April 2023 15:59 GMT Juillen 1
Where does this "arrest" come from? An "identify" means "flag to check, as this may be a person of interest". For some reason, people here seem to have the false idea that this flag means police will rush out and arrest the person directly, with no further checks.
First would likely be image check of "Person of Interest" via records (human verification that the match indeed looks like recorded photos of the person of interest).
Then, if resources allow, a quick ID check by sending a local beat patrol officer. I've been asked for ID (and had a stop and search of bags) a few times in my life, and each time has been short and sweet; I rather suspect that this evolution of facial recognition will cut down on the random searches a fair bit and skew towards people more likely (from historical records) to be a problem.
If the ID checks out as not the person suspected, or the officer has no reason to believe that the person is "of interest", then they go on their way after a couple of minutes' delay.
If everything in the chain (detection, first visual correlation, ID checks etc.) proves that the person is still "of interest", then things would progress exactly as if they had been visually ID'd on the street by a beat officer.
As far as PACE goes, it's (from what I can see) a way to target available police time towards more likely productive areas. Police funding has been massively cut back, resulting in a requirement to "do more with less". This is one of the ways that more can be done with less. Is it perfect? No. But there again, nothing is perfect, and insisting it must be is simply an "appeal to utopia" logical fallacy.
As with most things, the truth is rarely sensational; it's usually very mundane. Occam's razor would suggest that this doesn't override PACE or affect normal operation; it merely points out that there is a suspected person of interest in the area. Standard policy would apply from that point on.
Wednesday 12th April 2023 16:49 GMT Graham Cobb
It isn't 1 in 6000 of the times you/me/whoever passes a camera. It is 1 in 6000 of the people who pass a camera at all.
So, 1 in 6000 people will be stopped every time they go out because they "look a bit like" someone on a list?
The whole concept of using public surveillance in policing is abhorrent and appalling.
Thursday 6th April 2023 22:54 GMT Ian Mason
To put that 1 in 6000 false positive rate into perspective: if they used this at Stratford station in Newham, where they have deployed it previously, it would result in 54 false positives a day. Over fifty innocent people each day would be detained, questioned and have to prove that they weren't the person the police would be insisting they were. Not good enough.
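For anyone who wants to check the arithmetic behind that figure (the footfall it implies is the commenter's, not an official count):

```python
# Sanity check on the Stratford figure above. Working backwards from 54 false
# positives a day at a 1-in-6,000 false match rate gives the implied footfall;
# neither number is an official TfL/Network Rail count.

FALSE_MATCH_RATE = 1 / 6000
FALSE_POSITIVES_PER_DAY = 54     # the commenter's figure

implied_passes_per_day = FALSE_POSITIVES_PER_DAY / FALSE_MATCH_RATE
print(f"Implied camera passes per day at Stratford: {implied_passes_per_day:,.0f}")
# 324,000 passes a day -- plausible for one of the UK's busiest interchanges,
# which is presumably where the 54-a-day figure comes from.
```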
Friday 7th April 2023 16:04 GMT Juillen 1
Do you have evidence for the assertion? The usual process is "Take image from automated match, check against records using a human for verification, then dispatch".
That would mean you'd have at least one check in advance. By your current argument you could equally say "PACE is no good, because human stop and search based on an individual officer's recollection is imperfect, therefore (x) innocent individuals get stopped and searched."
Where do you get the idea that police would insist someone was guilty any more than if they were stopped and searched, or stopped because an officer's memory associated them with someone of interest?
Not good enough.
Friday 7th April 2023 16:07 GMT Juillen 1
Re: So
Incorrect. People are matched, checked, and then, if there is utility, an officer is dispatched to check again and determine the likelihood of the match being correct.
5,999 of 6,000 (or better) are not even of passing interest. Roughly 1 in 6,000 is looked at and checked. Far fewer than 1 in 6,000 are actually suspects. Lazy arguing.
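One way to make that funnel concrete is the base-rate arithmetic: even with the roughly 80 per cent true positive identification rate mentioned further down the thread, the share of alerts that are genuine depends heavily on how rare watchlisted faces are among passers-by. The prevalence values below are purely illustrative:

```python
# Base-rate sketch: what fraction of alerts are genuine matches?
# TPIR ~0.8 is the ballpark mentioned elsewhere in this thread (from the NPL report);
# the 1-in-6,000 false match rate is from the article; the prevalence values
# (how many passers-by are actually on the watchlist) are purely illustrative.

TPIR = 0.80
FALSE_MATCH_RATE = 1 / 6000

for prevalence in (1 / 1000, 1 / 10_000, 1 / 100_000):
    true_alerts = prevalence * TPIR
    false_alerts = (1 - prevalence) * FALSE_MATCH_RATE
    ppv = true_alerts / (true_alerts + false_alerts)
    print(f"1 watchlisted face per {1 / prevalence:>7,.0f} passers-by -> "
          f"{ppv:.1%} of alerts are genuine")
# At 1 in 1,000 most alerts are genuine (~83%); at 1 in 100,000 it is under 5% --
# which is why the human check before anyone is dispatched matters so much.
```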
Friday 7th April 2023 13:46 GMT disgruntled yank
Wonderful
For those who have NY Times subscriptions, or can find a way around the paywall, I recommend https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html . The man discussed in the article was arrested in Atlanta for theft in New Orleans, though he had never been there. He was released after six days in jail, but only because a lawyer retained by his family was able to demonstrate to the New Orleans police that the automatic match was not really a good one.
Friday 7th April 2023 14:20 GMT Anonymous Coward
I can imagine
cops being all for this. I've had relatives in the Police. But it feels like an assumption as offensive as any you could make. I feel like some kind of bigot for thinking it, because this kind of surveillance is so obviously evil.
And I suppose I'm worn down. It's as futile arguing this stuff as trying to debate religion with the Jehovahs at the door.
Monday 10th April 2023 15:11 GMT D Moss Esq
Template ageing
The National Physical Laboratory's Dr Tony Mansfield is careful to warn in his report that "images were collected from Cohort subjects over one or two days", see para.9.6.
As he says, "TPIR rates for facial recognition against a recent photograph are likely to be better than TPIR against historic photographs".
And how! In his 2003 report, 20 years ago, Dr Mansfield warned that once images in the gallery are more than about two months old, facial recognition biometrics technology becomes pretty well useless, see para.52(c). Two months after the NPL's test using fresh images, those nice big True Positive Identity Rates around the 80% mark are likely to collapse.
As the watchlists get bigger and the images get older, the returns are likely to diminish, quickly, to the point where the live facial recognition exercise becomes pointless.
We may wish that the police didn't spend money pointlessly but it's not guaranteed. Let's hope that in this case they will can the operation.
The police are more likely to can the operation if critics would stop complaining so much about privacy issues. As the biometrics technology doesn't work, there aren't really any privacy issues. It's likely to be more effective to point out that the exercise wastes our money and makes the police look like the silly dupes of the biometrics companies peddling flaky gadgets.
"Smile! UK cops reckon they've ironed out gremlins with real-time facial recog"? Goodness knows where that headline came from. Certainly not from the NPL report.