Hands up who misread that as 'farcial-recognition technology', as I just did :)
Cops in Detroit have admitted using facial-recognition technology that fails to accurately identify potential suspects a whopping 96 per cent of the time. The revelation was made by the American police force's chief James Craig during a public hearing this week. Craig was grilled over the wrongful arrest of Robert Williams, …
Having posted a link to the Ars Technica article discussing this back on 30 June (in a comment on NeoFace), this is old news to me, but I will repost my concluding comments (modified using Pink Duck's brilliant Spoonerism).
Farcial Recognition is, always has been, and always will be fraught with error.
Farcial Recognition is, always has been, and always will be an invasion of privacy.
The reason your accuracy is so high is because: a.) the human brain is hardwired to do this and b.) your training sample set (people you know) is orders of magnitude smaller than a useful police database. Also you tend to constantly refresh your internal training data when you interact with the people you know!
And if you misidentify me, I'm probably not going to automatically go to prison. Application matters just as much as technology and capabilities.
your training sample set (people you know) is orders of magnitude smaller than a useful police database
This is what people miss. Facial recognition will require a 0% false match rate before it can be usable with a huge country-wide database (or even a city-wide one in a place the size of London) if you want to match a face to that whole database.
The way most people are exposed to facial recognition is "matching (potentially) many faces to one", i.e. unlocking their phone. Apple claims a 1 in 50,000 false match rate or something like that, so if 50,000 people try to unlock my phone, one person might falsely match the data it has stored about my face and unlock for them. Since thousands of people aren't trying to unlock my phone I'm quite comfortable with that. If someone was able to unlock my phone with their face, I don't get arrested and potentially imprisoned for a crime I didn't commit. Worst case they can send a few texts pretending to be me and piss friends off until I can explain.
Using it to match a suspect's face to a broad database is doing it the other way around, and that 1 in 50,000 that sounds great for unlocking a phone is a real problem if the database is something like "everyone who has ever been booked in London", or "everyone who has ever been booked in the entire USA". Unless you were able to greatly limit the scope (i.e. if you were able to subpoena cell phone tower data and limit the list to 68 people who were in the area at the time) it will generate many false matches. A smart cop would know to ignore it when it generates so many false matches. A crooked cop would use it as an opportunity to look through the list of false matches, find someone with a long record, decide "he probably did it", so they can close the case.
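The arithmetic behind this point can be sketched in a few lines. Everything here is illustrative: the 1-in-50,000 rate is the commenter's figure, and the database sizes are made-up round numbers, not real booking-database counts.

```python
# One-to-one matching (phone unlock) vs one-to-many search (database trawl):
# the same per-comparison false match rate behaves very differently.
# The 1-in-50,000 rate and the database sizes below are illustrative.

def expected_false_matches(database_size, false_match_rate=1 / 50_000):
    """Expected number of innocent people flagged when one probe face
    is compared against every entry in the database."""
    return database_size * false_match_rate

# One-to-one: a phone comparing one probe face to one enrolled face.
print(expected_false_matches(1))            # tiny — negligible risk

# One-to-many: one suspect photo against a city-sized booking database.
print(expected_false_matches(3_000_000))    # dozens of false matches

# Greatly limiting the scope (e.g. 68 people placed nearby by
# cell-tower data) brings the expected false matches back near zero.
print(expected_false_matches(68))
```

Same rate, three very different outcomes: the risk scales linearly with how many comparisons you run.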
If someone was able to unlock my phone with their face, I don't get arrested and potentially imprisoned for a crime I didn't commit.
Careful now... that's a naive statement in an otherwise spot-on post. If someone truly malicious manages to unlock your phone (which is registered nearly everywhere as yours) all they need to do is a few carefully crafted searches / posts to put you in prison. Good luck claiming you weren't the one doing that at trial when the judge / jury* is presented with that 50,000:1 chance that the phone unlocked for anyone else.
This is just a long way of saying it's a bad idea for governments to assume a digital device is, for all intents and purposes, the same thing as the individual. Of course they do it anyway, and what the digital device does is permanently added to the individual's (mostly hidden) lifelong record. Very bad spot to be in, and one reason I detest mobile phones.
* If present. They may not be depending on what was actually posted!
OK granted, but the odds of someone BOTH beating the 1 in 50,000 odds to falsely match your face AND having evil intent to frame you for a crime you didn't commit are really really long. Like "I should buy a lottery ticket" long.
Someone with evil intent would be better off trying to compromise your email, use SIM swap fraud to steal your phone number, shoulder surf your phone's password (easier to do when we're wearing masks in public and can't rely on face matching) or use some type of weakness in the face matching to fool it with a fake face rather than just hoping their face is lucky enough to work.
So unless I have an evil twin I'm not going to lose sleep over someone matching my face and framing me as a terrorist.
Whilst the chances of it happening are small, the impact on you would be huge. Risk analysis is based on both factors.
For example, if something is likely to happen, but the impact is low (such as a <1kg meteor entering the atmosphere) it can be considered a lower risk than something that is unlikely to happen, but the impact is huge (such as a >1 tonne meteor entering the atmosphere).
We don't lose sleep over the latter (rightly so) but it *is* still a risk.
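The risk-analysis point above reduces to a one-line formula: risk weighs likelihood and impact together. A toy sketch, with entirely made-up numbers for the meteor example:

```python
# Toy illustration of risk = likelihood x impact: a rare-but-catastrophic
# event can outrank a common-but-trivial one. All numbers are invented
# purely for illustration.

def risk_score(probability, impact):
    """Simple expected-loss style risk score."""
    return probability * impact

small_meteor = risk_score(probability=0.9, impact=1)             # likely, harmless
large_meteor = risk_score(probability=0.0001, impact=1_000_000)  # rare, huge impact

print(small_meteor)   # small number
print(large_meteor)   # much larger, despite the long odds
```

The same logic applies to the face-match scenario: tiny probability, but a prison-sized impact, so the risk is not zero.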
My brothers in law are identical twins (so the same facial structure). The clean-shaven one demonstrated the facial unlock feature. When his bearded brother picked it up a few seconds later it instantly unlocked. It clearly does not know that it takes time to grow a beard.
- or do phones unlock with false beards?
"Also you tend to constantly refresh your internal training data when you interact with the people you know"
Yeah, right. I had a beautiful example of that not quite being so. I was once a member of a volunteer group. Call it Group A. I was usually present there with SWMBO. There were two other members of Group A who were also members of Group B. I looked a lot like one of these two other members, so much so in fact, that the other guy completely mistook me for his colleague largely, I think, because on that one occasion I didn't have my other half in tow.
I also know for a fact that there are at least a further two people who look very convincingly like me.
So no, the Mk 1 eyeball is not reliable even with an acquaintance set.
I remember seeing something about people who were so good at it they were able to recognize elderly people based on a photo taken when they were toddlers.
Some talents extraordinary people have like remembering everything that happens to them to the smallest detail for recall years later, or multiplying two huge numbers in their head in an instant, are very easy for computers to replicate. Others like the above are something we may never be able to match no matter how "smart" our software gets.
murdering psychopathic corpse humping gang bangers
I believe this may present an opportunity. Even in areas with large gang problems, I don't believe that corpses are all that readily available. And surely there's got to be a certain level of lack of attractiveness / freshness in a given corpse past which even the worst PCHGB would recoil.
Therefore all we need is a government honeytrap system of public corpse donation and trapping - and we soon ought to be able to catch the lot of them.
OK there's going to be costs in police overtime for staking-out the trap sites, but that outlay ought to be able to justify itself in savings of police time within barely a few months...
That in itself would create opportunities for monetisation, via selling the footage to some of the less salubrious TV channels.
[Thinks: Should I have posted this anonymously?]
@Nathar Leichoz: What, so as long as you're not part of that collateral damage, you think 96% error rate ok?
Especially in a country where wrongful arrest can leave the arrested dead before even reaching the police station.
Were you born a moronic sociopath, or did it develop in you over time?
Well, according to the fount of all knowledge (Wikipedia), Detroit's population was 10.6% white in 2010. Given your estimation of accuracy on minorities, then 4% total accuracy implies almost 38% accuracy for whites. I'm guessing either your estimate is high, or more likely, the 4% estimate of total accuracy is high.
Just what did this farcical effort cost the good folk of Detroit? Whatever it was, it's certainly not good value for money - all very similar to our pig-ignorant legal folk here in the UK, who seem intent on rolling out this unreliable invention of the devil that also boasts high failure rates. Is it the same kit, I wonder? It's a bad joke, an infringement on privacy and should be banned forthwith.
So facial recognition does work, just not the way they expected. With 96% failure rate it is great at identifying people who should not be arrested. (Joke).
So what next, tarot cards? I am sure they can beat 96% accuracy. Perhaps we can turn police investigation into a role-playing game and just roll dice.
Seriously at 96% the use of facial recognition must be a hindrance to the police.
It's like the sniffer dog, you bring the dog to a car that you want to search, give it a 'tell' so it reacts, and you have an excuse to stop and search because the dog reacted.
A system that flags an insane number of false-positive face matches does exactly the same. It works as they want it to work. Generating cover for random stop and search.
Saying you 'do not rely on facial recognition software only' is bullshit. There is no way they are checking every false positive on a system that bad. They are using it to provide the legal excuse to do a 96% random search. And since they do not rely on it, it is 100% effective at that excuse.
Very much hits the nail.
Facial recognition is "just good enough" that it can be used as a smokescreen for a police state.
It basically says 96% of us are suspected ne'er-do-wells. The 4% being those who fall into the statistical category of "highly affluent white folk who might be in a position to sue the police if falsely accused".
The 96% are then subject to the individual police officer's prejudices and ulterior motives, and if you happen to be in a category who probably doesn't have the resources to defend themselves adequately, then you can be convicted of whatever the hell they want.
We are all guilty of something.
It is either down to
1) we have not been caught (yet)
2) they haven't invented a crime for what we did. When they do, expect them to come down on us like a ton of bricks.
There has to be a constant stream of new laws for us to break otherwise all those Lawyers and their brethren who go on to become Lawmakers (aka Politicians) would soon be out of a job.
Beware the future when even getting out of bed in the morning will require 1) a conversation with your lawyer and 2) a 300 page risk assessment. Tripping over a duvet can potentially be fatal and your people will need to know who to sue for damages if you did just that.
Yes, it is Monday morning and I'm feeling particularly grumpy today.
And yet you continue to use it.
Would you continue to use a gun that misfired 96% of the time ?
Would you continue to use a car that didn't start 96% of the time ?
Would you continue to use a phone that couldn't make a call 96% of the time ?
What this guy is saying is that the cops poured a truckload of money into that piece of shit software, and they will use it until it works and the innocent be damned.
This is what you get when you combine a monumental IT failure with a bunch of bone-headed officers of the law. A running risk for innocent people. Well done, Land of the Free ! Well done.
"How accurate is this technology compared to the Mk1 eyeball?"
My comments are based on similar tech used elsewhere:
It's rubbish but it is likely designed to reduce the load on the eyeball.
Rather than have the human ask "do any of the thousands of faces that'll walk down this street match any of the several hundred people we've got with warrants out for their arrest" it'll have a stab at it and flag close-ish matches for a human to compare.
The human then compares the two faces.
Essentially, it should be viewed as a "this face is not similar to the ones on your list" machine.
So 24 in 25 flagged matches are wrong. So what? Without knowing the false negative rate, we still don't know enough to tell whether it's useful.
Hypothetically, if there are *zero* false negatives, this would still be a very useful system. If you have one suspect to identify in a crowd of 1000 faces, it's entirely worthwhile to have a computer just show you 25 faces to take a closer look at, rather than the full 1000.
Of course I know it won't be that accurate, but without knowing *how* accurate it is this "96%" figure *still* isn't enough to pronounce it useless.
And yes, I realise it will also victimise people based on skin colour - but let's be clear, that's an entirely separate issue, the cops don't need any automated help doing that anyway.
@veti You're OK with 24 innocent people being surrounded by armed police on the basis they were wrongly flagged as a terrorist suspect (NB. SUSPECT), dragged off to a cell, denied civil liberties because 'terrorism', their houses ransacked, families and friends spied on (three hops = Kevin Bacon) their finances gone over, their computer equipment and phones taken and not returned, their livelihoods likely taken away from them, and even after they have been found out to be not the SUSPECT that was being sought, they are still on the no fly list?
All so the cops can finger a single suspect who is not only presumed innocent but, given the police's ineptitude, probably is innocent.
Yes, you get it.
We need clearer reporting on this.
We also need sensible legislation to prevent the collection, storing and sharing of data that would enable automated identifying and tracking of people using CCTV.
We're not there yet but it would be great to have the discussion and legislation done early rather than late.
The issue is that such goddamn awful and inept systems WILL continue to be used for a long time.
The reason is that there won't be statistics on the lives ruined by those solutions.
The newspapers will continue to report X people arrested and all that, and never the 96 per cent of ruined lives.
PS: really good job on achieving a 96% failure rate. Keep going ;) You're doing wonders !
This system is a defense lawyer's dream. I can see it now in defense cross-examination of a police witness.
Defense "So was the Facial Recognition used to identify my Client?"
Police Witness "Yes"
Defense "That's the system that 96% of the time identifies the wrong person?"
Defense "No need to answer that it is a public record that 96% of the time it is wrong."
Defense "So there is a 96% chance my client is not the perpetrator of the crime?"
Way to introduce doubt in the minds of the jury.
Biting the hand that feeds IT © 1998–2021