LiAr LiAr PD
'But we get results, if we have got your photo, you must have done something wrong.'
What is it about cops that makes them think they can ignore all of the expert opinion, use any means and then try to justify the end?
The Los Angeles Police Department has run facial recognition algorithms a whopping 29,817 times over a decade in an attempt to identify suspected criminals captured in CCTV footage, despite promising it wouldn't. Officers used software built by DataWorks Plus, the same biometrics company whose technology led to two wrongful …
Law enforcement says it all. Again we are entrusting humans to enforce authority. I've had my share of run-ins and assistance with law enforcers at many levels, and it boils down to people. I've answered questions from the Secret Service when a former first lady visited our organization, and I've had a flashlight shined in my eyes at 1:00am in Podunk county. Some people do their jobs; others make it a crusade. It's hard to fight righteousness.
Not everyone. We signed up one of our children for little league soccer last year and of course the league requires that all parents download the 'free' app that they use for communication.
This particular app was provided 'for free' by Dick's Sporting Goods, a fairly large chain in the USA. In their lengthy, draconian Terms of Use and 'Privacy' Policy they did in fact state that they reserved the rights to use any and all information they collect, and combine it with all the information they collect when you visit their stores.
This included purchase information and images from security cameras.
It's all in the tone which gets lost when the statement is transcribed. The operative word in their statement is 'We', the other words are diversions.
"We actually do not use facial recognition in the department," an LAPD spokesperson previously told the LA Times last year.
Can we hope that these things have improved over the years, so we don't have to deal with things like the pig-farm site that was blocked (several years ago, but who _really_ wants to argue the "Software only ever improves" side of that debate) because "a lot of pink" _obviously_ can only be nude people, ergo Porn!
(Or was that really an excuse to delete a PETA page?)
The larger the pool of potential matches, the worse it does. It is great for allowing my phone to identify that I'm me, but if my phone had 100,000 faces saved in it, it would never be able to reliably tell which 'user' was unlocking it. That's why businesses that use facial recognition for access control use it along with another method (like a smart card): it would never work by itself for a business of any size.
Even ignoring racial bias, trying to match an unknown face against a large pool of people (like, say, everyone the LAPD has arrested since they started taking digital mug shots) is a fool's errand. The odds of a false match go up with the size of the pool you are checking against. They'll get a bunch of matches, of which at best all but one will be false. It only works if they have a small pool to check against, i.e. after they've already done actual police work and narrowed the list to a handful of suspects. Of course, in that case, they could use their own eyes and wouldn't need facial recognition software.
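The scaling problem above can be put in back-of-the-envelope numbers. Everything here is illustrative: the per-comparison false match rate (FMR) of 1e-4 is an assumed round figure, not a measurement from any real system.

```python
# Back-of-the-envelope: false matches in a 1:N face search.
# The FMR of 1e-4 (one false match per 10,000 comparisons) is an
# assumed illustrative figure, not a measured one.

def expected_false_matches(pool_size, fmr=1e-4):
    """Expected number of false matches when one probe face is
    compared against every face in the pool."""
    return pool_size * fmr

def prob_at_least_one_false_match(pool_size, fmr=1e-4):
    """Probability the search returns at least one false match,
    treating comparisons as independent."""
    return 1 - (1 - fmr) ** pool_size

for n in (100, 10_000, 1_000_000):
    print(f"pool={n:>9,}  expected={expected_false_matches(n):6.2f}  "
          f"p(>=1 false)={prob_at_least_one_false_match(n):.4f}")
```

A phone checking one enrolled face sits at the top of that table; a database of a million mug shots sits at the bottom, where false matches are all but guaranteed.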
Elvis isn't dead; he works at a mini mart.
"It only works if they have a small pool to check from, i.e. after they've already done actual police work and narrowed the list to a handful of suspects."
But then you may as well put some humans on it. Facial recognition is a useful tool for going through tens or hundreds of thousands of pictures and reducing that down to a pool of a few percent of the original collection. But then the police work needs to start. Police have demonstrated a lack of understanding of scientific methods, including the statistics surrounding error rates. Even fingerprint identification has errors associated with it. But I doubt you could get a print analyst to cite the statistics surrounding that. Particularly not in a court of law, where the analysis is always put forth as flawless. Facial recognition might have an error rate an order of magnitude higher than that for prints.
Once the cops understand error rates, they can stop tackling people in a crowd based on automated recognition and get down to using it as a first slice at generating a list of suspects.
That may well work in the UK, but in the US it would be a disaster. Too many cops treat a "suspect" like a criminal, and it would only be a matter of time before an innocent person fingered by facial recognition was killed by a cop who treats an arrest like an opportunity to bust some heads (or kneel on some necks)
Today, I probably would get shot.
Don't be silly. America's police are very professional. They would not shoot you unless you gave them a reason to.
They would probably arrest you, put cuffs on, a bag over your head, shackle you to the back of their vehicle and zip-tie your legs -- and only then shoot you for attempting to escape.
I assume this will be as effective as the AI algorithms they use to identify content from "untrusted sources" about COVID-19. The algorithm was intended to prevent sites offering content with inaccurate health information, but ended up flagging tech site stories talking about supply chain problems caused by COVID-19. In typical Google fashion, it was then almost impossible to contact a real person to get this fixed.
The Los Angeles PD has a history of dirty tricks - even more so these days that Federal Funding has increased. They have a well-earned reputation for abuse - think Rodney King. Since 2000, there have been nearly 900 killings by local police that were ruled a homicide by county medical examiners. Since 2000, only two officers have been charged for shooting a civilian while on duty.
Facial recognition is a feature of LAPD smartphones whereby a cop can take a photograph of a person and within a minute or two produce potential 'hits'.(They also use other devices to take fingerprints with attachments to their smartphones.)
Now companies such as Microsoft, with their IoT suite, are gearing up to get their hands on the Federal pot of gold.
Earlier this year I spent time with photographers who are known as First Amendment Auditors, or Cop Watchers, and our project was to disrupt police technology. No one feels more isolated than a cop without his electronics. We completed many successful projects that effectively neutralise, or interfere with, police electronics.
A 'hunting jacket' filled with electronics can effectively kill, or modify, in a radius of 200 metres.
The ill-will enjoyed by the LAPD, and many other US police departments, is largely because of the disconnect between themselves and the public they serve. Their 'antics' can be seen on YouTube (just search 'First Amendment' plus a destination).
How would Brits react if surplus Army vehicles were employed to patrol Princes Risborough, Castle Combe or Grasmere?
"How would Brits react if surplus Army vehicles were employed to patrol Princes Risborough, Castle Combe or Grasmere?"
I'm not sure myself; probably best to ask someone from Northern Ireland. And I think the Snatch Land Rovers went the other way, from police to military vehicle.
Compared to the LASD, the LAPD are choir boys - which is why the LAPD outsourced it to them.
It almost got shut down when it was discovered that most of its officers were members of right-wing gangs being run out of police stations.
I've said it before, but focusing so hard on the racial bias problem is walking into a trap. They only have to claim that "this one doesn't have that problem," and your argument is severely weakened. At least until it's shown that it does discriminate.
The more general issue is that the (mostly, kinda sorta, we're all fallible) law abiding population must not be kept under constant surveillance. Insert obligatory Stasi reference here.
Protest as much as you like: facial and other types of computer recognition are coming whether you like it or not.
Now, once it drives justice efficiently and without prejudice, people will mostly welcome it. They will be the modern equivalent of speed cameras: people argue against them but mostly fail, because they do a good job. If we are going to police the modern world with the investment we are prepared to put into the police, we need these tools.
Currently training databases are insufficient, but soon they will be good enough to deploy.
As I type this the coputer corracts my spalling and grimmer. 30 years ago that was considered scandalous, it is standard now.
Except that it doesn't do anything of the sort, nor can it. Read up on the birthday paradox. It's why even an improbably good biometric ID system must never be used to ask "are any of these bad people in any of those videos?" It's just for "does this suspect match evidence from that crime scene?"
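To see why the many-to-many question is so much worse than the one-to-one question, count the comparisons: screening M watchlist faces against K faces in footage creates M × K chances for a false alarm. A minimal sketch, with an assumed per-comparison false match rate of 1e-5 (hypothetical, not from any real system):

```python
# Sketch: one-to-one verification vs many-to-many dragnet search.
# fmr = assumed per-comparison false match rate (illustrative only).

def prob_any_false_alarm(watchlist_size, faces_in_footage, fmr=1e-5):
    """Probability of at least one false alarm across all
    watchlist x footage comparisons, treated as independent."""
    comparisons = watchlist_size * faces_in_footage
    return 1 - (1 - fmr) ** comparisons

# "Does this suspect match that crime-scene image?" -> one comparison,
# so the false-alarm chance stays at the tiny per-comparison rate.
print(prob_any_false_alarm(1, 1))

# "Are any of these 1,000 bad people in any of these 50,000 faces?"
# -> 50 million comparisons, a false alarm is a near certainty.
print(prob_any_false_alarm(1_000, 50_000))
```

Same algorithm, same accuracy per comparison; only the question changed, and the dragnet version is effectively guaranteed to finger someone innocent.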
My computer is trying to tell me that I've spelled biometric wrong. I know better, but try telling a criminal court that their computer is incorrect.
OK, let's try an experiment: put 999 people in a room with Brad Pitt or Angelina Jolie (pick your celeb; it can be an airhead with a big bum if you prefer). I bet the majority will correctly recognise the celebrity among the others. Facial recognition works between humans; super recognisers can do this with a list of thousands of suspects, and in a few years many computers will manage it for a near unlimited list. The birthday paradox is looking at one point; facial recognition works on thousands of points.
There are rules, and we partially understand them presently. What I was trying to show is that spelling and grammar checkers were considered too difficult decades ago; I remember using early versions of Word & WordPerfect. Now they are completely normal. Facial recognition is on the way. Deny it all you like; it doesn't care, it will appear without you.
When it works, it will revolutionise policing and uncover misbehaviour on the police side. How nice will it be if the police only stop people that actual, verifiable intelligence suggests need stopping and searching. It will mean crime will be solved quickly & efficiently, if we support its evolution.
Computers checking spelling was never a general scandal. You could undoubtedly find people who thought that, if you needed it, you were stupid then. You can almost certainly find such people now as well. If I traveled back in time to 1990 and denounced you as a terrible user of a spell checker, nobody would care. Facial recognition isn't a spell checker. A spell checker might mean that people don't learn to spell as well. A facial recognition system means lots of false arrests and boundless opportunities for abuse.
Actually, when it first started out it was in American English, and some of the spellings of technical words were missing or questionable, so we had to screen the spelling corrections just as we do autocorrect on texting nowadays. That was the reality. Now I rarely find the spelling wrong when I feel the need to check.
We are at the point where we need to review recognised images before we use them; let us encourage that rather than saying they are currently inaccurate and we will never need to use them!
A working facial recognition system will reduce wrongful arrests and criminals staying at large. You see we want the same thing.
No, we do not agree. A working facial recognition system, if it's even possible, would be a nightmare. Imagine what a totalitarian country would do with something like that. Imagine what a malicious operator in a democratic country could do with something like that. It could be awful. What it does is provide a mechanism to track a person wherever they go without alerting them and by providing a smokescreen of a potentially useful purpose.
Lots of things would catch criminals faster and reduce crime rates. Some of those things should be tried. Some of those things need to be avoided, even with the extra crime, to avoid creating a terrible situation for the innocent. Systems which destroy privacy or give the police unchecked power are among those types. A working facial recognition system, for that matter even one which doesn't work, does both.