Repeat after me: Orwell's 1984 was a warning, not a how-to guide.
Metropolitan Police's facial recognition tech not only crap, but also of dubious legality – report
Facial recognition technology trialled by the Metropolitan Police is highly inaccurate and its deployment is likely to be found "unlawful" if challenged in court, an excoriating independent report has found. Researchers from the Human Rights, Big Data & Technology Project, based at the University of Essex Human Rights Centre, …
COMMENTS
-
-
-
Tuesday 9th July 2019 02:25 GMT Danny 2
Re: Unfortunately....
Katherine Helmond died this year (star of Brazil and Soap).
Brazil was a brilliant movie that lost a lot of power when Jean Charles de Menezes was shot dead for the crime of looking Arabic, aka brown.
When real life becomes worse than satire then satire is deflated. Accidental Death of a Brazilian.
[Edit: Oh, and I posted this before I read the thread and saw other people name checking him. Poor guy, could have been any of us]
-
-
Thursday 4th July 2019 11:29 GMT batfink
Help with "Innovative Solutions"
I'm sure the Chinese Government could help the Met Police with a few "Innovative Solutions for making London safer", based on their own experience in some of their Regions.
Whether those living in London might actually be happy with the "Solutions" is another question entirely.
-
-
-
Saturday 6th July 2019 22:33 GMT doublelayer
Re: Help with "Innovative Solutions"
You may be happy for the Chinese to use your data, but maybe you'll change your mind when you figure out that they can use your data to help improve the technology they use to commit massive human rights abuses on someone else. Consider this (audio), for example. That's what they can use data for, and it can come here once they've perfected it and on the way used it to imprison and kill thousands and eventually millions of innocent people. Are you still fine with it?
-
-
-
-
Thursday 4th July 2019 15:11 GMT phuzz
Re: Help with "Innovative Solutions"
Assuming you count all the corner shops with a twenty-year-old black-and-white camera tucked away behind the crisps, recording onto a 30-minute VHS tape that snapped two years ago but no one has noticed yet.
The UK government wishes it had as many working, networked CCTV cameras as it says it does.
-
-
-
Thursday 4th July 2019 22:33 GMT Nick Kew
Re: Help with "Innovative Solutions"
An innocent Brazilian in a world where police didn't have facial recognition. Surely he of all people would've stood to benefit from any alternative technologies they might have had, that could've caused them to act differently (no matter *what* difference) on that day.
Surely what matters with such cameras is what they do with the information. Isn't the most likely usage (for the foreseeable future) to alert a human to look at such-and-such?
-
Thursday 4th July 2019 22:41 GMT BristolBachelor
Re: Help with "Innovative Solutions"
"Surely he of all people would've stood to benefit from any alternative technologies they might have had"
My reading is that with this technology, they would've shot 42 people instead of only 1, but only 8 of the shot people would've actually been of interest. (Actually shooting anyone is another topic)
-
-
Friday 5th July 2019 07:05 GMT Rich 11
Re: Help with "Innovative Solutions"
How many even knew that they'd been "identified"?
Presumably they knew they'd been identified by some means when a copper came up to them and said, "Mr X? I'd like to speak to you about your outstanding fine / latest heist / Great Escape." But for whatever reason they still weren't worth arresting.
-
Friday 5th July 2019 08:07 GMT Prst. V.Jeltz
Re: Help with "Innovative Solutions"
My reading is that with this technology, they would've shot 42 people instead of only 1, but only 8 of the shot people would've actually been of interest. (Actually shooting anyone is another topic)
No, no, they wouldn't be shot. Everybody gets up in arms about this because they think the police are going to release ED209 into a crowd to gun down whatever it wants. I would suggest the system shows a copper two photos with the caption "I think this guy is this guy", and then, if the copper agrees, they take some action. If the system scanned 10,000 faces at Notting Hill and made 42 suggestions of which 8 were correct, that's pretty fucking good going. I don't think a copper stood watching the crowd on his own would get 8 results.
Presumably they knew they'd been identified by some means when a copper came up to them and said, "Mr X? I'd like to speak to you about your outstanding fine / latest heist / Great Escape." But for whatever reason they still weren't worth arresting.
Again, no. They wouldn't know they'd been scanned, because the copper would have vetoed the machine if he didn't think the two pictures were the same guy.
If he did, and it wasn't, then fair enough: innocent guy looks near-identical to a wanted crim. What can you do? You need to see his ID.
-
-
Friday 5th July 2019 08:58 GMT Anonymous Coward
Re: Help with "Innovative Solutions"
"If he did , and it wasnt , then fair enough Inocent guy looks near identical to a wanted crim - what can you do? you need to see his I.D"
What is this ID of which you speak? I am not aware that I am required to have any or, indeed, to carry it with me.
-
Friday 5th July 2019 09:12 GMT SloppyJesse
Re: Help with "Innovative Solutions"
" If the system scanned 10,000 face at Notting Hill and made 42 suggestions of which 8 were correct thats pretty fucking good going I dont think a copper stood watching the crowd on his own would get 8 results."
And there is the exact reason this is not the way to test the effectiveness of this technology.
We do not know how many valid targets there were in the population checked.
What they should be doing is recruiting a bunch of volunteers, putting them (and only them) into the system and then sending them into a crowd. Then we'd be getting sensible information to judge effectiveness.
-
Friday 5th July 2019 11:23 GMT katrinab
Re: Help with "Innovative Solutions"
My understanding is that it made 42 suggestions of which 0 were correct.
The facial recognition system has only once in recorded history correctly identified someone on the database, and that person shouldn't have been on the database as they were no longer of interest to the police.
-
-
Friday 5th July 2019 09:25 GMT ibmalone
Re: Help with "Innovative Solutions"
Or you have the case of Steve Talley https://web.archive.org/web/20190518102457/www.copblock.org/152823/denver-police-fck-up-again/ https://denver.cbslocal.com/2016/09/15/former-financial-advisor-wrongly-accused-of-bank-robbery-fights-to-win-life-back/
In short, he was arrested twice, once with extreme force by a SWAT team, and spent months in prison. But the computer said it was him, so tough luck.
In the case of Jean Charles de Menezes, the police seem to have been operating under such a state of hysteria that you can quite easily see a dodgy facial recognition match leading to a shooting; it's not at all far off what actually happened.
-
Friday 5th July 2019 17:29 GMT Woodnag
Re: Help with "Innovative Solutions"
It's not just that Mr de Menezes was murdered that's the problem, but the lies about the circumstances, instantly shovelled out and regurgitated by the press, to make the guy appear suspicious.
The Met really doesn't suffer from much accountability. The UK lost "S and Marper v United Kingdom" 11 years ago and still hasn't deleted that illegal DNA database.
-
-
Friday 5th July 2019 12:23 GMT Loyal Commenter
Re: Help with "Innovative Solutions"
Everybody gets up in arms about this because they think the police are going to release ED209 into a crowd to gun down whatever it wants.
I think you may have a bit of a straw-man argument there. The actual impact of falsely identifying someone as a suspect, especially if it happens repeatedly, should be obvious. How would you like it if, on your way home from work, you were flagged up as a suspect and arrested? Sure, you'll get released again. It's not exactly convenient for this to happen though, is it?
For a case in point, why not ask the guy who made the news in Bristol, who was tasered outside his home, because some particularly overzealous plods from Avon and Somerset Constabulary thought he was a wanted suspect. Particularly embarrassing for them, since the guy in question was actually a community police liaison bod for the black community in St Pauls, which kind of begs the question, "did you just see a black guy with dreadlocks and think, 'We're looking for a black guy with dreads', and nab him?"
And that's without having a computer make the wrong decision for you.
-
-
-
-
Friday 5th July 2019 05:42 GMT really_adf
Re: Help with "Innovative Solutions"
"Surely what matters with such cameras is what they do with the information."
Absolutely, but in general people seem to trust what computers say more than I think they should.
Yes, facial recognition may have prevented the tragedy in Stockwell, but the concern due to the above is how to ensure it doesn't end up causing more such tragedies because "computer says he's armed and dangerous".
Unfortunately, I fear the answer will come too late for some, but research like that reported here offers some hope that fear will not be realised.
-
-
-
-
-
Friday 5th July 2019 06:15 GMT Fruit and Nutcase
Re: Help with "Innovative Solutions"
The UK is the world leader in CCTV deployment by a very wide margin.
The introduction of "Smart Motorways", where they repurpose the "hard shoulder" (emergency lane) as another traffic lane and monitor the road with continuous CCTV coverage for stranded vehicles, means that for long stretches of road (12 miles, spanning several junctions in the case of the M3) you are under constant surveillance. Couple that with ANPR and real-time mobile network mast metadata...
-
-
Thursday 4th July 2019 11:29 GMT Anonymous Coward
It's in its infancy, but it will improve
The knuckle-dragging Luddites always get in a froth when new technology is applied.
* You have a personal tracing device in your pocket RIGHT NOW (your phone).
* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)
* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)
* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)
* You are happy to be tracked RIGHT NOW (advertising)
If you are happy with all of those (and it seems you are, given the uptake), then why are you getting your gusset in such a twist over the Met applying technology to public safety? Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on.
If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies as the AI system won't have the inherent biases of the prejudicial police officers.
The technology still needs to advance, but once it have it will be a MASSIVE BOON to society by helping us to identify risk individuals before they have would have become known by traditional means. More importantly, it will help prevent the police from wasting their time and innocent people who happen to be "the wrong colour".
Get your heads out of your collective bum.
-
Thursday 4th July 2019 11:49 GMT smudge
Re: It's in its infancy, but it will improve
Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on.
I doubt if the family of Jean Charles de Menezes will share your confidence.
But even if you don't get shot, you could certainly get into trouble that would take a lot longer than 30 seconds to get out of.
If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies
It will make eff all difference to stop and search. You don't have to be on a watchlist to be stopped and searched. The police officer merely has to have "reasonable grounds" to suspect that you are carrying something dodgy. AI facial recognition will not affect that in the slightest.
Icon of BB to give you a thrill.
-
Thursday 4th July 2019 17:56 GMT Scroticus Canis
Re: Jean Charles de Menezes wasn't a victim of facial recognition cameras.
The failing there was the good old Mark One Eyeball of the current Met. Commissioner, who was the "Gold Commander" on that botched op., and her underlings. Sweet FA to do with automatic facial recognition.
So your point mentioning him was?
-
-
Friday 5th July 2019 02:02 GMT veti
Re: Jean Charles de Menezes wasn't a victim of facial recognition cameras.
Menezes' death was a tragedy, but that was fourteen years ago. There are people old enough to vote today, who are too young even to remember that story. If you can't come up with some more contemporary examples than that, you should consider the possibility that perhaps you really are making a lot of fuss about nothing.
In a bad year (such as 2016 and 2017), police in the whole of the UK may kill as many as six people, including deaths in custody. That's far fewer, per population, than France, Germany, Italy, Australia or Canada, and don't even ask about the US. I know it may not feel like it, but the facts speak for themselves - the UK (still) has one of the most civilised police cultures in the world.
-
Friday 5th July 2019 12:23 GMT Robert Carnegie
Re: Jean Charles de Menezes wasn't a victim of facial recognition cameras.
British police have indeed killed lots of people since Menezes, usually by trying to. Usually, either they put across the story that it was unavoidable to kill the suspect, or they were black or on drugs or mentally ill and so there isn't much of a fuss.
https://www.inquest.org.uk/deaths-in-police-custody shows, if I'm counting right, about one death in British police custody per week since 1990. That evidently does include the Westminster Bridge terrorists, who it's difficult to dispute had it coming, but I think it's also roughly the rate of deaths at the hands of an abusive partner or of a mentally ill person, both of which are considered to be undesirably many. Mind you, if your partner is a mentally ill police officer and does you in then you'll be counted as all of those.
-
-
Friday 5th July 2019 18:37 GMT Scroticus Canis
Wooosh - you miss the point - there was no automatic facial recognition in use ...
... so why is this killing relevant to the use of said equipment?
"At around 9:30am, officers carrying out surveillance saw Menezes emerge from the communal entrance of the block..." according to Wikipedia.
The eyeballing operative had CCTV prints of the suspects and thought he might be of interest. However, he had his dick in his hands at the time and thus could not film de Menezes to send images to another Dick, who was the Gold Commander of the operation. Thus it's the dicks which caused the evolving cock-up to go lethal.
So why link it to a technology which wasn't even in use at the time? Or do you think ancient CCTV is automatic facial recognition?
-
-
-
-
Thursday 4th July 2019 12:03 GMT Mr Dogshit
Re: It's in its infancy, but it will improve
* You have a personal tracing device in your pocket RIGHT NOW (your phone)
No doubt my Doro can be traced to the nearest mast, but it doesn't phone home to Google every 10 seconds with GPS co-ordinates.
* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)
No I don't.
* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)
No I don't.
* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)
No I'm not.
* You are happy to be tracked RIGHT NOW (advertising)
No I'm not.
-
-
-
Friday 5th July 2019 08:21 GMT Richtea
Re: It's in its infancy, but it will improve
> I have smart phone, GPS and wifi is always off until I actually need to use it.
You do realise that it's not beyond the wit of Google to enable location without your permission, right?
https://crisisresponse.google/emergencylocationservice/how-it-works/
It fires an SMS with your location to the emergency services. You won't find the location SMS that was sent in your outbox; it's suppressed. Nice and silent.
In this case it's definitely for 'the greater good', Sergeant Angel, but there's no opt-in or opt-out.
The feature is driven by on-device logic built into Play Services, but it would be very little effort to target an individual device with one more flag ('track this user on any interaction and send location SMS to emergency service 5').
-
Monday 8th July 2019 12:53 GMT Anonymous Coward
Re: It's in its infancy, but it will improve
"GPS and wifi is always off until I actually need to use it. Saves battery as well."
(ignoring the fact that they can track you from the cellphone masts)
BBC Weather is starting to piss me off as it has just started to prompt for 'track your location' despite it having worked perfectly well for years (and across 5 devices) with location turned off and just 'Heathrow' as a favourite location.
-
Saturday 6th July 2019 22:50 GMT doublelayer
Re: It's in its infancy, but it will improve
* You have a personal tracing device in your pocket RIGHT NOW (your phone).
With as much tracking turned off as I can, and if I was worried that people were actively tracking me with it, I'd leave it at home.
* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)
None of those. A few things have microphones and internet connections but I've set them up and know what they're doing. If I was worried that people were actively tracking me with them, I'd disconnect either the microphone or the connection.
* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)
None of those at the moment, but I once had an activity tracker that I gave away because I didn't use it. It monitored my heart rate during exercise, and could send it to my phone but I never enabled that. So it was a tracker whose tracking data only went to me, and it lacked the technical ability to report on me. If I was worried that people were actively tracking me with it, somehow circumventing the limitations of the device making this impossible, I'd leave it behind.
* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)
None of those. I prefer passwords to log into my computer, and no Facebook account. If I did use a facial recognition system, I'd do so in such a way that the recognition was done using local processing on local data only.
* You are happy to be tracked RIGHT NOW (advertising)
I am not happy. That's why I have ad blockers, tracker blockers, and a DNS filter. Even that is tracking for economic purposes, not complete surveillance, so is not as bad an abuse as what has been considered (and done already) by governments.
-
-
Thursday 4th July 2019 12:13 GMT Neil Barnes
Re: It's in its infancy, but it will improve
What Mr Dogshit said, in spades.
And the secondary points: as a matter of law, should the police be allowed to keep my images, movements, fingerprints, DNA etc. if they don't arrest, charge, and find me guilty? I don't think so; somewhere in the deep and dusty corners of UK law there is the presumption of innocent until proven guilty.
This kind of thing is basically saying "Hey Mr Citizen, you're a criminal. We're just waiting to find out what the crime is." My sympathies are entirely with that chap who was arrested for covering his face; I would have done the same.
I'm not sure which I dread more: a facial recognition system with a massive error rate, or one that's a hundred percent accurate...
-
Thursday 4th July 2019 14:24 GMT John Brown (no body)
Re: It's in its infancy, but it will improve
"somewhere in the deep and dusty corners of UK law there is the presumption of innocent until proven guilty."
I'm sure you know this and it was just a brain-fart, but the operative word is "unless", not "until". "Until" presumes guilt; they just don't know what of, yet.
-
Friday 5th July 2019 03:42 GMT DiViDeD
Re: It's in its infancy, but it will improve
"* You have a personal tracing device in your pocket RIGHT NOW (your phone)
No, no you don't. the GPS receiver in you phone knows where you are (the clue's in the word 'receiver'). It doesn't tell anyone else where you are unless you've foolishly set it up to broadcast your position. Why do you think trans Pacific flights disappear from ground station view even though those on board know exactly where they are?
Your mobile service will know you are within the range of mobile tower X, if anyone bothers to go check the logs, but nobody knows where you are to the metre apart from you. Unless, as mentioned, you've decided to broadcast your position to all and sundry.
EDIT: Could someone please tell El Reg's spillchucker that that is, indeed, how you spell 'metre'?
-
Friday 5th July 2019 09:49 GMT Portent
Re: It's in its infancy, but it will improve
Yes you do have a tracking device in your pocket. Google has previously been found to track you based on mapping all the wifi routers in your area. Based on the strength of each signal it is able to position you surprisingly accurately. That's why, whenever you turn on GPS on an Android phone, it asks if it can track wifi to make it 'more accurate'.
-
-
-
-
Thursday 4th July 2019 12:14 GMT Lee D
Re: It's in its infancy, but it will improve
The "AI" (pfft) has proven itself to be far more biased and has much more trouble picking out features on less-contrasting skin tones (i.e. darker ones with no lighter features, as opposed to lighter one which universally have darker features in places).
-
-
Thursday 4th July 2019 12:17 GMT Cynic_999
Re: It's in its infancy, but it will improve
I am aware of the way that commercial organizations use technology and track private citizens, and I am far from happy with the way that has developed, and I doubt that anyone else who understands what's going on is happy either. But the threat from commercial exploitation where the motive is profit is nothing compared to the damage a state actor can do to people through its use or misuse of the technology. The government's chief motive, no matter what rhetoric it is using to justify what it's doing, is to control the population to make us collectively behave in a way that is beneficial to those in power.
I recall when CCTV was first being introduced. There were many articles about how it would make us all safer etc. The citizens of one large village thought it sounded like a great idea and petitioned to get CCTV installed ASAP to stop the small but annoying amount of graffiti and vandalism. Within days of the CCTV being installed it was indeed being used against crime - the local pub landlord was successfully prosecuted for allowing the locals to stay too long after closing time, and the local parking wardens increased the number of fines tenfold by using the CCTV to look for illegally parked vehicles. Not exactly the sort of crime reduction the people had in mind. And the graffiti and vandalism? They remained unchanged - the police said the CCTV was ineffective as the culprits covered their faces and didn't stay around long enough to be caught in the act.
-
Thursday 4th July 2019 12:27 GMT Anonymous Coward
Re: It's in its infancy, but it will improve
Not all of us are tracked in the way you suggest. My smartphone has data and location services turned off most of the time, and the cell phone tower data is not available to the police without a warrant.
I don't use Facebook, Facetime, or any video messaging service (although I have a mostly unconfigured Facebook login)
I don't have an Amazon, Google, Smart TV or any other voice assistant device in my home, and I resist having IoT devices as well.
I can't do much about advertising tracking, I admit, but that is fairly minor, especially when I say "No" to sharing location information when I visit web sites. I regularly clear my cookie cache.
If the facial recognition was a "compute hash, check hash against watchlist, delete hash if not on watchlist", then I would reluctantly support this technology. But it won't be. As we've seen from several reports about fingerprint and DNA data (and I would also expect ANPR info and congestion/ULEZ info), the Police are reluctant to discard data even when they are legally obliged to do so "just in case it proves useful later", and I suspect they may want to positively identify everybody that comes into the field of view regardless of whether they are on any watchlist.
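For what it's worth, a minimal sketch of that "compute, check, discard" flow might look something like the Python below. The template and similarity functions are placeholders rather than anything the Met actually runs; it's only here to illustrate the principle of retaining nothing for non-matches.

```python
from dataclasses import dataclass

@dataclass
class WatchlistEntry:
    person_id: str
    template: tuple  # pre-computed face embedding for a wanted person

def compute_template(frame):
    # Placeholder: a real deployment would run a face-embedding model on the
    # camera frame. A dummy vector is returned here so the sketch executes.
    return (0.1, 0.2, 0.3)

def similarity(a, b):
    # Placeholder metric; real systems compare embeddings, e.g. cosine similarity.
    return sum(x * y for x, y in zip(a, b))

def screen_frame(frame, watchlist, threshold):
    """Flag watchlist matches for one face; retain nothing for non-matches."""
    probe = compute_template(frame)
    hits = [entry.person_id for entry in watchlist
            if similarity(probe, entry.template) >= threshold]
    del probe    # the point of the scheme: no match, no retained biometric data
    return hits  # an empty list means this passer-by is forgotten entirely

# Example: a face that matches nobody on the watchlist produces an empty result.
watchlist = [WatchlistEntry("suspect-001", (0.9, 0.1, 0.4))]
print(screen_frame(frame=None, watchlist=watchlist, threshold=0.5))
```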
I'm dreading the town centre and in-shop CCTV footage being automatically scanned by machine, because it will seriously undermine our rights, and it's not so far fetched with some of the cloud services available off-the-shelf now.
You only have to listen the many, many broadcast interviews with members of the police to realize that the police regard everybody as suspects.
-
Thursday 4th July 2019 17:34 GMT Anonymous Coward
Re: It's in its infancy, but it will improve
The old line (quoted from police spokespeople here multiple times in the past): "There are those who are staying just on the right side of the law that we are very keen to do something about and that <insert group> are demanding action over".
Essentially, we don't like those who challenge our authority. AC because Police Scotland smashed in the door of the last person who challenged their authority in a seven-man armed raid.
-
Friday 5th July 2019 08:25 GMT Richtea
Re: It's in its infancy, but it will improve
> My smartphone has data and location services turned off most of the time, and the cell phone tower data is not available to the police without a warrant.
Not true in the UK, in 'special' cases. Ring the emergency services on Android and they know your location - and you didn't opt in:
https://crisisresponse.google/emergencylocationservice/how-it-works/
-
-
Thursday 4th July 2019 13:07 GMT Captain Hogwash
Re: It's in its infancy, but it will improve
The technology still needs to advance, but once it have [sic] it will be a MASSIVE BOON to ~~society~~ the party by helping us to identify ~~risk~~ dissident individuals before they have would have become known by traditional means. More importantly, it will help prevent the police from wasting their time and innocent people who happen to be no longer voting for "the wrong colour".
-
Thursday 4th July 2019 14:14 GMT jmch
Re: It's in its infancy, but it will improve
Dear AC.
I score 1/5 on your list (just the phone). Location tracking and history is turned off (although I am aware that Google could be saying it's off and collecting it anyway). I am also equally aware that my mobile service provider tracks me from mast to mast, and would still do so if I had a dumb phone rather than a smart phone. That data is secured and timebarred by legal provisions backed up by massive possible fines for misuse or loss due to botched security. Nothing's 100% secure, but it's secure enough for me to balance against the convenience of being able to make and receive calls at any time and have a super-powerful portable nanocomputer always available.
Face recognition by the Met fails on so many counts, to begin with the "trawl vs index search" distinction. If they are looking for a single individual and search their archives for matches of that individual, that's an index search. If they're trying to match every feed they have against every known person in their database, it's a trawling expedition. The first MIGHT be OK if they have a warrant; the second is a big no-no. That's before arriving at the actual legal basis to scan in real time and/or store this data (of which there is none), and the technical capability of the system to accurately match people (which seems to be equivalent to that of Mr Magoo).
-
Thursday 4th July 2019 16:50 GMT Anonymous Coward
Re: It's in its infancy, but it will improve
"* You have a personal tracing device in your pocket RIGHT NOW (your phone)."
Running Symbian and off, so ... technically yes, but to all intents and purposes, no.
"* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)"
No (well, I have a smart TV, but it can't talk to the internet, due to not being connected to anything)
"* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)"
I do not
"* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)"
Swing-and-a-miss
"* You are happy to be tracked RIGHT NOW (advertising)"
uBlock Matrix + NoScript says probably not.
"Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on."
Or, more likely, "computer says you're a criminal, the ID's probably fake, come with us while we get it checked" (see how they handle photographers in London*). That will appear on the enhanced disclosure in the DBS process.
If you don't wish to search, try:
https://www.theguardian.com/uk/2010/may/10/stop-search-photographer-grant-smith
https://www.theguardian.com/uk/2009/dec/08/police-search-photographer-terrorism-powers
"If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies as the AI system won't have the inherent biases of the prejudicial police officers."
How do you think an AI trained on an imperfect, imbalanced data set will behave? If you need a clue, have a look at other articles on here ...
"Get your heads out of your collective bum."
Maybe someone needs to take their own advice ...
-
Thursday 4th July 2019 19:19 GMT Doctor Syntax
Re: It's in its infancy, but it will improve
* You have a personal tracing device in your pocket RIGHT NOW (your phone).
Only when I remember to take it with me. And I minimise what's on it.
* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)
You might have them. I don't and won't. I can't imagine anything they'd be useful for. TV smarts are provided by MythTV and Kodi. I control them, not the other way around.
* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)
Nope. Again, can't imagine having a use for them. As to internet connected fridge - ROFLMAO.
* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)
Facebook? Absolutely not. Windows, Apple? No, Linux.
* You are happy to be tracked RIGHT NOW (advertising)
Poor little A/C. Never heard of ad blockers, NoScript and all the rest of the armour the security minded use.
Apart from any other consideration, has it not occurred to you that one of the requirements for living freely under the law is that the police should follow the law themselves? When it's considered likely that a challenge to the legality of their actions would succeed, we really should be concerned.
-
Thursday 4th July 2019 21:37 GMT Anonymous Coward
Re: It's in its infancy, but it will improve
"Get your heads out of your collective bum."
I think that is where your head has been if you are totally unaware of the flaws with the current systems and of the system's inability to deal with the 'Wrong Colour' problem - it simply matches *anyone* of colour with *any* random 'person of colour' on file. (Not sure that is an improvement on the 'Stop & Search' we have now !!!)
It is not good enough to say 'when the Technology advances' it will solve all our problems, as so many of the great Technological leaps never happen because the Technology is not as good as thought and the problem is a 'little' bit more difficult to solve than admitted !!!
It will be all right on the night does not work with technology ........ ever !!!
-
Thursday 4th July 2019 23:58 GMT Cpt Blue Bear
Re: It's in its infancy, but it will improve
I fear, Mr Coward, you have completely missed your own point. You may have made that point accidentally and be completely unaware, mind.
In my experience the people who tick your boxes are unaware of your points. Those of us who are aware avoid or mitigate their effect.
Those points have been dealt with by other posters.
What they haven't addressed is your "MASSIVE BOON" (initially mistyped as BOOB - make of that what you will). You seem to be a fan of arresting people based on "risk" rather than their actions. Welcome to the world of thought crime, guilt by association and arbitrary arrest. When you start arresting people for what they might do rather than what they have done, you have well and truly left any notion of justice far behind.
I also note you say "us". Clearly, you don't ever envisage being on the receiving end of this. That is telling and makes me wonder exactly who should be removing their head from their posterior...
-
Friday 5th July 2019 10:24 GMT Jimmy2Cows
Re: It's in its infancy, but it will improve
Hmm let's go through your list...
* You have a personal tracing device in your pocket RIGHT NOW (your phone).
Wifi off. Data off. GPS off. Sure it's connected to a nearby mast but so is every mobile phone, being, you know, a basic requirement to work as a phone.
Try again...
* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)
My TV doesn't have a microphone. Don't see the point of digital 'assistants'. Games consoles, yeah, but they're off and I choose whether to log into their network services. I don't have to.
Try again...
* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)
Wrong on all counts. Try again...
* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)
Let's see... don't use Facebook or any of that social media nonsense. Don't have any Apple products. PC camera is off unless needed for video conference. Plus the fact that facial *detection* is not the same as facial *recognition*.
Try again...
* You are happy to be tracked RIGHT NOW (advertising)
I'm certainly not happy about it, however it's impossible to do my job without using a PC, and impossible to buy anything (physically or online) without accepting that I'm probably being tracked. So I tolerate it as an inescapable evil. I don't have to like it.
Zero out of five.
As to getting in a froth when new technology is applied: it's not that it is being applied, it is *how* it's being applied - questionable training sets, questionable accuracy, questionable retention policies, questionable legality.
-
Friday 5th July 2019 12:39 GMT Toni the terrible
Re: It's in its infancy, but it will improve
No,
My mobile phone is often left at home.
I have only the one Smart TV and it doesn't follow me around.
I don't believe I have any behaviour monitoring devices, except for the PC. No home automation as it is unneeded.
I have nothing that uses facial recognition, and keep away from social media, except this site.
I block all advertising, so I am not happy to be tracked by it.
As another guy said, what's the requirement for me to carry 'papers' to be ID'd, and which ones will the Plod accept? Though I do anyway - photo driving licence etc.
Whether the systems will be non-discriminatory depends on the bias in the data set; as it is, if you are black you will be picked up by the system more than if you are white (AI Constable Slaughter lives).
So, until they get it right/better, we do have legitimate concerns, so stick that up your lower orifice.
-
Friday 5th July 2019 15:24 GMT Loyal Commenter
Re: It's in its infancy, but it will improve
* You have a personal tracing device in your pocket RIGHT NOW (your phone).
Actually, it's on the desk, and location services are turned off except for the apps that I allow.
* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)
Nope, nope, and turned off except for when in use, when it's unlikely to be recording much of interest, as it's used mainly for streaming services, which my household members tend to shut up during.
* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)
Okay, so I have an activity tracker, but do you know what? I have consented to what they do with my data there, and it's constrained heavily by GDPR. It's not like I don't have a choice. As for internet connected fridge - are you actually serious? I'd sooner have Talkie Toaster in my house.
* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)
IIRC, on FB you have to turn facial recognition on, which I certainly would not do. I also cripple a lot of FB's tracking, etc., by the use of plug-ins (FB Purity, for example) and ad blocking. I don't believe Windows uses any sort of facial recognition - and good luck to it, since I only plug a camera into the thing when I need to (which is round about half past never). As for Apple - well, just no.
* You are happy to be tracked RIGHT NOW (advertising)
NoScript, and AdBlock are prerequisites. Needless to say, not only am I not happy to be tracked by advertising cookies, I actively take measures to avoid that, as well as actively taking measures to not see the fucking things in the first place.
-
Friday 5th July 2019 15:26 GMT keith_w
Re: It's in its infancy, but it will improve
* You have a personal tracing device in your pocket RIGHT NOW (your phone).
Do.
* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)
Do Not
* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)
Do Not
* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)
Do Not
* You are happy to be tracked RIGHT NOW (advertising)
Am Not
I do need the cell phone, so I put up with that. I do not need any of the other things, so I do not and will not have them in the house. And I am not happy to be tracked for advertising. Please do not assign to me your attitudes towards any of this stuff or anything else for that matter.
-
Friday 5th July 2019 16:24 GMT Anomalous Custard
Re: It's in its infancy, but it will improve
>The knuckle-dragging Luddites always get in a froth when new technology is applied.
Seems an odd insult to lob at a tech audience, but whatever.
>* You have a personal tracing device in your pocket RIGHT NOW (your phone).
Actually it's on my desk ;) And I have as much of the tracking stuff as I can turned off most of the time
>* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)
Nope. The TV is dumb, I have a phobia about digital assistants, and I don't use or enable voice on my consoles
>* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)
My activity trackers are tracking the activity of the drawers they're languishing in. My fridge cannot connect to the internet (unless it's gained sentience and can now walk to a computer). I do have "smart" bulbs (not my idea) - which will tell anyone tracking that we turn the lights on when it gets dark and turn them off around the same point every night.
>* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)
I have facial recognition on my Windows tablet. Which actually only seems to recognise one pair of glasses and not my actual face. Or any of my other glasses.
>* You are happy to be tracked RIGHT NOW (advertising)
My ad blocker etc usage would suggest otherwise.
My tracking bingo card tells me you forgot to mention travel cards such as Oyster, bank cards, loyalty cards and online shopping.
>If you are happy with all of those (and it seems you are, given the uptake), then why are you getting your gusset in such a twist over the Met applying technology to public safety? Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on.
I don't live in a country where carrying ID is mandatory, and as I have no need to carry it on a regular basis I don't.
>If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies as the AI system won't have the inherent biases of the prejudicial police officers.
Hahahahanope. As others have mentioned, it has difficulties telling darker-skinned faces apart(*), and add to that the biases of those who will be training it, and it's really not going to make things better.
>The technology still needs to advance, but once it have it will be a MASSIVE BOON to society by helping us to identify risk individuals before they have would have become known by traditional means.
We've had incidents where people have been deemed to be at risk of causing violence by those closest to them, who have reported those people to the authorities, who've not taken action. This is people being reported by those who know them well, and who are in a good position to judge change in character, behaviours etc. What on earth makes you think facial recognition will improve this? How does "this face looks vaguely like this other face" improve on "this person has become more extreme in their views and I have reason to believe they will carry out their threats of violence"?
>More importantly, it will help prevent the police from wasting their time and innocent people who happen to be "the wrong colour".
Only if it improves enough to be able to detect differences in all skin tones equally. Only if accuracy improves so police aren't wasting their time chasing down people who look a bit like someone else. Only if you think the police fail to apprehend the "correct" people because they're wasting time chasing after innocent people. Only if you think human biases won't affect how human police officers interpret the results of the AI.
>Get your heads out of your collective bum.
I think it's you who has your head in yours.
-
Friday 5th July 2019 21:12 GMT martinusher
Re: It's in its infancy, but it will improve
Pushing back against facial recognition is a bit of a waste of time. Facial recognition is what cops do so denying them the use of a machine that will help do this is just not going to work. Sure, facial recognition is inaccurate but it's probably no worse than being identified by a witness (something that's notoriously inaccurate -- but nobody tries to ban eye witnesses).
Where you need to concentrate the fight is things like generating spurious criminal charges arising from concealing your face. Sure, it's inconvenient for law enforcement that we're all neither barcoded nor microchipped like a pet but that's all part of their job -- nobody mandated that criminals and public alike were obliged to make their job easy.
-
Saturday 6th July 2019 23:22 GMT doublelayer
Re: It's in its infancy, but it will improve
"Pushing back against facial recognition is a bit of a waste of time. [...] Where you need to concentrate the fight is things like generating spurious criminal charges arising from concealing your face. [...]"
I'm not sure whether to upvote you for your last point, downvote you for your first point, or just boggle at how your last point almost directly contradicts your first point. Facial recognition equipment is in the same category as charging people for not letting them use their facial recognition equipment on you. They're two sides of the same coin, yin and yang. Since we both agree that charging people for hiding their faces is wrong, let's look at the first point. Having that equipment allows them to do the same kind of tracking. It makes it impossible for citizens to have privacy unless they specifically try to, in which case they will be charged. It is not a thing we should just accept, because in addition to it actually being illegal according to current laws, it is so unpalatable to those who like human rights that it should be made even more illegal through additional legislation.
Your comment that "Facial recognition is what cops do so denying them the use of a machine that will help do this is just not going to work" is rubbish for two primary reasons. First, there are plenty of things that cops do, and we accept, but we don't want to extend their abilities. Cops search suspects' houses for incriminating information, when they have a warrant. We could extend this by not requiring a warrant, but we don't because we don't want the police to have that power. We only want them to search places when they have a warrant to do so. Second, facial recognition is not the primary job of a police officer. Even those officers who work directly in public and not, say, investigating existing crimes aren't there to look at everyone's face and determine if they have seen it on a list. They're there to identify crimes and safety risks and deal with them. In almost all cases, they have not seen the perpetrator before, but they still go after them. If the police said they were going to throw away this system and instead employ a bunch of officers whose job it was to go to everyone and stare at their face to identify whether it's on a list, I wouldn't be any happier.
-
-
Saturday 6th July 2019 11:50 GMT Kiwi
Re: It's in its infancy, but it will improve
* You have a personal tracing device in your pocket RIGHT NOW (your phone).
Nope, often leave the phone at home, or in the car, or...
* You have listening devices in your home RIGHT NOW (Smart TV, digital assistant, games console...)
No game console, no TV, no "digital assistant".
* You have behaviour monitoring devices RIGHT NOW (activity tracker, internet connect fridge, home automation...)
Basic fridge, any "activity tracker" would die of boredom/lack of exercise, and I don't do enough to warrant "home automation".
* You are using facial recognition RIGHT NOW (Facebook, Windows, Apple...)
Linux, no social media (except El Reg), basic dumbphone
* You are happy to be tracked RIGHT NOW (advertising)
Adblockers, privacy tools and noscript.
Oh whoops, it's a false positive. Big deal. 30 seconds out of your day to provide ID and carry on.
Nope, doesn't work like that. 1) You're working on the assumption that the target is identified - what if you fit the description of a 'suspect' but there is no identification, i.e. a person fitting your description was involved in a crime somewhere in your local area? That ain't gonna be 30 seconds. It's worse if the crime is serious enough and you cannot prove with absolute certainty where you were at the time (and bear in mind the pigs can change when the crime was committed on a whim, just to make sure they get you if they've taken a dislike to you - "yes, the CCTV timestamp says it was at 11:05 and the recorder's time is correct now, but since he can prove he was elsewhere at that time maybe he himself hacked it and changed it, then changed it back"). You're accused of a crime; think the jury will believe your claims that you couldn't possibly have made such a hack? You're already guilty.
So if you're accused of a crime you didn't commit and cannot prove instantly that you didn't do it - and your only chance of that is if they get someone who looks enough like you and confesses - then you might find yourself spending a few days or even months in prison awaiting trial. All coz the computer said you matched the description.
But even if they're after an identified person, until the police test your fingerprints and perhaps DNA, your ID - assuming you're carrying it at the time - isn't proof. I've got a decent scanner and a laminator; I reckon I could probably whip up a passable fake ID fairly quickly (I've never tried it, so maybe not). Maybe it's bloody hard to do and I'd need equipment far beyond even what Bill Gates can buy, but the coppers are going to believe that fake ID is trivial and arrest you on the chance that it's fake. Again, you're at least spending the night in pokey till they can verify your ID, or get the person they're after. Sucks to be you if you work with stuff that destroys fingerprints, as even a small amount of damage means you'll be suspected of trying to hide your prints, so they'll make extra sure they have the right person before releasing you.
But never mind. So what if an innocent person spends months in jail over a computer error? That'll make the rest of the world safer. I just hope you're the next innocent person to have to spend time in prison, you might think otherwise.
If anything it will IMPROVE matters massively for those affected by the racist stop-and-search policies as the AI system won't have the inherent biases of the prejudicial police officers.
Have you considered who the system is being tested/designed by?
More importantly, it will help prevent the police from wasting their time and innocent people who happen to be "the wrong colour".
And yet the evidence from the world over says otherwise.
-
-
Friday 5th July 2019 02:06 GMT veti
The fact is, every time we see these statistics we only ever see one side, usually the false positive rate.
What was the false negative rate? Without knowing that, we don't know whether it's a good deal or not.
If you're screening a million people, and you get 42 alerts, of which 8 turn out to be correct - that means you've checked out 42 people, instead of a million, to identify your 8 targets. That's a pretty good deal.
If the original sample included another 1000 people who should have triggered matches, then - no, it's not good. But if it only included half a dozen or so, that's not bad.
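To put numbers on that, here is the back-of-envelope arithmetic in Python. The 42 alerts and 8 correct hits are the figures discussed in this thread; the two "watchlist people actually in the crowd" scenarios are the hypothetical ones from the comment above, not data from the trials.

```python
# Precision is known from the reported figures; recall depends entirely on how
# many watchlist people were really in the crowd, which nobody measured.
alerts = 42
true_hits = 8
precision = true_hits / alerts  # roughly 19% of alerts were right

# Two illustrative scenarios: only the 8 who were caught were present (the
# charitable case), versus "another 1000 people who should have triggered matches".
for actually_present in (8, 1008):
    recall = true_hits / actually_present
    missed = actually_present - true_hits
    print(f"watchlist people in crowd: {actually_present:4d} | "
          f"precision {precision:.0%} | recall {recall:.0%} | missed {missed}")
```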
-
Saturday 6th July 2019 23:28 GMT doublelayer
I'm not a downvoter, but your question is unanswerable and missing the point. Nobody knows how many people were present, as that wasn't tested. Also, most of us here, myself included, are not that happy having an 80% rate of innocent people being taken in for questioning on the back of a system that violates citizens' rights.
-
-
-
Thursday 4th July 2019 13:10 GMT Anonymous Coward
Re: Why do they keep saying it doesn't work?
"false non-match rates lower than 0.01, at a false match rate of 0.0001"
I'd be wary of those values; they come from a summary line that makes them look good, but quoted without understanding the underlying methods (or any reference to what they actually mean) they tell you nothing of value.
Does that mean that 1 in 100 criminals won't be spotted?
Or that, looking at 100 images of which 1 was a criminal, he will be missed?
And the 0.0001 - does that mean that, looking at 1,000,000 images, 100 of those people will be mistakenly arrested?
Understanding what it means in practice is very important, especially with dangerous 1984 shit like this.
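For illustration only, here is one plausible reading of those two headline rates (one match/no-match decision per passer-by), applied to the 1,000,000-image example above. The number of wanted people actually in the crowd is an assumption, not a trial figure.

```python
false_non_match_rate = 0.01   # a wanted face walks past and is not flagged
false_match_rate = 0.0001     # an innocent face is wrongly flagged as a match

crowd = 1_000_000             # passers-by scanned (from the example above)
wanted_present = 50           # wanted people actually in that crowd (assumed)

expected_missed = false_non_match_rate * wanted_present               # ~0.5 people
expected_false_alerts = false_match_rate * (crowd - wanted_present)   # ~100 people

print(f"wanted people missed:    {expected_missed:.1f}")
print(f"innocent people flagged: {expected_false_alerts:.0f}")
```

On that reading the 0.0001 figure does indeed mean roughly 100 innocent people flagged per million faces scanned; whether each of them ends up arrested is down to what the officer does with the alert.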
-
Thursday 4th July 2019 13:29 GMT Androgynous Cupboard
Re: Why do they keep saying it doesn't work?
Sizable upvote for Mr Coward and his correct understanding of the numbers
I can only add some context: roughly two million people attend the Notting Hill Carnival each year. At the false match rate quoted, that is 200 innocent people considered for arrest, based on nothing more than an algorithm, over one weekend in just one small part of the city.
-
Thursday 4th July 2019 14:44 GMT Anonymous Coward
Re: Why do they keep saying it doesn't work?
Thank you.
Also important is the officers' understanding of, and belief in, the accuracy.
For example, with the Jeremy Kyle shit show, people were being told that polygraphs (lie detectors (which they are not)) are 100% (or 90%+) accurate - a stupidly high figure. The reality is more like 60% (pretty much useless).
BUT, due to the lie of it being accurate, JK and his audiences berated someone to the point of suicide; the poor guy didn't know how crap the true rate was and therefore could not defend himself or understand why he had failed.
If police have the same wrongly placed confidence, no matter how much a suspect flagged by a false positive tries to argue his innocence, the officer will just treat him worse, thinking he's a liar.
That is a very dangerous situation. Imagine an innocent person falsely matched to a terrorist: guns get drawn, everyone gets twitchy and bang...
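A rough base-rate calculation shows why that misplaced confidence is so dangerous: when the people being looked for are rare, even a test that sounds accurate produces mostly false alarms. All the figures below are illustrative assumptions, not measured rates for any real system.

```python
sensitivity = 0.90        # assumed: flags 90% of genuine targets
false_alarm_rate = 0.01   # assumed: flags 1% of innocent passers-by
prevalence = 1 / 100_000  # assumed: genuine targets are very rare in a crowd

# Bayes' rule: probability that a flagged person really is a target
p_flag = sensitivity * prevalence + false_alarm_rate * (1 - prevalence)
p_target_given_flag = sensitivity * prevalence / p_flag

print(f"chance a flagged person is actually a target: {p_target_given_flag:.2%}")
# roughly 0.09% - so an officer treating a flag as near-certain proof is badly wrong
```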
-
Thursday 4th July 2019 21:07 GMT John Brown (no body)
Re: Why do they keep saying it doesn't work?
"That is a very dangerous situation, imagine a false match on a terrorist to an innocent person, guns get drawn, everyone gets twitchy and bang..."
They can even manage that without the use of expensive facial recog. Especially if you are Brazilian.
-
-
-
Thursday 4th July 2019 16:52 GMT Anonymous Coward
Re: Why do they keep saying it doesn't work?
https://www.theregister.co.uk/2018/05/15/met_police_slammed_inaccurate_facial_recognition/
https://www.independent.co.uk/news/uk/home-news/met-police-facial-recognition-success-south-wales-trial-home-office-false-positive-a8345036.html
"I have told both police forces that I consider such trials are only acceptable to fill gaps in knowledge and if the results of the trials are published and externally peer-reviewed. We ought to wait for the final report, but I am not surprised to hear that accuracy rates so far have been low as clearly the technology is not yet fit for use" - Professor Paul Wiles, UK Biometrics Commissioner,
-
-
Thursday 4th July 2019 16:02 GMT Electronics'R'Us
Legal Basis?
"The MPS maintains we have a legal basis for this pilot period and have taken legal advice throughout."
As you are a public body and such legal basis has no national security implications, we would love to see that actual legal basis and transcripts of your legal advice (and we definitely want to know who you got it from).
The reality appears to be that you sneaked something through without paying attention to all the relevant legislation, some parts of which may render all your expensive advice worthless.
IANAL, but I do know technology (and probably better than the entire Met leadership team put together). AI / ML / NN is just the latest fad / buzzword (I really must get this on the grids of my bullshit bingo cards) that will not be properly ready for a long time yet.
-
Friday 5th July 2019 07:21 GMT Warm Braw
Re: Legal Basis?
The Annual Report of the Biometrics Commissioner is quite interesting in this respect. His assessment seems to be that there is no settled legal basis because current legislation refers only to DNA and fingerprints (and his report is fairly damning about police handling of those). He specifically says that, while there has been some legislative interest in Scotland, there has been none in Westminster because of the Brexit stasis, and that the legality will be determined for the foreseeable future by case law:
Two civil liberty groups, Liberty and Big Brother Watch, have sought judicial review against South Wales Police, the Metropolitan Police and Home Office, challenging the legality of the police action. Their concern is that the mass scanning and processing of the images of people in this way in public places is not proportionate as it constitutes a significant interference with the Article 8 rights of those affected and that such interference is “not necessary in a democratic society” or “in accordance with the law” under the European Convention on Human Rights (ECHR). We shall have to await the court judgments, but these cases are probably only the first challenges to the police use of new biometric technologies in trials. Actual deployment of new biometric technologies may lead to more legal challenges unless Parliament provides a clear, specific legal framework for the police use of new biometrics as they did in the case of DNA and fingerprints.
-
-
Friday 5th July 2019 06:29 GMT Fruit and Nutcase
How to improve development
As part of testing the algorithms, they need to have a pool of people comprising the various stakeholders of the project, right up to the Home Secretary: senior officers from the Met, the designers and coders of the algorithms, etc.
Now, during testing (invite representatives from the community to act as test subjects), each time the system gets it wrong, the next one from the pool of stakeholders has to endure a full blast from a taser. That should make good prime time TV and the BBC could transmit it...
-
-
Friday 5th July 2019 12:52 GMT Intractable Potsherd
Re: Face masks
The trouble is, they will probably be regarded as attempting to avoid facial recognition, and end up with an arrest - like the bloke down south who was lifted for trying to hide his face (okay, technically they got him for swearing, but the precipitating event was him hiding his face).
If crime was very prevalent in British society, I might have a slightly different perspective on widely deployed facial recognition (though not likely very different - it is just plain wrong), but it isn't, and so I don't. The balance of rights v responsibilities doesn't favour applying hugely privacy-infringing technology for the small benefit it would give.
-
-
This post has been deleted by its author
-
Friday 5th July 2019 14:37 GMT PapaD
Optical masking
Turns up in films and some sci-fi books.
The idea is that you wear something that essentially makes it impossible for the camera to clearly see your face (either infrared LEDs - though most decent cameras do have IR filtering - or some other form of non-visible light).
Surely it can't be that long until someone comes up with something. I know I've seen at least one video of people using an image on a card that prevents at least one type of facial recognition software from detecting you're even there. No idea how viable that would be unless you knew for sure the weaknesses of the system you would be recorded by.
-
Friday 5th July 2019 19:08 GMT Anonymous Coward
Doubtful legality...
Like the researchers who installed a gait-recognition-enabled CCTV system they were creating in a university building without informing their dean, the relevant watchdog or campus security, or even checking with the university's ethics committee. They were mightily pissed off when a union rep called security who, as the only authorised CCTV operators on site, made them remove the kit.