
I guess they will buy from China
I guess government agencies will just buy facial recognition software from China. The only problem is that all Caucasians look alike to it.
Microsoft said on Thursday it will not sell facial-recognition software to the police in the US until the technology is regulated by federal law. The move comes after the Windows giant faced mounting pressure to denounce face-analyzing machine-learning systems as rivals, namely IBM and Amazon, publicly dumped the tech. In IBM …
Facial recognition is still the age-old, fundamental question of 'Chelsea Flower Show' v 'Notting Hill Carnival' and where it gets deployed based on 'intelligence', a.k.a. racial stereotypes. Let's not hide behind words here.
FR is a limited resource and will inevitably be deployed based on age-old racial stereotypes: 'targeted policing', disproportionately against certain sections of society, en masse.
If anything, FR should be deployed first and foremost at events like the Chelsea Flower Show (based on stereotypes, those attendees are the least likely to have encountered it), so that those attending fully understand the psychological effects of such an intrusive technology being operated by others with no fundamental consent to do so. If we're talking stereotypes, let these 'influential, law-abiding people' feel the heat and effects of constant surveillance of their every move, 24/7.
No one should have absolute control over others or the right to know their whereabouts when they are merely going about their day, whatever the colour of their skin, and certainly no one should be targeted with technology (as this 'limited resource' inevitably will be) based on the colour of their skin or background.
FR dehumanizes society's interactions and undermines 'Policing by consent'; it groups people by tick-boxing, putting them into boxes in more ways than one.
Governments sold mass CCTV surveillance in the 1980s on the basis that it 'protected you and your family'; we've now silently moved into the even more dystopian 'monitoring you for no good reason' stage.
Remember too, we're all paying for this tech, at a time when families are more concerned about sustenance, food and heat, and where their next meal is coming from.
Ooooo so you are telling me there is as much crime at Chelsea Flower Show as Notting Hill Carnival? (idiot)
Clearly you have never been to the Chelsea Flower Show.
Or you have
And have never been to the Notting Hill Carnival.
You cannot compare the two. Now, if you had said Notting Hill Carnival and Boomtown, then that might have made sense.
Now, there is bias in FR, but you cannot compare completely different events with different cultures and age ranges.
Ooooo so you are telling me there is as much crime at Chelsea Flower Show as Notting Hill Carnival?
There could be the potential.
If you consider that all narcotic drugs are made from plants...
What nefarious goings-on might be cloaked in staid respectability? More tea, vicar?
I've been to both. Different kinds of crime.
There's also the old problem of bias of finding crime where you're looking for it. If you look for criminals at the Notting Hill Carnival then you'll find some. If you don't look for criminals at the Chelsea Flower Show, then (funnily enough) you won't find any there.
Ooooo so you are telling me there is as much crime at Chelsea Flower Show as Notting Hill Carnival?
Maybe. Maybe not. That's really not the point being made.
The comparison was about detecting/identifying racial bias in where the tech gets applied: what (as you've just eloquently demonstrated) is an inherent, systemic, institutional bias toward certain types of event.
Just because something has a veneer of respectability doesn't mean crimes aren't occurring. Unbiased use of technology would apply it equally to all venues, not selectively based on potentially biased "intelligence".
Limited by what?
It's already been sold and installed in many places. Up to now, the only things limiting FR were budget and how paranoid the person in charge was.
And don't forget that all these companies working on FR are only halting sales to cops in the US. None of them are limiting sales outside the US.
I'm pretty sure FR will be installed somewhere in 2020, just not under US police authority.
"And don't forget that all these companies working on FR are just halting selling to cops in the US. None of them are limiting sales outside the US."
The corporate weasel words also do not state that they will not sell to or work with other entities such as TLAs.
In any case, facial recognition is not going to go away. Just as with ANPR - a technology so cheap that any two-bit organisation can afford and deploy it - FR will be improved and eventually become ubiquitous.
Amazed at the downvotes. You seem to have attracted the dork vote.
For the benefit of the hard of thinking, the point of the (I suspect) tongue-in-cheek recommendation to impose FR on the Chelsea Flower Show is that it would give the largely white, privileged, middle- and upper-class attendees a taste of what it's like to be under oppressive and intrusive surveillance.
If serious, then although I agree with the sentiments, I would have to disagree with the recommendation because it would be hypocritical to support it for that case while opposing it in all the others. And, in contrast, my own recommendation is that anyone involved in surveillance - whether they made the laws which enable it, or are paid to implement it - should be subject to a somewhat higher level of surveillance, 24-7, for as long as they remain involved in the policy or its implementation.
It remains the primary precondition required to overcome Accountability Theatre.
I had assumed that if there was FR at the Chelsea Flower Show, it would be there to protect the attendees and their womenfolk from the flowers.
There have been accounts of particularly vicious ones over the decades: https://www.youtube.com/watch?v=QnJkmGW8FYQ
> For the benefit of the hard of thinking, the point of the (I suspect) tongue-in-cheek recommendation to impose FR on the Chelsea Flower Show is that it would give the largely white, privileged, middle- and upper-class attendees a taste of what it's like to be under oppressive and intrusive surveillance.
I'm sure that was the OP's intent, but installing FR at CFS won't give the attendees a taste of being on the receiving end of oppression: their faces aren't in police databases, so they won't be subject to false-positive matches and any ensuing 'intelligence-led' preventative search or arrest.
And I'm sure there is plenty of crime at Chelsea: old ladies with large handbags and sharp nail scissors are notorious. I've heard one National Trust garden manager say that some bushes can be seen to visibly shrink on a day-by-day basis.
"it would give the largely white priveleged middle and upper class attendees a taste of what it's like to be under oppressive and intrusive surveillance."
Not so much. I may be wrong, but I doubt many attendees at the Chelsea Flower Show will even notice its use. I may be showing my bias, but I doubt there are that many on the police BOLO (be on the lookout) lists at that event.
On this side of the pond, in spite of local and state restrictions on FR and ANPR, it's the feds' TLAs that use it quite a bit. But in most cases, they don't trigger immediate apprehensions of suspects by either local or federal agencies. It's more for intelligence gathering. So those subject to the surveillance may never know.
Headline: After IBM axed its face-recog tech, the rest of the dominoes fell like a house of cards: Amazon and now Microsoft. Checkmate
I'm surprised that no-one has commented on the headline.
Do El Reg staff have a competition as to how many metaphors can be mixed into one headline?
It did make me smile though.
If only it were true, though. Three companies have pulled back, leaving various others to take up the task. Amazon backed off for a limited period. Microsoft wants someone to pass a law approving of them, probably for liability reasons. IBM may have actually done something, but as the article notes, they still leave some options open. This isn't checkmate. Maybe a bishop's been taken, but you can still lose under those conditions.
When IBM made this announcement, everybody here ragged on them for doing it for the "wrong" reasons rather than giving them credit both for going first and, so far, for being the only one to be unequivocal about it.
Now that we've got some other responses to compare against, does anybody want to grudgingly give some credit to IBM?
It seems the tech has a proper use. It should be DEVELOPED. Just because SOME people/entities MIGHT abuse it does NOT mean it should be BANNED or ABANDONED, especially because of PRESSURE from POLITICAL ACTIVISTS. Since when is there any kind of 'cancel culture' for TECH? I hope, NEVER. But it appears my hopes have been dashed by the INSANITY of the times...
And as one commenter already pointed out (using different words): if the USA, UK and EU countries do not develop this for legit reasons, CHINA will instead.
On a related note...
We have had an interesting series of debates and ongoing re-runs about encryption over the last two decades or so. One thing that came up early on was that encryption NOT developed by first-world nations would instead be developed by ROGUE nations, most likely to be used for nefarious purposes, and then we'd be behind the world in that particular tech. Similarly, facial recognition.
The "if we don't build it, they will" argument is usually stupid, and wouldn't you know it's stupid here too.
In the case of a new type of weapon, it can make some sense. If we don't build the big bomb, they will, and they will be free to use it on us because we don't have one of our own to convince them not to. It leads to an arms race, lots of fear, and usually a bunch of time and money wasted on weapons that nobody wants to use, but at least the logic makes sense.
In terms of cryptography, there's relatively little logic. If we don't make cryptography, then they will, and they will have privacy and we won't. There's a good reason there to invent our own, but it's not because we fear the ones who have it; it's because if we don't, we won't have the advantages that the tech provides. Having our own cryptography doesn't prevent them from having it too. Privacy for everyone.
For facial recognition, there's basically no logic. If we don't build a system to surveil our population, they will build one and surveil theirs. Er ... yeah. Many things that countries we don't like do are things we don't want done. I don't want to be surveilled. I also don't want the Chinese population to be surveilled. I can't do much about China, but just because China has decided to commit human rights abuses, that's no reason that we should start doing so too. Similarly, I bet you can find various countries that have much better torture equipment than most democracies. I don't want to compete on that either. When some country decides to invest in something harmful, the right solution is not "anything you can abuse we can abuse better" but instead to work against the places committing those abuses.
I think you missed the point about the tech itself being useful OUTSIDE of "big brother" surveillance... and AGAIN, just because some might ABuse it [such as a "big brother" gummint], you don't abandon the ENTIRE tech because of it [and, of course, because of any kind of activism].
And, keep in mind, "those places" ALREADY do 'abuses', and nothing WE do is stopping them...
So yeah, I want to see face recog for robots, so they can recognize YOU, your friends, and those you don't want inside your house [as an example].
"I think you missed the point"
I think you missed several. Let's take some parts of your comment here.
"the tech itself being useful OUTSIDE of "big brother" surveillance..."
Worry not; we will get to that.
"and AGAIN, just because some might ABuse it [such as a "big brother" gummint], you don't abandon the ENTIRE tech because of it [and, of course, because of any kind of activism]."
So, if something is prone to massive abuses, we shouldn't abandon it for that reason? We abandon lots of things for that reason. We restrict lots of things for that reason. If something has a bunch of dangers, we tend to try to prevent it from being used in a dangerous way. If we can't develop a safe way to use the thing, we prevent its use. We did this with dangerous chemicals of many types. We did it with weapons which we deem too extreme for individuals to operate. We do it all the time.
"And, keep in mind, "those places" ALREADY do 'abuses', and nothing WE do is stopping them..."
Yes. As I said, I can't do much about that. However, that someone else does so isn't a good argument for us doing so as well. So few people are paying attention to that at all, and some other people see a repressive government and think that's the right way to go.
"So yeah, I want to see face recog for robots, so they can recognize YOU, your friends, and those you don't want inside your house [as an example]."
Terrible example. We're talking about completely different tech:
Your example: Recognize a face from a small list of pre-learned faces, belonging to an individual who can stay in one place during the scan, using a camera that can move to get a more accurate image, with very little background noise.
The tech under discussion: Recognize a face from a massive set of faces (several thousand to millions), belonging to people in a crowded area, where the person does not move to show you most of their face, where the camera cannot move to capture a more accurate picture because it would thereby miss the people it turned away from, and with massive amounts of background noise.
One is very different from the other. The first has a few uses, and we actually have that tech already to a great extent. The latter has few uses that don't prove oppressive.
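To make the scale part of that difference concrete, here is a minimal sketch in plain NumPy (nothing vendor-specific, and no real face model): faces are treated abstractly as embedding vectors compared against a gallery with a cosine-similarity threshold, and the vectors are random stand-ins I've made up purely for illustration. It only captures the gallery-size and crowd-size problem; the pose, camera and image-quality issues come on top of it.

# Rough illustration only: random 128-d unit vectors stand in for face
# embeddings, so every "match" here is by construction a false match.
# Real systems have far better per-comparison false-match rates, but the
# same scaling applies: more watchlist entries x more faces scanned
# = more false positives.

import numpy as np

rng = np.random.default_rng(0)

def fake_embeddings(n, dim=128):
    # Stand-in for a face-embedding model: n random unit vectors.
    v = rng.normal(size=(n, dim))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def false_matches(gallery, probes, threshold, chunk=2048):
    # Count probe faces whose best cosine similarity against the gallery
    # clears the threshold. None of the probes are enrolled, so every
    # hit is a false match. Chunked to keep the similarity matrix small.
    hits = 0
    for start in range(0, len(probes), chunk):
        sims = probes[start:start + chunk] @ gallery.T
        hits += int(np.sum(sims.max(axis=1) >= threshold))
    return hits

THRESHOLD = 0.35  # arbitrary illustrative value, not a tuned operating point

# Household robot: a handful of enrolled faces, a handful of visitors.
household = fake_embeddings(5)
visitors = fake_embeddings(20)
print("household false matches:", false_matches(household, visitors, THRESHOLD))

# Mass surveillance: a large watchlist, a day's worth of passers-by.
watchlist = fake_embeddings(2_000)
crowd = fake_embeddings(10_000)
print("crowd false matches:    ", false_matches(watchlist, crowd, THRESHOLD))

With a five-face household list and twenty visitors the sketch should report no false matches at all, while scanning ten thousand faces against a two-thousand-entry watchlist should report false hits by the hundred: the same arithmetic at two very different scales, before lighting, pose or camera placement even enter the picture.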