After IBM axed its face-recog tech, the rest of the dominoes fell like a house of cards: Amazon and now Microsoft. Checkmate

Microsoft said on Thursday it will not sell facial-recognition software to the police in the US until the technology is regulated by federal law. The move comes after the Windows giant faced mounting pressure to denounce face-analyzing machine-learning systems as rivals, namely IBM and Amazon, publicly dumped the tech. In IBM …

  1. Anonymous Coward
    Anonymous Coward

    I guess they will buy from China

    I guess government agencies will just buy facial recognition software from China. The only problem is that all Caucasians look alike to it.

    1. Anonymous Coward
      Anonymous Coward

      Re: I guess they will buy from China

      Worse

      https://en.wikipedia.org/wiki/Clearview_AI

      1. Drew Scriver

        Re: I guess they will buy from China

        That article doesn't contain any accuracy percentages other than those from Clearview. Too bad Wikipedia doesn't provide a way to show what edits were made to an article, and by whom...

    2. Anonymous Coward
      Anonymous Coward

      Re: I guess they will buy from China

      *cough* NEC *cough*

      Used in China, London and a number of US police departments.

  2. Anonymous Coward
    Big Brother

    Not quite equivalent

    IBM said they would stop all work on facial recognition.

    Microsoft said they would stop all sales until there is federal regulation.

    Amazon only suspended sales for a year.

    I suppose any steps are better than none but I know how I would rank them.

    1. Anonymous Coward
      Anonymous Coward

      Re: Not quite equivalent

      "Amazon only suspended sales for a year."

      So the deployments sold to date are still operating then?

      Has all the hallmarks of a mealy-mouthed corporate statement.

      1. whileI'mhere

        Re: Not quite equivalent

        Amazon suspends sales for a year, during which they can figure out how to make it work when everyone is wearing masks, so it becomes 'useful' again.

        1. Paul Hovnanian Silver badge

          Re: Not quite equivalent

          The Chinese have solved this problem, according to a few sources. So halt US development of this technology and their head start will be insurmountable.

      2. Only me!
        Unhappy

        Re: Not quite equivalent

        The police department with Amazon Prime will be most miffed.

    2. Drew Scriver

      Re: Not quite equivalent

      We'll all be wearing face masks for the next year or so anyway. Wonder if that's what the brass at MS, Amazon, and others were thinking too.

    3. Abdul-Alhazred

      Whatever they say now, I suspect they are waiting for a Democrat in the White House.

      That it will be so in a year is not a sure thing, but it's a good bet.

  3. Anonymous Coward
    Anonymous Coward

    FR is still the age old: 'Chelsea Flower Show' v 'Notting Hill Carnival'?, where it gets deployed.

    Facial recognition is still the age-old fundamental question of 'Chelsea Flower Show' v 'Notting Hill Carnival': where will it get deployed, based on 'intelligence', aka racial stereotypes? Let's not hide behind words here.

    FR is a limited resource and will inevitably be deployed according to age-old racial stereotypes: 'targeted policing' directed disproportionately, and en masse, against certain sections of society.

    If anything, FR should be deployed at events like the Chelsea Flower Show first and foremost (based on those same stereotypes, the attendees are the least likely to have encountered it), so that those attending fully understand the psychological effects of such an intrusive technology operated by others who have no fundamental consent to operate it. If we're talking stereotypes, let these 'influential law-abiding people' feel the heat of constant surveillance of their every move, 24/7.

    No one should have absolute control over others, or the right to know the whereabouts of a person merely going about their day, whatever the colour of their skin, and certainly no one should be targeted with technology (as this 'limited resource' inevitably will be) based on the colour of their skin or background.

    FR dehumanizes society's interactions and undermines 'policing by consent'; it groups people by tick-boxing, putting them into boxes in more ways than one.

    Governments sold en-masse CCTV surveillance in the 1980s on the basis that it 'protected you and your family'; we've now silently moved into the even more dystopian stage of 'monitoring you' for no good reason.

    Remember too, we're all paying for this tech at a time when families are more concerned about sustenance, food and heat, and where their next meal is coming from.

    1. IneptAdept

      Re: FR is still the age old: 'Chelsea Flower Show' v 'Notting Hill Carnival'?

      Ooooo so you are telling me there is as much crime at Chelsea Flower Show as Notting Hill Carnival (idiot)

      Clearly you have never been to the Chelsea Flower Show.

      Or you have

      And have never been to the Notting Hill Carnival

      You cannot compare the two. Now, if you had said Notting Hill Carnival and Boomtown, then that may have made sense.

      Now, there is bias in FR, but you cannot compare completely different events with different cultures and age ranges.

      1. Teiwaz
        Joke

        Re: FR is still the age old: 'Chelsea Flower Show' v 'Notting Hill Carnival'?

        Ooooo so you are telling me there is as much crime at Chelsea Flower Show as Notting Hill Carnival

        There could be the potential.

        If you consider that all narcotic drugs are made from plants....

        What nefarious goings on might be cloaked in staid respectability and more tea vicar.

      2. batfink

        Re: FR is still the age old: 'Chelsea Flower Show' v 'Notting Hill Carnival'?

        I've been to both. Different kinds of crime.

        There's also the old problem of bias of finding crime where you're looking for it. If you look for criminals at the Notting Hill Carnival then you'll find some. If you don't look for criminals at the Chelsea Flower Show, then (funnily enough) you won't find any there.

      3. Jimmy2Cows Silver badge

        Re: Ooooo...

        Ooooo so you are telling me there is as much crime at Chelsea Flower Show as Notting Hill Carnival

        Maybe. Maybe not. That's really not the point being made.

        The comparison was about detecting/identifying racial bias in where the tech gets applied, which (as you've just eloquently demonstrated) reflects an inherent, systemic, institutional bias toward certain types of event.

        Just because something has a veneer of respectability doesn't mean crimes aren't occurring. Unbiased use of technology would apply it equally to all venues, not selectively based on potentially biased "intelligence".

        1. batfink

          Re: Ooooo...

          And the whole point here is that the system is supposed to identify criminals, not identify crimes.

          So yes, there may be more crime at the Notting Hill Carnival, but there aren't necessarily more criminals there.

        2. wdce

          Re: Ooooo...

          I thought the point was to see how the 'green-fingered' would enjoy the intrusion.

    2. Pascal Monett Silver badge

      Re: FR is a limited resource

      Limited by what?

      Up to now, it's been sold and installed in many places; the only things limiting FR were budget and how paranoid the person in charge was.

      And don't forget that all these companies working on FR are just halting selling to cops in the US. None of them are limiting sales outside the US.

      I'm pretty sure FR will be installed somewhere in 2020, just not under US police authority.

      1. crayon

        Re: FR is a limited resource

        "And don't forget that all these companies working on FR are just halting selling to cops in the US. None of them are limiting sales outside the US."

        The corporate weasel words also do not state that they will not sell to or work with other entities such as TLAs.

        In any case, facial recognition is not going to go away. Just as with ANPR - a technology now so cheap that any two-bit organisation can afford to deploy it - FR will be improved and eventually become ubiquitous.

    3. Harry Stottle

      Re:FR for the 'Chelsea Flower Show'

      Amazed at the downvotes. You seem to have attracted the dork vote.

      For the benefit of the hard of thinking, the point of the (I suspect) tongue-in-cheek recommendation to impose FR on the Chelsea Flower Show is that it would give the largely white, privileged middle- and upper-class attendees a taste of what it's like to be under oppressive and intrusive surveillance.

      If serious, then although I agree with the sentiments, I would have to disagree with the recommendation because it would be hypocritical to support it for that case while opposing it in all the others. And, in contrast, my own recommendation is that anyone involved in surveillance - whether they made the laws which enable it, or are paid to implement it - should be subject to a somewhat higher level of surveillance, 24-7, for as long as they remain involved in the policy or its implementation.

      It remains the primary precondition required to overcome Accountability Theatre.

      1. deadlockvictim

        Re: Re:FR for the 'Chelsea Flower Show'

        I had assumed that if there was FR at the Chelsea Flower Show, it would be there to protect the attendees and their womenfolk from the flowers.

        There have been accounts of particularly vicious ones over the decades: https://www.youtube.com/watch?v=QnJkmGW8FYQ

      2. Anonymous Coward
        Anonymous Coward

        Re: Re:FR for the 'Chelsea Flower Show'

        > For the benefit of the hard of thinking, the point of the (I suspect) tongue-in-cheek recommendation to impose FR on the Chelsea Flower Show is that it would give the largely white, privileged middle- and upper-class attendees a taste of what it's like to be under oppressive and intrusive surveillance.

        I'm sure that was the OP's intent, but installing FR at CFS won't give the attendees a taste of being on the receiving end of oppression: their faces aren't in police databases, so they won't be subject to false-positive matches and any ensuing 'intelligence-led' preventative search or arrest.

        And I'm sure there is plenty of crime at Chelsea: old ladies with large handbags and sharp nail scissors are notorious. I've heard one National Trust garden manager say that some bushes can be seen to visibly shrink on a day-by-day basis.

      3. Paul Hovnanian Silver badge

        Re: Re:FR for the 'Chelsea Flower Show'

        "it would give the largely white priveleged middle and upper class attendees a taste of what it's like to be under oppressive and intrusive surveillance."

        Not so much. I may be wrong, but I doubt many attendees at the Chelsea Flower show will even notice its use. I may be showing my bias, but I doubt there are that many on the police BOLO (be on look out) lists at that event.

        On this side of the pond, in spite of local and state restrictions on FR and ANPR, its the fed's TLAs that use it quite a bit. But in most cases, they don't trigger immediate apprehensions of suspects by either local or federal agencies. It's more for intelligence gathering. So those subject to the surveillance may never know.

  4. deadlockvictim

    Headline

    Headline: After IBM axed its face-recog tech, the rest of the dominoes fell like a house of cards: Amazon and now Microsoft. Checkmate

    I'm surprised that no-one has commented on the headline.

    Do El Reg staff have a competition as to how many metaphors can be mixed in one headline?

    It did make me smile though.

    1. Robert Grant

      Re: Headline

      It's from Futurama. Which was nice.

      1. Anonymous Coward
        Anonymous Coward

        Re: Headline

        Kiff! Show them the medal I won!

    2. doublelayer Silver badge

      Re: Headline

      If only it were true, though. Three companies have pulled back, leaving various others to take up the task. Amazon backed off for a time period. Microsoft wants someone to pass a law approving of them, probably for liability reasons. IBM may have actually done something, but as the article notes, they still leave some options open. This isn't checkmate. Maybe a bishop's been taken, but you can still lose from that position.

  5. fedoraman

    What about in the UK?

    How much is FR used in the UK, does anyone know?

    I saw a story recently about how the Met police said that the wearing of facemasks in public would interfere with their FR technology.

    1. Roger Greenwood

      Re: What about in the UK?

      They should all just up their game. With everyone wearing face masks for the foreseeable future (years), we need that system where you zoom in on their eyes and do a retinal scan. I'm sure I've seen that in a film, so it must be true :-)

      1. Jimmy2Cows Silver badge
        Joke

        Re: seen that in a film so it must be true

        That was clearly one of those documentaries beamed from the future, disguised as a film.

  6. Anonymous Coward
    Anonymous Coward

    Microsoft said on Thursday

    Why should I trust MS (and other giants), when they've lied before?

    1. Sandtitz Silver badge
      Facepalm

      Re: Microsoft said on Thursday

      "why should I trust MS (and other giants), when they lied before?"

      Yeah... which person or corporation has never lied?

  7. sabroni Silver badge
    Mushroom

    re: Amazon, publicly dumped the tech.

    No, that's exactly what they've not fucking done.

    "The register, kissing the hand that feeds IT" doesn't have quite the same ring.

    What the fuck is going on around here?

    1. DJV Silver badge

      Re: re: Amazon, publicly dumped the tech.

      The change from .co.uk to .com has also resulted in a softening of the bite somewhat!

  8. Anonymous Coward
    Anonymous Coward

    Apologies?

    When IBM made this announcement, everybody here ragged on them for doing it for the "wrong" reasons rather than giving them credit both for going first and, so far, for being the only one to be unequivocal about it.

    Now that we've got some other responses to compare against, does anybody want to grudgingly give some credit to IBM?

    1. Jimmy2Cows Silver badge

      Re: Apologies?

      Not sure anyone deserves credit for doing the right thing due to political point-scoring, bad optics and potential for consumer backlash. Especially not IBM.

  9. Drew Scriver

    And El Reg runs an ad for an iris-scanner with the article...

    I'm reading this at work, where ads are not blocked.

    The ad at the bottom of this article (for me, at least) is for a "touchless, fast, accurate" iris scanning system...

  10. Anonymous Coward
    Anonymous Coward

    Mr Angry says...

    Is that a white woman blacking up??? BOYCOTT THE REGISTER NOW!!!

  11. bombastic bob Silver badge
    Unhappy

    What about face recognition for ROBOTS?

    It seems the tech has a proper use. It should be DEVELOPED. Just because SOME people/entities MIGHT abuse it, does NOT mean it should be BANNED or ABANDONED, especially because of PRESSURE from POLITICAL ACTIVISTS. Since when is there any kind of 'cancel culture' for TECH? I hope, NEVER. But it appears my hopes have been dashed by the INSANITY of the times...

    And as one commenter already pointed out (using different words): if the USA and UK and EU countries do not develop this for legit reasons, CHINA will instead.

    On a related note...

    We have had an interesting series of debates and ongoing re-runs about encryption over the last 2 decades or so. One thing that came up early on was that encryption NOT developed by 1st world nations would instead be developed by ROGUE nations, most likely to be used for nefarious purposes, and then we'd be behind the world in that particular tech. Similarly, facial recognition.

    1. doublelayer Silver badge

      Re: What about face recognition for ROBOTS?

      The "if we don't build it, they will" argument is usually stupid, and wouldn't you know it's stupid here too.

      In the case of a new type of weapon, it can make some sense. If we don't build the big bomb, they will, and they will be free to use it on us because we don't have one of our own to convince them not to. It leads to an arms race, lots of fear, and usually a bunch of time and money wasted on weapons that nobody wants to use, but at least the logic makes sense.

      In terms of cryptography, there's relatively little logic. If we don't make cryptography, then they will, and they will have privacy and we won't. There's a good reason there to invent our own, but it's not because we fear the ones who have it; it's because if we don't, we won't have the advantages that the tech provides. Having our own cryptography doesn't prevent them from having it too. Privacy for everyone.

      For facial recognition, there's basically no logic. If we don't build a system to surveil our population, they will build one and surveil theirs. Er ... yeah. Many things that countries we don't like do are things we don't want done. I don't want to be surveilled. I also don't want the Chinese population to be surveilled. I can't do much about China, but just because China has decided to commit human rights abuses, that's no reason that we should start doing so too. Similarly, I bet you can find various countries that have much better torture equipment than most democracies. I don't want to compete on that either. When some country decides to invest in something harmful, the right solution is not "anything you can abuse we can abuse better" but instead to work against the places committing those abuses.

      1. bombastic bob Silver badge
        Meh

        Re: What about face recognition for ROBOTS?

        I think you missed the point about the tech itself being useful OUTSIDE of "big brother" surveillance... and AGAIN, just because some might ABuse it [such as a "big brother" gummint], you don't abandon the ENTIRE tech because of it [and, of course, because of any kind of activism].

        And, keep in mind, "those places" ALREADY do 'abuses', and nothing WE do is stopping them...

        So yeah, I want to see face recog for robots, so they can recognize YOU, your friends, and those you don't want inside your house [as an example].

        1. doublelayer Silver badge

          Re: What about face recognition for ROBOTS?

          "I think you missed the point"

          I think you missed several. Let's take some of the parts of your comment here.

          "the tech itself being useful OUTSIDE of "big brother" surveillance..."

          Worry not; we will get to that.

          "and AGAIN, just because some might ABuse it [such as a "big brother" gummint], you don't abandon the ENTIRE tech because of it [and, of course, because of any kind of activism]."

          So, if something is prone to massive abuses, we shouldn't abandon it for that reason? We abandon lots of things for that reason. We restrict lots of things for that reason. If something has a bunch of dangers, we tend to try to prevent it from being used in a dangerous way. If we can't develop a safe way to use the thing, we prevent its use. We did this with dangerous chemicals of many types. We did it with weapons which we deem too extreme for individuals to operate. We do it all the time.

          "And, keep in mind, "those places" ALREADY do 'abuses', and nothing WE do is stopping them..."

          Yes. As I said, I can't do much about that. However, that someone else does so isn't a good argument for us doing so as well. So few people are paying attention to that at all, and some other people see a repressive government and think that's the right way to go.

          "So yeah, I want to see face recog for robots, so they can recognize YOU, your friends, and those you don't want inside your house [as an example]."

          Terrible example. We're talking about completely different tech:

          Your example: Recognize a face from a small list of pre-learned faces, belonging to an individual who can stay in one place during the scan, using a camera that can move to get a more accurate image, and without very much background noise.

          The tech under discussion: Recognize a face from a massive set of faces (several thousand to millions), belonging to people in a crowded area, where the person does not turn to show you most of their face, where the camera cannot move to capture a more accurate picture because it would thereby miss the people it turned away from, and with massive amounts of background noise.

          One is very different from the other. The first has a few uses, and we actually have that tech already to a great extent. The latter has few uses that don't prove oppressive.
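
          To make that scale difference concrete, here's a minimal sketch (a toy, not anyone's actual product): random unit vectors stand in for real face embeddings, and the similarity threshold is a made-up number set deliberately low so the effect is visible. The 1:few "home robot" case matches each passer-by against a handful of enrolled faces; the 1:N "watchlist" case matches them against tens of thousands of identities, and the false-match exposure grows with the size of that gallery.

            # Illustrative only: random vectors stand in for real face embeddings,
            # and the threshold is an arbitrary toy value chosen to expose the effect.
            import numpy as np

            rng = np.random.default_rng(0)

            def embed(n, dim=128):
                """Fake 'face embeddings': random unit vectors."""
                v = rng.normal(size=(n, dim))
                return v / np.linalg.norm(v, axis=1, keepdims=True)

            def best_match(probe, gallery, threshold=0.3):
                """Best cosine match above threshold, else None."""
                scores = gallery @ probe          # cosine similarity (unit vectors)
                idx = int(np.argmax(scores))
                return (idx, float(scores[idx])) if scores[idx] >= threshold else None

            household = embed(5)        # 1:few -- a handful of enrolled faces
            watchlist = embed(20_000)   # 1:N   -- a large watchlist of identities

            # Strangers walk past the camera; none of them is enrolled anywhere.
            strangers = embed(1_000)
            false_home  = sum(best_match(s, household) is not None for s in strangers)
            false_watch = sum(best_match(s, watchlist) is not None for s in strangers)

            print(f"spurious matches vs 5 enrolled faces:     {false_home}/1000")
            print(f"spurious matches vs 20,000 watchlist IDs: {false_watch}/1000")

          Same matcher, same threshold: against five enrolled faces almost nobody is wrongly flagged, while against a 20,000-strong watchlist nearly every stranger throws up some spurious high score. Real systems tune their thresholds per deployment, but that trade-off between catching faces in a crowd and false matches scaling with the size of the watchlist is inherent to the 1:N case.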
