US cop body cam maker says it won't ship face-recog tech in its kit? Due to ethics? Did we slip into a parallel universe?

Axon, the largest supplier of body cameras to America's cops, will not add facial-recognition technology to its gear anytime soon, it announced Thursday. Formerly known as Taser, Axon had asked its AI and Policing Technology Ethics Board – made up of engineers, social scientists, and lawyers – to mull over the impact of …

  1. Cincinnataroo

    You say that it's bad for women of colour.

    On the contrary, it's good for them, 'cause it messes up more often for them. Particularly good if you're a criminal woman of colour, in fact.

    If it goes to court your lawyer just says: "They used the Axon Disaster to identify my client m'lud. Nuff said. We're launching a counter suit for knowing harassment of the generally innocent and being dick heads."

    1. Kernel

      If you don't accidentally get shot first 'just in case', and it goes to court, your lawyer just says: "They used the Axon Disaster to identify my client m'lud. Nuff said. We're launching a counter suit for knowing harassment of the generally innocent and being dick heads."

      FTFY.

      There is little value to the victim in a posthumous apology, no matter how sincere it is.

  2. veti Silver badge

    Note this is only possible because Axon is in a privileged position within the market: it has a near-monopoly on this equipment and cozy relationships with police departments.

    If anyone were realistically placed to compete, they'd see this as an opportunity. But it'd take a competitor at least a couple of years to build themselves into that sort of position, and long before then Axon would see them coming and, presumably, reassess their ethics.

  3. Anonymous Coward
    Anonymous Coward

    Ethics. Now there's a concept.

    Is Axon's justification:

    1. Because facial recognition in supposedly free society is evil?

    2. Because facial recognition, whilst legal, is controversial, maybe even taboo?

    3. Because it irrevocably alters the power balance between the state and the general population in favour of the state?

    Or does it, in reality, have absolutely nothing to do with ethics? Facial recognition technology is immature, so they are liable to get sued off the planet when their software makes a mistake. Is lying unethical? Or only if you get caught?

  4. Anonymous Coward
    Anonymous Coward

    "Ethical" technology

    I often wonder: if technological advances eventually make detection of any criminal activity possible, and zero tolerance of crime therefore becomes a potential reality, does the concept of a "free society" become moot at that point, leaving us in a dictatorship run by those with the power of enforcement? No different from non-free societies today.

    Is that really what we want?

    Is it that resistance really is futile and the future is actually Borg?

  5. Pascal Monett Silver badge
    WTF?

    Wait a minute

    How come, all of a sudden, we have an important company saying that face recognition tech is not reliable enough?

    I seem to recall a slew of articles this year touting how FR is being implemented in plenty of places, mostly airports, and there were glowing articles about them.

    Is all that a bunch of malarky, then? Or have I somehow unknowingly been Fringed into a parallel universe where reality is suddenly better?

    Or did someone patch the Matrix?

    1. Jimmy2Cows Silver badge

      Re: Wait a minute

      Sounds more like a litigation risk being spun as 'ethics'.

    2. Lee D Silver badge

      Re: Wait a minute

      FR has never been reliable.

      The Met and other UK police have abandoned trials because there were just too many false positives. Sure, you "catch" people, but no better than flipping a coin.

      At one point it had been deployed for two years and not one arrest had resulted from the facial recognition side of it.

      Look around, everyone's been saying that FR is useless. Of course it is.

      "Liberty believes South Wales Police has used facial recognition the most of the three forces, at about 50 deployments, including during the policing of the Champions League final in Cardiff in June 2017, where it emerged that, of the 2,470 potential matches made, 92% (2,297) were wrong."

      It's basically tolerated, because it's just so shite that it's not really a privacy threat at all.

      The second someone says "my incredibly-high-processing-requirement, fixed-biometric-base, easily-fooled, statistically-insignificant, false-positives-as-well-as-false-negatives, computer-vision-based system is secure", just laugh at them. Fingerprints: not so easily fooled, but still computer-vision based and thus far from infallible. DNA just about qualifies, but isn't as perfect as you think. Facial recognition: laughable, in the same realm as "vocal-signature recognition". Even iris recognition isn't really that good, and is incredibly inconvenient.

      If you see it used in Hollywood movies, it's a load of junk (because they only use stuff that looks pretty, not what people actually *use*).

      If you can pay with it in a casino, it's probably at least somewhat effective (e.g. credit card smart chips, etc.).

      If you wouldn't trust your servers using it to encrypt their bootloader, steer clear (i.e. literally everything except long passphrases and security keys). Even there, far from perfect.

      Can you imagine putting facial recognition as the "admin" login to your company's private servers, for example? You wouldn't last long.

    3. JohnFen

      Re: Wait a minute

      "Is all that a bunch of malarky then ?"

      Yes, it's a bunch of malarky. The current state of face recognition is better than it ever has been before, but it's still pretty poor. Those news articles that extol how great it is are wrong (and usually they're just PR pieces disguised as news reports -- those are actively lying).

    4. martinusher Silver badge

      Re: Wait a minute

      I'd guess the problem is really that they don't have access to appropriate technology at the price point they're willing to pay. You give the Marketing people a sow's ear and they make an ethical silk purse from it. Win/win.

      As for facial recognition technology not being particularly accurate, it's probably at least as good as the currently used eyeball system -- eyewitness recognition is notoriously bad, but we still convict people of crimes based on eyewitness testimony. Facial recognition at least has known errors, so we know that if it gives a positive match then it's just an indicator, not proof. The danger then is just the temptation to fit the recognized person up, but I suppose that will never happen in practice, since it never happens with other evidence.

      The biggest danger is that facial recognition not only works quite well, but is likely to get a whole lot better very quickly. It's already proving to be usable for immigration control (there are pilot programs in the US). But I'd guess that we don't want to alarm people too much, much less start a fashion in IR-confusing clothing and body decor.

      1. JohnFen

        Re: Wait a minute

        "its probably at least as good as the currently used eyeball system"

        We aren't really using the "eyeball system" for the things we're using face recognition for.

        "we know that if it gives a positive match then its just an indicator, not proof"

        In the US, anyway, that doesn't matter. Law enforcement will treat it as proof and act accordingly.

        "Its already proving to be usable for immigration control"

        It is? I haven't seen that evidence.

  6. Mike Moyle

    Of course...

    ...not shipping cameras with FR now just means that it can be provided as an add-on at a Slight Additional Charge™ later.

    Cynical...? Moi...?

  7. DropBear

    Assuming we as a race somehow inexplicably fail to destroy ourselves for long enough, there will come a moment in the (far, far) future when access to information by those who govern people - things like what should NOT be collected, what should NOT be searchable under ANY conditions - is strictly regulated and limited as much as possible, with draconian rigour. This will happen* as a reaction to the utterly, unspeakably horrible atrocities committed earlier, enabled by the large-scale availability and collection of any and all data about everything - a thing that needs to happen first, so people have something to point at in horror and say "never again**!".

    * That is, of course, assuming it doesn't all go the Orwell route where the (im)balance of power breaks so completely that removing said boot from people's faces becomes truly impossible forevermore.

    ** Also assuming we don't just flat out end up denying any of it ever happened after a single generation or two, encouraged by those who very much like having all that data and immense power and would very much love to see it happen again.

    1. Anonymous Coward
      Anonymous Coward

      Impressive amount of optimism in your post.

      Unfortunately, it doesn't chime with any available evidence.

  8. Anonymous Coward
    Anonymous Coward

    Ethics?

    Ithnt Ethics that plaith Northeatht of London?

    Yeth, I have a thpeech impediment. Why do you athk?
