Error prone, insecure, inevitable: Say hello to today's facial recog tech

Facial recognition technology represents a valuable, and likely inevitable, method of identification for cops and Feds. Unfortunately, it's largely unregulated, error prone, and insecure. During a hearing held by the US House Committee on Oversight and Government Reform on Wednesday, Chairman Jason Chaffetz (R-Utah) …

  1. Paul Crawford Silver badge

    What?

    " the faces of 125 million US adults have been stored in criminal facial recognition databases"

    Is my arithmetic, etc, wrong or is that about half the US adult population?

    1. vir

      Re: What?

      The article says "law enforcement face recognition affects over 117 million American adults" - so yes, roughly half (rough arithmetic at the end of this thread) - and it looks like most of them are just driver's license photo databases from the states that allow it.

      1. Anonymous Coward
        Anonymous Coward

        Re: What?

        If your state tells you you can't smile for your driver's license photo, that's because it is going in a facial recognition database.

        1. Anonymous Coward
          Anonymous Coward

          Re: What?

          If your state tells you you can't smile for your driver's license photo, that's because it is going in a facial recognition database.

          That's probably why they have to keep life so depressing for us: if you were laughing, the facial recognition wouldn't work. Bring on the beer, I say..

          :)

          1. Anonymous Coward
            Anonymous Coward

            Re: What?

            So the past 50 years of making the DMV the most miserable place on Earth was a deliberate strategy, to prepare for the coming of putting DL photos in a facial recognition database? I guess I need to rethink my opinion of the intelligence of DMV employees, they were obviously way ahead of the rest of us!
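
    For the curious, a rough sanity check of the figures in this thread. The ~250 million adult population is an approximation for the mid-2010s, not a number from the article:

        # Rough check of the headline figures, assuming ~250M US adults
        # (an approximate mid-2010s figure, not taken from the article).
        us_adults = 250_000_000
        in_criminal_fr_databases = 125_000_000   # figure quoted above
        affected_by_fr_searches = 117_000_000    # figure from the article
        print(f"{in_criminal_fr_databases / us_adults:.0%} of US adults")  # -> 50%
        print(f"{affected_by_fr_searches / us_adults:.0%} of US adults")   # -> 47%

    So the arithmetic checks out: it is indeed about half the US adult population.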

  2. James 51
    Holmes

    I want to report a crime: I was mugged and he punched me in the face.

    Hmm, my body camera is telling me you are the infamous thief Big Nose. You're nicked.

    What!

  3. HAL-9000

    Those aren't mistakes!

    Chaffetz said the technology makes mistakes, with one in seven FBI facial recognition searches incorrectly returning a list of innocent people as matches, despite the presence of the actual matching image in the database.

    That is proof of the malevolent intent of some controlling AI, which obviously didn't like the look of those others that were 'matched' ;)

  4. Milton

    The Usual Idiots

    Predictably, this is driven by the same vested interests who brought you mass surveillance (so "mass" that agencies are drowning in so much data they cannot analyse it usefully) and a vast assortment of other wonderful technologies whose false-positive rate is so high as to render them not just ineffective but downright counterproductive.

    The Usual Suspects are -

    1. The marketurds who help to hype and sell expensive tech and gadgets like body scanners, radiation detectors, facial recog and the rest.

    2. Security and law enforcement agencies who willingly swallow this crap because even if it doesn't work, it inflates budgets and builds empires.

    3. The politicians whose childlike credulousness seems to grow from one year to the next, who nod meekly every time they're asked for cash by the previously-mentioned, and who, after all, are much happier to take dollars that might help ordinary people (say, for education and health) and use them for controlling or killing them instead, via surveillance and weapons programs. It appears to be a brain-stem reflex, though I use "brain" cautiously ....

    So you end up with mass rollout of facial recognition which is so pitifully easy to defeat and deceive that it isn't even funny. I won't go into all the scores of simple tricks widely known for fooling recog systems - anyone who knows their functional principles, which will include most Reg readers, can figure it out - but we're already into the next generation of the arms race with systems attempting upgrades to recognise minor prostheses, various types of contact lens, hair tint, nose/cheek inserts, and even digitally designed reflective makeup ... things will get worse for a long time, until someone eventually realises (as if for the first time) that it's much more cost-effective to deploy focused good-ol'-shoe-leather humint and police work than to keep relying on lazy, expensive and ultimately self-defeating technological magic bullets.

    The contractors and military "test" things like anti-missile systems or spectacularly incapable trillion-dollar stealth fighters by offering up ideal unrealistic scenarios almost guaranteed to issue a "pass". Facial recog and mass surveillance fall into the same category and we shouldn't be surprised when it turns out that when you most need these magical technologies - they're pretty much useless.

    1. paulf
      Boffin

      Re: The Usual Idiots

      @Milton: "1. The marketurds who help to hype and sell expensive tech and gadgets like body scanners, radiation detectors, facial recog and the rest.

      2. Security and law enforcement agencies who willingly swallow this crap because even if it doesn't work, it inflates budgets and builds empires.

      3. The politicians whose childlike credulousness seems to grow from one year to the next, who nod meekly every time they're asked for cash by the previously-mentioned,"

      Without wanting to absolve anyone in your accurately described charade, let's consider two possible outcomes where something awful happened and the junk pushed by 1 hadn't been used.

      i) 2 had turned away the junk peddlers in 1, who then drop a suitable leak/advert to the media pointing out the attacker could have been stopped had their junk been bought and used properly (whether true or not). 3 then demand heads from 2 and offer a big budget increase to buy even more junk from 1 because "someone must do something".

      ii) 3 had declined 2's request for a massive cheque to buy the junk from 1. People want to know why 3 didn't make funds available, amid a media circus calling for heads from 3. 3 demands heads from 2 as a smokescreen, and approves funding to 2 at the same time so they can buy lots of junk from 1.

      With that in mind it's no wonder 2 and 3 end up complicit with 1 - they really don't know what they're doing or how to stop the bad things (in a way that's understood by voters) and the shiny shiny junk pushed by 1 looks like a great way to be seen to be doing something, regardless of its efficacy.

  5. Anonymous Coward
    Anonymous Coward

    "Facial recog and mass surveillance fall into the same category and we shouldn't be surprised when it turns out that when you most need these magical technologies - they're pretty much useless."

    "Useless" is a matter of perception when it comes to unicorn technology: it serves to make some people very rich indeed. Remember the golf ball detector scam?

  6. John Smith 19 Gold badge
    FAIL

    ""We do not need to choose between safety and privacy. Americans deserve both,""

    Should read

    "We do not need to choose between safety and privacy. People deserve both,"

    FTFY.

    Just to be clear

    "with one in seven FBI facial recognition searches incorrectly returning a list of innocent people as matches, despite the presence of the actual matching image in the database. "

    means the software is producing both a false negative (the right person is in the DB but isn't listed as a result) and multiple false positives (people who are in the DB but are not the person being sought) in the same search - rough numbers sketched at the end of this comment.

    Which sounds like an epic fail level of result.

    On that basis a file of only known criminals should be used.
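
    A rough sketch of what that looks like for a simple threshold-based 1:N search. The one-in-seven miss rate is the figure quoted above; the per-comparison false match rate and the gallery size are made-up illustrative values, not FBI numbers:

        # Back-of-the-envelope reading of the "1 in 7" claim for a
        # threshold-based 1:N search. Only the miss rate comes from the
        # article; the other two numbers are assumptions for illustration.
        miss_rate = 1 / 7            # searches where the true mate isn't returned
        false_match_rate = 1e-6      # assumed chance an unrelated face clears the threshold
        gallery_size = 30_000_000    # assumed size of the searched database

        strangers_per_search = gallery_size * false_match_rate
        print(f"Innocent 'matches' expected per search: {strangers_per_search:.0f}")
        print(f"Searches where the right person isn't even on that list: {miss_rate:.0%}")

    In other words, under these assumptions every search drags in a few dozen lookalike strangers, and in roughly one search in seven the person actually being looked for isn't among them at all.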

  7. Sgt_Oddball

    125 million...

    Of which only about 5% are in prison or on parole. That's a hell of a lot of people who really probably shouldn't be in that database.

    It also doesn't say whether that figure includes foreign nationals or passport photos, or explicitly excludes them.

  8. Anonymous Coward
    Anonymous Coward

    Look, the machine isn't connected to the electric chair. If the machine narrows down the suspect list from 125 million to half a dozen, it's fairly easy for a humble meatbag to vet the choices and see if any look genuine.

    To illustrate this principle: say Sherlock Holmes finds a footprint outside a window at the scene of the crime; he is better off following that lead than dismissing it as "oh well, anyone can buy a Dr Marten, there's 10 million sold so far".

    1. genghis_uk

      The issue is not of following up on leads but the shift of onus.

      You should be innocent until proven guilty, but we have already seen problems with DNA where a jury is convinced by the science. If your face is matched by a computer the responsibility will be on you to prove your innocence, as the tech already says you are guilty. That is a slippery slope!

      1. John Smith 19 Gold badge
        Big Brother

        "The issue is not of following up on leads but the shift of onus."

        Correct, hence my suggestion that it only be used on a DB of known criminals.

        On the fine old Stalinist view that "We know you've done something, even if it's not what we're charging you with."

        But personally I'd say a system that 14% of the time can't match someone who's in the database, yet will throw out a bunch of total strangers, is in fact a machine for wasting police manpower or generating wrongful convictions.

        Both of which should be grounds for extreme concern in a civilized society but probably aren't in a police state.

        1. Prst. V.Jeltz Silver badge

          Re: "The issue is not of following up on leads but the shift of onus."

          Yes, but unlike DNA evidence the jury can look at the pictures and decide for themselves if it's the same person.

          1. Kiwi

            Re: "The issue is not of following up on leads but the shift of onus."

            "Yes, but unlike DNA evidence the jury can look at the pictures and decide for themselves if it's the same person."

            Are you sure about that? Think of a dozen of the worst users you've ever known... Those are "a jury of your peers", I'm afraid.

            A jury is, by definition, made up of those who couldn't figure out how to get out of jury service, or of those who actually want to be there. Not sure which case would be worse (though, that said, the one time I did get called up for jury service I decided to go through the whole experience).

    2. Cynic_999

      "

      If the machine narrows down the suspect list from 125 million to half a dozen , its fairly easy for a humble meatbag to vet the choices and see if any look genuine.

      "

      If it only produced false positives, that would be true. But what about the false negatives? That will result in the police not taking a second look at the criminal who the computer has "eliminated".

      In reality, you'd be better off selecting the suspects on the basis of height and obvious ethnicity, which are far less likely to have false negatives.

  9. Cynic_999

    Snake oil

    I did a fair bit of work on facial recognition some years ago. It is very useful in flagging a match or non-match between two sources - say a passport photo and the image of the person carrying that passport. Or as secondary confirmation after a person has identified themselves by other means. We got it down to about a 5% error rate, which is not good enough for complete automation, but makes it useful for flagging a possible mismatch to a bored official at a passport control station so he takes a more careful look. It tends to spot differences that a human doesn't, while a human sees differences that a computer algorithm is poor at differentiating.

    What it cannot do (and I doubt it ever will be able to do) is match a face to one in a large database. Far too many false positives (unless you tighten the match criteria, in which case you get so many false negatives that it's useless) - the rough numbers at the end of this post show why.

    The fact is that there are lots of people whose features are sufficiently alike that they can even fool a human, and when you factor in that people routinely change their appearance with beards, hair style, makeup, facial expression and the effects of aging, any comparison algorithm has to be pretty loose, and will then find a match between people who have similar ratios between key features (eyes, ears, nose, chin etc.).

    It's pretty easy to defeat as well if you know you are going to be scanned for a match with a set of known suspects. Just open your jaw an extra centimetre or two (part your teeth without opening your mouth), and it screws up all measurements involving your chin. Smile slightly or purse your lips a bit and that will throw the part that looks at lips. Grow your hair over your ears (or wear a suitable hat), and another datum point is lost. Flare your nostrils ...

    As a method of finding people in a crowd who match images held in a large database, it is a complete non-starter.
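
    A back-of-the-envelope illustration of why 1:1 verification and 1:N search behave so differently, treating the roughly 5% figure above as if it were the per-comparison false match rate (a deliberate simplification) and trying a few arbitrary gallery sizes:

        # How a per-comparison error rate that's workable for 1:1 verification
        # blows up for 1:N identification. The 5% rate is the figure quoted
        # above (used here as a simplification); gallery sizes are arbitrary.
        per_comparison_false_match = 0.05

        for gallery_size in (1, 100, 10_000, 1_000_000):
            p_any_false_match = 1 - (1 - per_comparison_false_match) ** gallery_size
            expected_false_matches = per_comparison_false_match * gallery_size
            print(f"N={gallery_size:>9,}: P(at least one false match)={p_any_false_match:7.2%}, "
                  f"expected false matches={expected_false_matches:,.1f}")

    Even an error rate that's tolerable for a single passport-booth comparison produces tens of thousands of spurious hits once the gallery runs into the millions, which is exactly why the large-database case is a non-starter.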

  10. John Smith 19 Gold badge
    Unhappy

    And then there are the "identical" strangers.

    Two people who are mistaken for each other but are in fact not genetically related.

    Interestingly some matched on FR, some did not.

    Obviously someone who both looked like you and matched you on FR, but who you were not related to, and who committed a crime you have no alibi for, would drop you right in it - unless they left DNA which was not yours, but then the police would say you were just careful.

    Given the plods' fondness for Occam's Razor your chances of getting out of this would be quite slim, unless they committed more crimes.

    1. Prst. V.Jeltz Silver badge

      Re: And then there are the "identical" strangers.

      "you have no alibi for"

      What's the odds? The false positive probably lives hundreds of miles away, was at work at the time, etc., ad infinitum. Most crimes don't come down to a single piece of evidence.

      It's a tool in the toolbox, ffs - to be used in combination with other things and at a suitable time.

      If the only evidence is a lead pipe in the library, the face recog may not help - that doesn't mean it's useless.
