TSA wants to expand facial recognition to hundreds of airports within next decade

America's Transportation Security Administration (TSA) intends to expand its facial-recognition program used to screen US air travel passengers to 430 domestic airports in under a decade. The TSA's program, which uses Idemia's biometric technology, has come under fire from some privacy and civil-rights organizations, which argue the …

  1. Woodnag

    TSA doesn’t retain the details of people’s faces—what’s called biometric data—after the comparison is made. “Biometric data is overwritten as soon as the next passenger steps up to the queue,” Langston says. “And then, when the technology is turned off at the end of the day, whatever storage system in there dumps completely. There is no saved image.”

    But Langston acknowledges that, until this week, some of travelers’ biometric data was collected and sent...

    So, the statement "There is no saved image" was a lie. Obviously every image is sent to NSA/FBI/etc. Why not? Nobody in the surveillance chain suffers any ramifications should the truth come out.

  2. Yet Another Anonymous coward Silver badge

    misidentify women and people with darker skin

    Are they still allowed to travel without their master?

  3. andrewj

    "Next, it verifies the person pictured on the identification card is the same person standing at the TSA podium, while also verifying the person is, in fact, traveling in the next 24 hours, and whether they have PreCheck, regular screening status, or are on a list to receive additional screening, Langston said."

    None of which require facial recognition technology.

    1. Phil O'Sophical Silver badge

      Well, it requires that a comparison be made between the ID photo and the face of the person present. The question is whether it makes any difference whether that's done by a guy in a uniform, or a machine. Personally I'd have more faith in the machine being unbiased.

      1. Wade Burchette

        The issue is not bias in the computer; the issue is what the computer does with the image. I am certain a computer will be more accurate and less biased than a person; I am uncertain what the computer will do with the data it gathers. The TSA is a power-mad US government bureaucracy. They will use this technology in objectionable ways. Pesky things like the Constitution or the law will not matter to the TSA. They will take away your rights and, when caught breaking the law, will only change for a short time before going back to doing what they want. Or, even worse, they will break the law and the power-mad people in the US Congress will pass a law making the unethical actions legal.

        You don't know the full details of the facial recognition program. A little bit here, a little bit there, and pretty soon you have no rights. A person can forget and misremember. A group of people cannot easily track your actions. A computer has none of those problems.

        1. Anonymous Coward

          the issue is what the computer does with the image.

          Of course, but for passport checking (e-gates, for example) what it does is process the image to get a couple of dozen key points and compare them to the points stored in the biometric data on the passport (the passport data does not hold an image). There's no reason for the system to store the image at all, ever, nor even to have enough storage to be able to do so. The article even says "live photos and ID photos are overwritten by the next passenger's scan, we're told. They only remain in RAM and are purged when the officer logs off or turns off the machine, which happens automatically after 30 minutes of non-use." If the machines had the GBs of storage to hold the data, or the networking to send it somewhere, it's likely that someone would have noticed & said so. (There's a rough sketch of that kind of 1:1 comparison at the end of this comment.)

          Now they could be lying, of course, but if you're that paranoid you're better off not travelling at all. All sorts of places have info about you: CCTV (as the article says), airlines, hotels and parking operators all keep names, credit card details, etc.

          A group of people cannot easily track your actions. A computer has none of those problems.

          Groups of people were tracking others for centuries before they had computers. If they want to track you, they will; paper, pen & Kodak film work just fine.
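
          For the curious, here's a rough sketch of what that kind of 1:1 comparison boils down to. It is purely illustrative: the 128-dimension feature vectors, the 0.6 threshold and the function names are my own assumptions, not anything Idemia or the TSA have published, and a real system would derive the features from a face-recognition model rather than random numbers. The point is simply that verification is a comparison of two feature vectors held in memory; nothing needs to be written to disk.

          ```python
          # Illustrative 1:1 ("verification") face match: compare a live capture's
          # feature vector against the template from the ID document. Everything
          # stays in RAM and is discarded when the variables go out of scope.
          import numpy as np

          MATCH_THRESHOLD = 0.6  # illustrative value, not a published figure

          def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
              """Similarity of two feature vectors; 1.0 means identical direction."""
              return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

          def is_same_person(live_features: np.ndarray, template_features: np.ndarray) -> bool:
              """1:1 comparison: does the live capture match the stored template?"""
              return cosine_similarity(live_features, template_features) >= MATCH_THRESHOLD

          # Stand-in feature vectors; in reality these would come from the camera
          # frame and from the template read off the document's chip.
          rng = np.random.default_rng(42)
          template = rng.normal(size=128)
          live = template + rng.normal(scale=0.1, size=128)  # same person, new capture

          print(is_same_person(live, template))                   # True
          print(is_same_person(rng.normal(size=128), template))   # almost certainly False
          ```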

        2. Swiss Anton

          There is no privacy to lose.

          To be able to compare the facial image with the ID, the system needs to already have a copy of the ID image. It doesn't matter whether the airport scanner saves its recorded image or not; the authorities already have a copy of your face. They will also have all the info they require to track your journey, via the airline ticketing system and the banks that processed the transaction when the ticket was purchased. All the tech does is reduce staffing costs and, arguably, improve accuracy, as the robots won't have hangovers at the start of the day.

    2. jmch Silver badge

      Depends what you mean by 'facial recognition'.

      Verifying that a document is genuine has nothing to do with facial recognition, and can easily be automated without any problems. Verifying that the person for whom the document is valid is actually flying* is just a database lookup. Verifying that the person standing there is the same as the person in the photo can use 'facial recognition', but it is a simple 1-to-1 comparison: is the photo I just took of this person the same as the photo on the ID? It should not require the photo taken to be stored. Nor does any of the above require any data at all to be collected/stored at the border check, or forwarded anywhere. And that is what they *claim*, except that they also say that if anything is flagged, the data is forwarded to 3-letter agencies, so in theory they can forward any and all the data they want. Oversight?? Accountability?? Based on track records, I would say 'almost none that matters' and 'absolutely zero'. (There's a rough sketch of that sequence of checks below the footnote.)

      *I would add the important qualifier 'flying from the same airport as the security scan'
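
      To make the point concrete, here is a rough sketch of that sequence of checks: document check, booking lookup, 1:1 face comparison, then a lane decision. Every name, field and rule in it is invented for illustration and is not the TSA's actual logic; note that the face comparison feeds in only as a yes/no result, so nothing in this flow needs to store or forward an image.

      ```python
      # Hypothetical screening flow: document check, booking lookup, 1:1 face
      # comparison result, screening-lane decision. Names are illustrative only.
      from dataclasses import dataclass
      from datetime import datetime, timedelta
      from enum import Enum
      from typing import Optional

      class Lane(Enum):
          PRECHECK = "PreCheck"
          STANDARD = "standard screening"
          ADDITIONAL = "additional screening"

      @dataclass
      class Booking:
          departure_airport: str
          departure_time: datetime
          precheck: bool
          flagged: bool  # e.g. on a list to receive extra screening

      def screening_decision(document_genuine: bool,
                             face_matches_id: bool,
                             booking: Optional[Booking],
                             airport: str,
                             now: datetime) -> Lane:
          # Document check and 1:1 face comparison; neither needs the live image kept.
          if not document_genuine or not face_matches_id:
              return Lane.ADDITIONAL
          # Database lookup: actually flying from this airport within the next 24 hours?
          if booking is None or booking.departure_airport != airport:
              return Lane.ADDITIONAL
          if not (now <= booking.departure_time <= now + timedelta(hours=24)):
              return Lane.ADDITIONAL
          if booking.flagged:
              return Lane.ADDITIONAL
          return Lane.PRECHECK if booking.precheck else Lane.STANDARD

      now = datetime(2024, 6, 1, 9, 0)
      booking = Booking("BOS", now + timedelta(hours=3), precheck=True, flagged=False)
      print(screening_decision(True, True, booking, "BOS", now))  # Lane.PRECHECK
      ```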

      1. Version 1.0 Silver badge
        Joke

        Facial recognition can have an issue - illustrated ...

  4. ChoHag Silver badge
    Facepalm

    > "All of those privacy concerns have been addressed in assessments, in working with privacy advocacy groups," Langston said. We should note: the assessments are not public, so The Register can't verify the findings.

    They're *privacy* groups, duh!

    They're the ones that get to have privacy.

  5. eldakka

    "It identifies those four very key and critical elements in identity verification, which are the lynch pin for transportation security," Langston said.
    How many security incidents have occurred in the last two decades that such a system would have prevented?

    1. Graham Cobb

      I suspect the answer is very close to zero, but that the motivation is less about the number of security incidents and more about saving money by not training people to recognize whether an ID matches.

      I am sure there are hundreds (across the TSA) of cases where the agent is not really sure whether the (10 year old) passport photo actually matches the person in front of them. In those cases I am sure they fall back on inherent (and probably unconscious) biases - "he looks a bit like those old photos of Bin Laden", "what would a 17yo black woman be doing with a First Class ticket", "the guy's wearing a smart business suit and carrying a laptop, no point wasting time with him".

      The real problem is that with the machines, the same people will always be having trouble: if the ID doesn't match properly according to the machine's algorithm then you are doomed to having to arrive an hour earlier for every flight to deal with the back room process for people who fail the match.

      Personally I believe the real issue is privacy: we should be able to travel completely anonymously. The reduction in risk by having to have "valid" ID is negligible: real terrorists can get any ID they need through criminal or diplomatic routes.

      1. Yet Another Anonymous coward Silver badge

        That's the idea: if Mr Winston Cudoogo, of 55 Mercer Road, is pounced on by Officer Savage of the TSA every time he flies, he will eventually make an unfortunate remark, at which point he becomes a genuine criminal/terrorist.

  6. IGotOut Silver badge

    Long overdue a name change.

    I suggest the

    People's Democratic Republic of America

    1. LogicGate Silver badge

      Re: Long overdue a name change.

      Do not underestimate the power of the Theater of Security Agency

      https://youtu.be/IHfiMoJUDVQ

  7. Tubz Silver badge

    Hmm, opt out of TSA facial ID, which in reality means you flag yourself as a subject of interest straight away and now get a thorough strip search and the kind of finger probing real aliens can only dream of. Why else would you opt out?

    1. Yet Another Anonymous coward Silver badge

      You can already opt out of most TSA security theater by paying $100, or for free with the right credit card. Because the Saudis would never think of signing their terrorists up for Amex World Elite.

  8. Combat Epistomologist

    What could POSSIBLY go wrong?

    Has anyone, ANYWHERE, EVER heard of a facial-recognition-for-enforcement deployment that DIDN'T go terribly wrong?

    1. ecofeco Silver badge

      I have yet to hear of facial recognition being accurate, let alone being anything more than just another backhander to someone's mates.

  9. Kevin McMurtrie Silver badge
    Big Brother

    Step aside for a moment...

    Extra screening if you're flying to LAX and crazy Uncle Sam finds calls to cosmetic surgeons in your phone records. You wouldn't be trying to evade TSA security, would you?

  10. MC

    The USA crossed the biometric Rubicon years ago. I mean, nothing says welcome like being ten-printed and mugshotted on arrival. That's why I don't go there.

    1. Sub 20 Pilot

      Same here; I last went there in 1980 or so and will never go back. Plenty of other countries will take my tourist/travel money without treating me like a criminal.

      It is their country so they can do what the hell they like, I suppose, but it's not for me.

  11. cb12345

    Boiling the frog

    In my opinion, the concerns described in this article (e.g., bias) are missing the greater threat. For me, the bigger problem is the gradual introduction of government facial recognition into public spaces without meaningful democratic involvement or oversight. My concern is a steady creep. People will get used to this, so it'll be expanded. Then facial recognition tech will be used in some other type of space/situation, and people will in turn get used to it. And so on until we're Hong Kong (if you feel that's a bit of an exaggerated conclusion, sure, conceded). We've seen similar patterns with other digital technologies, like online tracking.

    The use of facial recognition by government can be appropriate and useful, but it can also be extremely dangerous and unjust. The public getting used to it and letting it expand allows those dangerous uses to creep in. Every use of facial recognition technology should be subject to approval through democratic processes and with a focus on civil liberties to prevent 'boiling the frog.'

    (This is why I opt out when I go through airports using this - to signal (in my small way) my concern about the undiscussed undemocratic encroachment of potentially authoritarian technologies.)
