Don't believe the hype that AI-generated 'master faces' can break into face recognition systems any time soon

The idea of so-called “master faces,” a set of fake images generated by machine learning algorithms to crack into facial biometric systems by impersonating people, made splashy headlines last week. But a closer look at the research reveals clear weaknesses that make it unlikely to work in the real world. “A master face is a …

  1. Mike 137 Silver badge

    A poor justification

    "... in spite of these limitations, LFW is a widely used dataset in the academic literature ..."

    I see this increasingly often in non-outstanding scientific papers - reliance on "previous work" rather than on methods and data sets specifically selected to further the ostensible objective. It's particularly common in statistical analyses, where quite inappropriate methods are often employed.

    Unless this work was specifically intended to compare the method under test with other methods applied to the same data set, the only obvious justifications for using a data set with a significant recognised deficiency are "it's available" or "it improves our chances of passing peer review". Neither is particularly conducive to producing convincing results.

  2. Tony W

    Be careful what you look like

    OK the research is flawed and alarmist - but the basic idea looks as if it could be feasible so long as you are content to target a limited group of the population.

    This highlights the fact that biometrics must be really good if it's to be used for anything important.

    1. Pascal Monett Silver badge
      Trollface

      Re: Be careful what you look like

      What the research really demonstrates is that, in our AI future, if you're white, you're screwed because everything will recognize you.

      You want privacy and security ?

      Better off coloured.

    2. Cuddles

      Re: Be careful what you look like

      "This highlights the fact that biometrics must be really good if it's to be used for anything important."

      Indeed. The problem with biometrics that always seems to be missed is that it's still essentially just a password. How accurately it can identify a specific individual depends very strongly on how accurately you sample the biowhatsit in question. Use a small number of fit points, a low-resolution image, or whatever, and you're doing the equivalent of using a 4-digit PIN and then acting surprised when it's not as secure as a 20-character randomly generated string.

      The basic idea really isn't feasible, or at least it shouldn't be. If your facial recognition system matches a generic computer-generated face to 40% of old white men, that doesn't mean that there's something inherently flawed about facial recognition, it just means your system is shit. All this research really highlights is that many facial recognition systems in use today are, in fact, shit. But that's not news to anyone who has actually been paying attention to all the "AI" nonsense doing the rounds these days.
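The PIN comparison above can be made concrete with a quick entropy calculation. A minimal sketch - all the numbers here are illustrative, not taken from any real biometric system:

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy in a uniformly random string of symbols."""
    return length * math.log2(alphabet_size)

# A 4-digit PIN: 10^4 possibilities
pin_bits = entropy_bits(10, 4)          # ~13.3 bits

# A 20-character string drawn from ~94 printable ASCII characters
password_bits = entropy_bits(94, 20)    # ~131 bits

# A face template reduced to a hypothetical 30 coarse "fit points",
# each quantised to one of 8 levels. This is an upper bound: real
# faces are highly correlated, so the effective entropy is far lower.
template_bits = entropy_bits(8, 30)

print(f"4-digit PIN:      {pin_bits:.1f} bits")
print(f"20-char password: {password_bits:.1f} bits")
print(f"coarse template:  {template_bits:.1f} bits (upper bound)")
```

The point being that a sloppily sampled biometric can sit much closer to the PIN end of that scale than vendors like to admit.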

      1. druck Silver badge

        Re: Be careful what you look like

        Biometrics aren't passwords - you can't choose another one.

        1. imanidiot Silver badge

          Re: Be careful what you look like

          Exactly this. Biometrics isn't a password, if anything it's a username.

  3. Pascal Monett Silver badge
    Mushroom

    “three leading deep face recognition systems”

    Deep face ? What the hell is that expression for ?

    I get deep fake. Statistical analysis applied to creating an image (or video) and inserting another one. Okay, that's fine.

    But there is no deep face. There is facial recognition, period.

    Stop gargling meaningless verbiage just to make yourselves seem capable.

  4. TheInquisitor

    Biased Sample Group

    TBF the sample group is biased - but it is biased in exactly the right direction for impersonating the highest-value targets.

    Sad but true: old white dudes have the most access.

    The entire dissent here seems to hinge on the attack only being proven effective against a biased sample group. That biased group, however, happens to be the group most likely to have the highest levels of access.

    The point of the study was that it's a clearly proven attack vector. You challenge other authors for sensationalising headlines that contradict yours, when you've clearly done exactly the same - and then you picked a very narrow, irrelevant lane on which to challenge the research.

    The worst you can really say is that more work would need to be done, in order to impersonate non old white dudes... But if old white dudes hold most of the power... Why would attackers need to even bother?

    I think you missed the point entirely, which is why your headline and content look nothing like the other articles.

  5. elsergiovolador Silver badge

    Prime Time

    So these technologies are not yet ready for prime time, and yet somehow the sales people managed to convince barely literate people in power that they can fulfil their authoritarian fantasies. How is that not considered a scam?

  6. Filippo Silver badge

    Biometrics for authentication

    Why are we using biometrics for authentication? Honest question. It's like a password that you cannot change, and that you leave everywhere you go. How is that a good idea?

    1. A. Coatsworth Silver badge
      Trollface

      Re: Biometrics for authentication

      Because it is THE FUUUUTUUUUREEEE. They used it in Minority Report!

      Don't you wanna be as cool as Tom Cruise? Don't you?

    2. jtaylor

      Re: Biometrics for authentication

      "Why are we using biometrics for authentication? "

      Indeed. I used to think of passwords as confirming identity. I now think of them as confirming consent.

      Here is my credit card. Check it against my photo ID or other biometric. Show me the charge, and I'll approve it with a PIN or signature. Request access to my medical records? Establish my identity to start the conversation, but don't do anything until I authorize it.

      Humans have used facial recognition for centuries, to identify other people and animals. This is not a novel tool. It doesn't require novel uses.

    3. DS999 Silver badge

      Re: Biometrics for authentication

      Why is that a problem? A lot of people (including me) use it for a smartphone. With a 1-in-50,000 chance of a false match for Face ID - and, I'd assume, similar odds for the Android equivalents - it is good enough for what smartphone owners need it for: keeping out some random person who picks up their phone, without having to enter a PIN/password every time.

      If you have important secrets on your phone that would warrant someone mounting an attack to specifically steal your phone and try to exploit weaknesses in the matching (i.e. using pictures of me from a few angles to 3D print a mask of my face) then using Face ID or similar would be stupid. Heck, keeping secrets that people would go to that length to access on your phone at all is stupid, given how often new 0 days are popping up.

      In other words, the bad guys probably aren't going to steal your phone and fake your face. They are going to use a 0 day they bought or developed to exploit you remotely and steal those secrets.

      This is more of a problem if you work in a "secure" environment that uses facial recognition as the ONLY access control, but I don't know of anyone doing that. I know of places that use it as part of the access - it scans your face and then you enter a PIN. The problem with using facial recognition alone in such scenarios is that while it does great at one-to-one matching (i.e. is the person holding my phone me?), it sucks at one-to-many matching (i.e. is the person walking through this door one of 200 people we've banned for shoplifting?).
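That 1-in-50,000 figure is worth a quick sanity check: the chance of at least one false match grows with the number of attempts allowed. A toy calculation, assuming independent attempts (real attempts by the same impostor are correlated, so this is optimistic):

```python
def p_any_false_match(fmr: float, attempts: int) -> float:
    """Probability that at least one of `attempts` independent
    impostor attempts produces a false match."""
    return 1.0 - (1.0 - fmr) ** attempts

fmr = 1 / 50_000   # Apple's published random-person figure for Face ID

# A single random person picking up the phone:
print(p_any_false_match(fmr, 1))   # exactly the FMR, 1 in 50,000

# A device allowing, say, 5 biometric attempts before demanding a PIN:
print(p_any_false_match(fmr, 5))   # roughly 1 in 10,000
```

Still plenty for keeping a random phone-snatcher out, which is the commenter's point.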

      1. VeryLucky2BHere

        Re: Biometrics for authentication

        You need to think more broadly. The SolarWinds breach started with weak user authentication. What you're talking about represents a small fraction of how this stuff plays out. Face ID and a bunch of other convenience methods are just fine for opening a phone or even buying coffee. But they do NOT determine, with the kind of certainty needed, whether a) what the camera sees is alive and not a non-human artifact, and b) the person is an extremely high match to what the system acquired when the account was created. If only your live face were required to gain access to your bank account, social network, medical records, etc., then it would be impossible to use the kinds of methods attackers use daily to grab a user's credentials and start burrowing their way in (see liveness.com). This level of security is already in place in hundreds of implementations - banking, border control, dating sites, e-voting, national ID programs, digital driver licenses, etc. - and has proven to work exceptionally well.

  7. mark l 2 Silver badge

    Facial recognition might be OK for unlocking your phone or computer, if the most you have to worry about are a few embarrassing photos and your internet history on the device. But I wouldn't trust it to protect something like my bank accounts unless it was part of 2FA along with something like a password.

    And certainly, if people working at high levels of government or in other highly sensitive organisations are relying on it as the only authentication method, it's asking for trouble.

    1. Giles C Silver badge

      My bank (NatWest) has started putting notices in the app saying that soon you will be able to use your face to log in.

      Not a fan

      1. DS999 Silver badge

        "Be able to" is very different from "requiring you to".

        While I agree that having your bank app accessible with only your face is not the best security, I'm curious in exactly what circumstance you think this would be an issue.

        If a random criminal steals your phone or finds it after it is lost, they aren't going to be able to fake your face. They'd need a few pictures of you from a few angles, and a 3D printer to make a mask. Not a high bar, but one that requires advance preparation that targeted you specifically - someone snatching your phone at random on the street won't have done that and won't be able to after the fact. Nor would they know whether you use a bank that supports this technology, or whether you have enabled it.

        Now sure, someone could snatch your phone AFTER you've accessed your bank app with your face if they were e.g. watching you from behind on the subway, but they could also do that after you've used a 2FA key and entered a long and complex password. Heck, having the app recheck your face before completing any transaction might end up MORE secure than any scheme where, once you have authenticated, you have free rein to perform as many transactions as you like so long as you don't stay idle for too long.

        In other words, I think the real weakness is using a phone in public to access your bank. Not worrying about what level of authentication is required to access the app.

  8. Anonymous Coward
    Facepalm

    Security hype

    The popular media seem to constantly bombard us with possible hacks that might steal your life. Many, like this one, are purely theoretical. Others, like decrypting iPhones, require unfettered access to a device and sophisticated hardware and software. Often it is not a hack at all but poor configuration that exposed data to the internet. All of them are scaremongering clickbait.

    The average user has much more to fear from FecesBook, but just try to convince them of that.

  9. Snowy Silver badge
    Facepalm

    A face...

    is at best a user name, with the passcode being needed in addition to it.

    I smell AI-generated bullshit.

    1. VeryLucky2BHere

      Re: A face...

      Step up, then: spoofbounty.com

  10. mevets

    Didn't this come up before....

    Where car visual sensors were confused if you put words in front of them? I ordered a custom license plate "STOPSIGN" just to see if Teslas would crunch to a halt.

    What if I hung a sign around my neck that said "login: admin\npassword: password\n"? Wouldn't most of these random matching machines just sign me in as admin?

  11. VeryLucky2BHere

    Recognition vs. Authentication

    This Reg response to this "research" is correct. 2D facial recognition - which is the basis on which this was done - is not very sophisticated, and should NOT be used for ANY high-risk security scenario. Not only was this data set woefully inadequate; there is no possible way to gather enough 2D signal (face data) to consistently verify who an individual is in large datasets. Generally, with today's level of facial recognition performance, any gallery over 100,000 faces will generate mistakes. Face recognition should ONLY be used for convenience purposes, like opening a phone or buying coffee. Period.

    What is required for user AUTHENTICATION is 3D data acquisition and robust liveness detection (check out liveness.com). 3D signal can generate up to 100 times more data (depending on the method), allowing, clearly, for much more certainty. But the first hurdle a user MUST pass is the liveness test. The system needs to determine whether what the sensor sees is actually alive, and not some non-human artifact like a mask, 3D head, photo, or video (including deepfakes). Once given a thumbs-up, matching what that sensor (camera) sees against what was acquired during onboarding needs to be very accurate. 3D-to-3D matching should have a false-match rate better than 1-in-10M to be feasible across large DB populations.

    This research attempt was amateurish at best, and socially regressive at worst, especially at a time when we're all spending much more time accessing valuable digital assets daily. There are systems in place TODAY that are far beyond the capabilities of what and how this group tested, and are consistently securing hundreds of millions of digital accounts already.
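The gallery-size argument above is just arithmetic: at a fixed false-match rate, the expected number of false hits per search scales with the size of the database. A sketch using the figures quoted in the comment (the FMR values are illustrative, not measurements of any real system):

```python
def expected_false_matches(fmr: float, gallery_size: int) -> float:
    """Expected false matches when one probe is compared against
    every template in a gallery (independence assumed)."""
    return fmr * gallery_size

# 2D face recognition at an illustrative FMR of 1 in 100,000,
# searched against a 100,000-person gallery:
print(expected_false_matches(1e-5, 100_000))   # about 1 false hit per search

# 3D matching at the 1-in-10M figure quoted above, same gallery:
print(expected_false_matches(1e-7, 100_000))   # about 0.01 per search
```

Which is why a matcher that is fine for one-to-one phone unlocking falls over when pointed at a large watchlist.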

  12. Dr Paul Taylor

    reverse engineering

    All these comments are based on recognising faces as humans do. But we are talking about an algorithm, which inputs some bits and outputs some bits. If the input bits really do encode a face, the algorithm might recognise it. But it's still just looking at patterns of bits, in a fundamentally undocumented way. Maybe it could be triggered by a "master" pattern that would not be recognised by a human as a face.
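The "master pattern" idea in the last comment is essentially black-box input optimisation: treat the matcher as an opaque scoring function and search for an input that maximises its score, with no requirement that the input look like a face to a human. A toy sketch - the matcher here is a made-up linear stand-in, not any real face recognition model:

```python
import random

# Toy stand-in for a black-box matcher: scores how strongly an input
# "matches" an enrolled template. Here it's just a dot product with a
# hidden weight vector; real systems are vastly more complex, but the
# attack logic (optimise the input to maximise the score) is the same.
random.seed(0)
HIDDEN = [random.uniform(-1, 1) for _ in range(16)]

def match_score(x):
    return sum(w * v for w, v in zip(HIDDEN, x))

def hill_climb(steps=2000, dim=16, eps=0.05):
    """Random-search an input that maximises the matcher's score,
    using only score queries -- no knowledge of its internals."""
    best = [0.0] * dim
    best_score = match_score(best)
    for _ in range(steps):
        # Perturb every coordinate slightly, clamped to [-1, 1]
        cand = [max(-1.0, min(1.0, v + random.uniform(-eps, eps))) for v in best]
        s = match_score(cand)
        if s > best_score:
            best, best_score = cand, s
    return best, best_score

pattern, score = hill_climb()
print(score)  # climbs toward the maximum without ever "seeing" a face
```

The resulting `pattern` is just whatever vector of numbers the matcher happens to like, which is the commenter's point: nothing in the loop cares whether a human would call it a face.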
