So woke: Microsoft's face-recog can now ID more people who aren't pasty white blokes

Microsoft has improved its facial recognition technology so that it is better at identifying humans who aren't white men. Today's announcement of the breakthrough, which promised "significant improvements in the system's ability to recognize gender across skin tones," comes a week after CEO Satya Nadella sent a missive to …

  1. Anonymous Coward

    Can't wait for this to become bundled...

    As a critical Windows-10 security update etc. STOP!

    ICE Agents have been dispatched! Wait by your PC!

  2. Mark 85

    So if you're a white male intent on doing some crime, then a bit of makeup is in order? Something to darken the skin, maybe the eyebrows? IOW, pretty much what many crims have done in the past, then, without going full hoodie and ski mask.

    1. phuzz Silver badge

      You don't need to cover your whole face, just a few strategically placed sections to throw the computer off.

      For example.

      Makeup that only showed up under IR might mean you could look normal to humans whilst confusing cameras.

    2. Teiwaz

      pretty much what many crims have done in the past, then, without going full hoodie and ski mask.

      So you're saying go out and buy a pair of tights instead?

      That's been done before too.

      And you'll look like a tit if you can't tell the difference between tights and stockings...

  3. NanoMeter


    So finally, the software has now been unracisted. That's good.

  4. veti Silver badge

    Any guesses what "reducing an error rate by up to 20 times" might mean in English?

    1. Anonymous Coward

      Not as much as they'd like you to think it means.

    2. Richard 12 Silver badge

      I guess that means it only ever worked once before, and this time it succeeded twenty times?

      Only tried a billion photos, but hey, twenty times!

      With such vague results, one can be certain that it's still utterly useless. Probably worse than useless.

      One wonders if it's even better than pure chance.

    3. Anonymous Coward

      It's another example of poor expression which renders a statistical statement meaningless (or deniable).

      It could mean error rates are down to 5% (assuming previous error rate was 100%) or lower.

      But 'up to' just confuses it.

      I started typing this thinking I knew how to express what they meant but I give up. Too many semantic errors in the sentence.

      Just assume the facial recognition tech is better. But don't assume how much by, or how much in relation to particular genders/skin tones.

    4. Spanners Silver badge

      Reducing the rate up to 20 times

      This type of phraseology is often used either by the terminally artful trying to mislead the audience, or by clever people trying to sound less so.

      If it made sense, the phrase might mean errors dropping to 0.05 of what they were. Until it is explained, we could assume errors have fallen by 0.05%
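If the phrase really does mean "the new error rate is one twentieth of the old one", the arithmetic is straightforward. A minimal sketch, with the starting error rate purely assumed for illustration:

```python
# Assumed former error rate (20%) -- purely illustrative, not from the article.
old_error_rate = 0.20

# "Reduced by 20 times" read literally as: new rate = old rate / 20.
new_error_rate = old_error_rate / 20

# Fractional reduction relative to the old rate.
reduction = 1 - new_error_rate / old_error_rate

print(new_error_rate)  # roughly 0.01 -- errors at 0.05x of what they were
print(reduction)       # roughly 0.95 -- i.e. a 95% reduction
```

So "20 times" and "95% fewer errors" describe the same change; the "up to" is what leaves the claim unverifiable.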

    5. katrinab Silver badge

      Google's version couldn't tell the difference between a human with dark skin and a gorilla most of the time, never mind tell the difference between different humans with dark skin.

      Whereas, it was able to tell the difference between a gammon (racist, foaming-at-the-mouth, sunburnt white man) and a pig

    6. Anonymous Coward

      Bayes' Theorem

      It is quite amazing how many medical researchers and the like do not want to understand it. (An observation made to me years ago by my statistics supervisor, one of whose side jobs was to report on dodgy statistics in medical trials.)

      For everything like this there are two error rates: false positives and false negatives. Often you could reduce the false positives by 95% at the expense of greatly increasing the false negatives*. So which is it?

      *Suppose the population has 2% positives and the test gives 40% positives, but only detects half the real positives due to the uncertainties of the test. Changing the threshold to give 2% positives will now almost certainly give only false positives.
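Plugging the footnote's numbers into the bookkeeping makes the trade-off concrete. A sketch using the comment's figures (2% truly positive, test flags 40%, only half the real positives detected); the population size is an assumption for convenience:

```python
# Numbers from the comment above; population size assumed for illustration.
population = 10_000
true_positives_in_pop = int(population * 0.02)      # 2% really positive -> 200

flagged = int(population * 0.40)                    # test flags 40% -> 4000
detected = true_positives_in_pop // 2               # only half of real positives caught -> 100

false_positives = flagged - detected                # flagged but not really positive
false_negatives = true_positives_in_pop - detected  # really positive but missed

# Bayes-style precision: P(really positive | test flagged you)
precision = detected / flagged

print(false_positives)  # 3900
print(false_negatives)  # 100
print(precision)        # 0.025 -- only 2.5% of flags are genuine
```

Tightening the threshold shrinks `flagged` and with it the false positives, but every real positive that falls below the new threshold becomes a false negative instead. Quoting a single "error rate" hides which side of that trade was made.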

    7. Greencat

      No improvement at all.

    8. Korev Silver badge

      >Any guesses, what "reducing an error rate by up to 20 times" might mean in English?

      They mean a 95% reduction in errors, but trying to make it sound good...

  5. Anonymous Coward

    Face recognition...and then you add in....

    .....voiceprints......and GCHQ, the NSA, ICE and every other bad actor on the planet have a handle on your privacy. No....not this:


    Welcome to the future!

  6. Locky

    Clippy ICE

    I see you're having a heart attack. Would you like some help with that?

  7. poohbear

    Short list

    "technology would not be used for anything Redmond deems unethical."

    Is there actually anything on that list?

  8. Chris G

    "If we are training machine learning systems to mimic decisions made in a biased society, using data generated by that society, then those systems will necessarily reproduce its biases."

    That's good then, only the poor and discriminated will be eliminated by the machines.

    'Goodlife' will be okay.

    'Goodlife': see Fred Saberhagen's Berserker books.

  9. wolfetone Silver badge

    "...CEO Satya Nadella sent a missive to employees assuring them that the technology would not be used for anything Redmond deems unethical."

    That doesn't fill me with confidence, especially when Microsoft thought Microsoft Bob was a great idea.

  10. Anonymous Coward

    Let's just everyone

    wear a Burka and see what happens then...

    Huge sombreros will also make the job of this evil system a whole lot harder.

    MS is aiding Big Brother. As if this comes as a surprise to anyone.
