NEC insists its face-recog training dataset isn't biased, but refuses to share details of Neoface system with UK court

Facial-recognition technology used by British police forces does not rely on trawling the internet for random face photos to use as training data, an NEC manager told the courts. The statement was referred to in the Court of Appeal last week by South Wales Police's barrister Jason Beer QC and obtained by The Register yesterday …

  1. Gordon 10 Silver badge

    Nuke it from Orbit

An ML aglo with a non-pubic training dataset is several orders of magnitude worse than a piece of closed-source code, as with code you (mostly) have to explicitly include bias (if ethnicity <> 'white' goto stopandsearch). With ML the bias is implicitly generated by problems with the training dataset **as well as** any explicit bias in the algo spec.

If there's no legal mandate to expose the training dataset, there should at least be a standardised test dataset, with performance against expected norms documented and signed off prior to production use. This is what happens when tech outruns the legislation.

    1. Shadow Systems Silver badge

      Freudian slip?

      "An ML aglo with a non-pubic training dataset..."

      Cheers! =-D
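    Pubic jokes aside, the sign-off idea in the parent comment can be sketched in a few lines. Everything here is illustrative: the threshold, the group names and the function are hypothetical, not any real standard or NEC API.

```python
# Hypothetical pre-production sign-off gate, sketching the idea above:
# a model must meet an expected accuracy norm on EVERY group in a
# standardised test dataset before it is approved for production use.
# REQUIRED_ACCURACY and the group names are illustrative only.
REQUIRED_ACCURACY = 0.98

def sign_off(per_group_accuracy):
    """Return True only if every demographic group meets the expected norm."""
    failures = {group: acc for group, acc in per_group_accuracy.items()
                if acc < REQUIRED_ACCURACY}
    for group, acc in failures.items():
        print(f"FAIL: {group} accuracy {acc:.1%} is below {REQUIRED_ACCURACY:.0%}")
    return not failures

# A model can look fine on aggregate numbers yet fail the per-group check:
print(sign_off({"group_a": 0.99, "group_b": 0.93}))  # False: group_b fails
```

    The point of checking per group rather than in aggregate is that a dataset skewed towards one ethnicity can produce a model with a flattering overall accuracy while performing badly on everyone else.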

  2. Gordon 10 Silver badge

    Annual update of Algo??

You can tell it's public sector - that's a very poor cadence.

I wonder if they have included Algo/Model updates in their patching process? Surely if it's known to generate biased results it needs to be fixed ASAP; some biases could be the equivalent of a zero-day for the poor schmuck on the receiving end.

Also, what's the penalty going to be for the Plod not swiftly applying the Algo "patches"? (Rhetorical question - I already know the answer is "nothing".)

  3. Yet Another Anonymous coward Silver badge

    Not just racist

    But apparently targets black people with a grid of dots on their face

    1. Anonymous Coward
      Anonymous Coward

      Re: Not just racist

Not as bad as a 100% false-positive rate targeting Brazilian plumbers with rings of concentric circles on their heads.

  4. Doctor Syntax Silver badge

    "the Court of Appeal judges hearing last week's case seemed pointedly uninterested in wider legal and societal issues raised by the Cardiff AFR deployment."

    Could that be because they expect it to go to the Supreme Court on those issues?

  5. This post has been deleted by its author

  6. katrinab Silver badge

    What these accuracy numbers really mean

    Worth repeating as some people forget:

Let's say you scan a population of 100,000, and you are looking for 10 people.

With 98% accuracy you will find 9.8 of the people you are looking for. That bit is easy to understand.

    You will also get 19,998 false positives. That means that only 0.049% of the people it flags up are people you are looking for.

    With 70% accuracy, you will find 7 of the people you are looking for. Again, no surprises there.

You will get 29,997 false positives, meaning that 0.023% of the people it flags up are people you are looking for.

    If you toss a coin, 0.01% of the people it flags up will be people you are looking for.

    1. Anonymous Cowerd

      Re: What these accuracy numbers really mean

      98% accuracy will give 19998 false positives out of 100000?

      Please check your maths.
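      The base-rate arithmetic in the thread above can be sketched as follows, assuming "accuracy" means the hit rate and that the false-positive rate is one minus the accuracy (vendors rarely say which they mean; the 70% figures in the original comment are consistent with this reading, the 98% ones are not):

```python
# Base-rate sketch: what fraction of people flagged by the system are
# actually on the watch list? Assumes false-positive rate = 1 - accuracy.
def flagged_precision(accuracy, population=100_000, targets=10):
    hits = accuracy * targets                            # wanted people correctly flagged
    false_pos = (1 - accuracy) * (population - targets)  # innocents wrongly flagged
    return hits, false_pos, hits / (hits + false_pos)

hits, fp, precision = flagged_precision(0.70)
print(f"{hits:.0f} hits, {fp:.0f} false positives, {precision:.3%} precision")
# 7 hits, 29997 false positives, 0.023% precision
```

      Run the same function with 0.98 and you get roughly 2,000 false positives, not 19,998 - which is the point of the correction above, and still means the overwhelming majority of flagged people are innocent.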

  7. A random security guy Bronze badge

    NEC will never let go of its data sets

Just the nature of NEC; it will never let go of its IP, however biased. It just doesn't make sense for ANY country's legal system to outsource the training dataset to a foreign agency. Essentially, the UK government has abdicated its responsibility to protect its citizens' privacy and legal rights to a foreign entity.

  8. HildyJ Silver badge

    Error

Coincidentally, on this side of the pond, "Detroit's police chief admitted on Monday that facial recognition technology used by the department misidentifies suspects about 96 percent of the time." (Ars Technica)

    Facial Recognition is, always has been, and always will be fraught with error.

    Besides which, Facial Recognition is, always has been, and always will be an invasion of privacy.

    1. TDog

      Re: Error

But the good news is that with a 96% failure rate there is only a 4% chance of an invasion of YOUR privacy <g>.

      1. Jonathon Green

        Re: Error

        They’ll probably shoot you anyway just to be on the safe side...
