Google learns to smile, because AI's bad at it

MonkeyCee

Bad data makes a bad model

Argh! If your training data is biased, then your model will be biased.

If you have a heavily biased dataset and a less biased one, you don't build your model from the biased one and then use the less biased one to teach it to correct itself. You just use the less biased one....
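
To illustrate the point, here's a toy sketch (nothing to do with the actual product, all data and numbers made up): train one model on a heavily skewed sample and one on a more representative sample, then compare per-group accuracy. Python with scikit-learn:

    # Toy sketch: two groups share the same underlying rule, but one
    # training set badly under-represents group 1. Synthetic data only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    def make_data(n, minority_frac):
        group = (rng.random(n) < minority_frac).astype(int)  # 1 = minority
        x = rng.normal(loc=group * 0.5, scale=1.0)           # one feature
        y = (x + rng.normal(scale=0.5, size=n) > group * 0.5).astype(int)
        return np.column_stack([x, group]), y

    # Heavily biased sample vs a less biased one.
    X_biased, y_biased = make_data(5000, minority_frac=0.02)
    X_fair, y_fair = make_data(5000, minority_frac=0.30)

    model_biased = LogisticRegression().fit(X_biased, y_biased)
    model_fair = LogisticRegression().fit(X_fair, y_fair)

    # Score each model per group on a balanced test set.
    X_test, y_test = make_data(4000, minority_frac=0.50)
    for name, model in (("biased data", model_biased),
                        ("less biased data", model_fair)):
        for g in (0, 1):
            mask = X_test[:, 1] == g
            acc = accuracy_score(y_test[mask], model.predict(X_test[mask]))
            print(f"trained on {name}, group {g}: accuracy {acc:.2f}")

The model that barely saw group 1 learns group 0's decision boundary and tends to botch group 1, while the one trained on a representative mix doesn't need "correcting" afterwards.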

What seems to happen is that the carefully developed model (already sold to and in use by LEOs) simply doesn't work on the less biased data. Hilarious examples abound, like photos of black people being matched to gorillas, and models so reliant on skin tone that being too white, too black, too orange, or too heavily Instagram-filtered renders the matching pointless.

The much less funny aspect is that LEOs are already using this technology, then lying about it. It's been termed "evidence laundering": using a new (and legally untested) technology to make a match, then claiming it was done with a traditional (and accepted) method. In this case, a facial recognition match is claimed to have in fact been a trawl through mugshots.

I personally don't object to this being used as an investigative tool to try and identify an unknown suspect, but what worries me is that it's being used as the sole identifying evidence for convictions. An undercover LEO snaps a couple of cellphone pics, the software says it's you, and bish bash bosh, eight years for drug dealing.
