Silly Google's Photos app labelled black people as gorillas

John H Woods Silver badge

What neural networks think ...

Check out this Google Research Blog post, where they gain some insight into what ANNs (artificial neural networks, or "AI") have actually learned by feeding them random noise or images of clouds and (simplifying here) "asking them" to identify buildings or animals.
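The trick behind that blog post (often called "inceptionism" or feature visualisation) is gradient ascent on the *input*: start from noise and nudge the pixels to maximise the network's score for a chosen class. Below is a minimal sketch of that idea, assuming a toy linear scorer in place of a real trained convnet; all the names, shapes, and numbers here are invented for illustration, not Google's actual method.

```python
import numpy as np

# A toy stand-in for a trained network: a single linear layer whose rows
# act as class templates. (The real blog post used a deep convnet; the
# shapes and class count here are made up for illustration.)
rng = np.random.default_rng(0)
n_pixels, n_classes = 64, 3
W = rng.normal(size=(n_classes, n_pixels))  # "learned" weights

def class_score(x, c):
    """The network's confidence that flattened 'image' x is class c."""
    return W[c] @ x

def visualise_class(c, steps=100, lr=0.1):
    """Gradient-ascend a random-noise 'image' to maximise class c's score.
    For a linear scorer, the gradient of the score w.r.t. the input is
    simply W[c]; a deep net would need backprop to compute it."""
    x = rng.normal(size=n_pixels)    # start from random noise
    for _ in range(steps):
        x += lr * W[c]               # step up the score's input-gradient
        x = np.clip(x, -3.0, 3.0)    # keep 'pixel' values bounded
    return x

dream = visualise_class(0)
print(class_score(dream, 0))  # far higher than any random input scores
```

The resulting input is, in effect, the network's answer to "what does class 0 look like to you?" — which is exactly why running it against animal classes exposed what visual features the model had actually latched onto.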

The identification is, without the G-word, one of dark-skinned higher apes and, on a naive level, this is not really a failure: the gorillas, the chimpanzees and the bonobos are our closest living relatives. And by close, I mean really close, on a deep genetic level. The connotations of the word are terrible, but that is because of centuries of human racism, not because ANNs (or Google) are "racist".

The reason white people aren't identified as such is that we are the mutants who lost the ability to produce large amounts of melanin, resulting in a very obvious visual difference: one which, to an ANN, can appear much more significant than it really is. In fact, it just means we tolerate cooler climes somewhat better and intense sunlight a hell of a lot worse. They'll have pulled the ANN by now, but I'll bet that a 'negative' of a group of white people would have produced the same result.

Where other visual indicators are more significant, the ANN picks those up instead. Note that, despite the subject not being white, the last picture in the tweet is correctly identified as a graduation.
