TLDR; Bikini pic promised...
... no bikini pic :-(
The standards in internet journalism have fallen... I guess it's back to 4chan
Today's artificial intelligence can autocomplete a photo of someone's face, generating what the software predicts is the rest of their body. As an academic paper pointed out, though, these neural networks are biased, presumably from their training data. That means when you show this code a woman's face, it's likely to …
> Although controversial HR startup HireVue canned a facial analysis feature in its software that assesses the potential performance of job candidates
Who the feck thought that was a good idea, even as a gimmicky unique-selling-point? It's basically digital phrenology.
The thing with facial recognition (and by extension, analysis) is: it's an amazingly clever tool, and one that we as a species seem intent on proving we're not nearly responsible enough to actually wield.
Mind you, I always thought the habit of putting your mugshot on your CV was weird as hell anyway.
I was reduced to a giggling fit at one place I worked when a user's Outlook profile flashed up his face. Once I had recovered sufficiently, I then had to explain it to my Canadian colleagues.
The user in question had decided his profile pic should be a very mullet-haired Pat Sharp!
Personally, I think the example highlights the kind of thing that is possible and that will be done. The history of the depiction of women is long and complicated, but while there is no doubt that much of it is sexualised (both nakedness and the lack of it), there's also the observation that women pay more attention to how they and other women look, in all kinds of situations, than men do. Nature vs. nurture, of course, but the correlation is long-standing. "The aesthetic of the look" is one description I've come across, where it relates closely to what people are thinking about each other – the kind of complicated thinking that my little brain doesn't seem to engage in very often – but that's by the by. Fan fiction is one of the areas most studied in this respect.
> That means when you show this code a woman's face, it's likely to autocomplete her in a bikini or other revealing clothes. White people tend to be shown holding tools while Black people are pictured holding weapons.
Where I live this is quite normal, and in the southern USA too.
Are black people not more likely to be from a working-class background, and therefore holding tools?
And white people more likely to be working for the police / military, and therefore holding guns?
Note, this is a statistical observation of the impacts of structural racism, not a suggestion that certain people are more or less suitable for these positions.