Re: Amazon’s sexist AI
>>its just throwing out matches that reflect society as it really is. So if one considers this is a problem, then the need is to change society, not the AI.
We're working on it.
As for politically correct results, if you have concrete, objective metrics, that should assuage HR.
If you *don't* then how do you do a performance review?
It is politically incorrect to imply that a certain race/gender is inherently better or worse at something. To paraphrase a study relating to gender: "the differences *within* a group of men or women are bigger than the differences *between* the groups."
I'd want to be judged on how likely I am to do the job well, rather than have a droid (either AI or HR) decide that my not using "executed" on a CV (because I'm an adult) affects whether I get an interview.
The thing about anonymised applications is that you shouldn't know whether you're discounting a particular demographic. Have a look at the GDS application criteria: they flat out tell you to remove references to sex, sexuality, age or religion from the "Tell us about your skills" section of the application (although some of that could be inferred).
Also, if you're telling the AI to select people similar to employee A (a hypothetical straight white cisgendered male), then you're going to get selectees from the same demographic.
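That "similar to employee A" failure mode is easy to demonstrate. A toy sketch (entirely hypothetical data and feature names, not Amazon's actual model): rank applicants by distance to a reference employee, and seemingly innocuous CV features act as demographic proxies.

```python
# Hypothetical toy data: each applicant is (name, demographic, feature vector).
# The features stand in for things a CV model might latch onto -- word
# choices like "executed", club memberships -- proxies, not job ability.
applicants = [
    ("A1", "group_x", [1.0, 0.9, 0.1]),
    ("A2", "group_x", [0.9, 1.0, 0.2]),
    ("A3", "group_y", [0.1, 0.2, 1.0]),
    ("A4", "group_y", [0.2, 0.1, 0.9]),
]

reference = [1.0, 1.0, 0.0]  # hypothetical "employee A", from group_x

def distance(u, v):
    # Plain Euclidean distance between feature vectors.
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

# Shortlist the 2 applicants most similar to employee A.
ranked = sorted(applicants, key=lambda a: distance(a[2], reference))
selected = ranked[:2]
print([name for name, _, _ in selected])  # -> ['A1', 'A2']
print({demo for _, demo, _ in selected})  # -> {'group_x'}
```

Both shortlisted candidates come from employee A's demographic, even though the selector never looked at the demographic column directly.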
As other commenters have said - GIGO: garbage in, garbage out.