Cart before Horse?
It seems the complaint is that it uses 'sexist' clues in the names to try to predict gender. But, assuming it's a proper learning tool, that's ass backwards. It would have been trained on a whole set of names with associated genders, and if it picked up cues like 'nurse' as signaling 'female', it's not because it's somehow biased; that's what the information it was trained on told it.
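To make that concrete, here's a minimal sketch (with made-up toy data) of what such a tool does at its core: it counts which label each cue co-occurred with in training, and 'predicts' the majority. There's no sexism in the code, only frequencies.

```python
from collections import Counter, defaultdict

# Hypothetical training pairs of (cue word, labeled gender), skewed
# the way real-world data often is. The numbers are illustrative only.
training = (
    [("nurse", "female")] * 9 + [("nurse", "male")] * 1 +
    [("engineer", "male")] * 8 + [("engineer", "female")] * 2
)

# Tally how often each cue appeared with each label.
counts = defaultdict(Counter)
for word, gender in training:
    counts[word][gender] += 1

def predict(word):
    # The "prediction" is just the most frequent label seen in
    # training -- the model only reflects the data it was given.
    return counts[word].most_common(1)[0][0]

print(predict("nurse"))     # -> female
print(predict("engineer"))  # -> male
```

Change the skew in the training data and the 'sexist' predictions flip with it, which is the point: the cue came from the data, not the algorithm.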
Oftentimes our claims of bias are just wishful thinking. We want more status, so we're hesitant to be associated with what we think are -- what our prejudices tell us are -- low-caste jobs. The machine is just highlighting our own bias.