Meaning of Bias?
The word "bias" seems to mean more than one thing within this article. There is the desire to ensure that facial recognition systems perform equally well on non-white faces, a problem which can presumably be solved by focusing training on such underrepresented faces, as the article suggests.
The second, more interesting usage is bias in terms of subjective qualities, for example the neural network that measures women's beauty. A subjective quality like beauty is inherently "biased" - being in the eye of the beholder, and all that. However, that does not mean that meaningful statistical predictions cannot be made. The insurance industry is built upon such statistical modelling, after all. An insurance company will be "biased" against an 18-year-old man from a poor estate with a turbocharged car when he attempts to get car insurance - even if he is a safe driver. So sometimes bias is accepted, but when it comes to a machine judging women's looks, it is considered a problem.
I'm not sure why this is. My only thought is that it's not the application itself, but the perceived reason behind its being written (geeks getting uppity, judging higher-status women with their technological wizardry, doubtless cackling and rubbing their hands as they do so). I think the collision between big data + AI prediction and modelling, and "right-thinking" people's beliefs, is going to be highly popcorn-worthy.