If we can't get bias out of society, how do we get it out of algorithms?
It could be argued that the same mechanism is behind why many people refer to an unknown computer programmer as "he" or assume a gang member arrested in a low-income neighborhood is black. Our biases are a product of our exposure to information, so unless algorithms can be given a source of information other than the real world, I'm not sure how easy this will be to solve.
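To make the point concrete: a model that simply learns frequencies from real-world data will reproduce whatever skew that data contains. Here's a minimal sketch with entirely made-up numbers (the data, the "programmer" role, and the 90/10 split are all hypothetical) showing a naive majority-class "predictor" inheriting the bias of its sample:

```python
from collections import Counter

# Hypothetical, skewed training sample: 90% of programmer
# examples are labeled "he", 10% "she".
training = [("programmer", "he")] * 90 + [("programmer", "she")] * 10

def train(pairs):
    """Majority-class 'model': pick the most common label per role."""
    counts = {}
    for role, label in pairs:
        counts.setdefault(role, Counter())[label] += 1
    return {role: c.most_common(1)[0][0] for role, c in counts.items()}

model = train(training)
print(model["programmer"])  # prints "he" -- the sample's skew becomes the model's bias
```

Nothing in the training procedure is "biased" on its own; the skew comes entirely from the data it was exposed to, which is exactly the problem with feeding algorithms real-world information.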
Rather than trying to feed an algorithm PC-approved, cleansed data, maybe it should be taught about bias. Oh wait, that would require these "AIs" to actually have some artificial intelligence, instead of just being large relational databases with clever input methodology.