To be fair, the article does not do a great job of focusing on the unwanted part of the bias issue.
Obviously the system is doing its job really well: it detects who is reacting to the ad and serves the ad to them. The fact that it's completely impossible to make it do what you actually want when your goals are not just "maximizing profit" may be a problem, true, but that's a problem with every modern system. Try to browse for a specific movie on Netflix, search for a specific spelling of a word on Google, filter Amazon search results for a specific product, see the most recent item in your Facebook timeline...none of these things are really possible any more because the algorithms are designed to steer you towards buying product, and have no other purpose.
This is now the norm. Facebook's ad system is highly optimized to do one thing, and it does it well. The fact that this optimization also inflexibly reinforces existing biases is a problem, but solving that problem will never be a goal. You can force them to add a second mode that bypasses the optimization, but the prejudice-reinforcing algorithm will always be used for normal operation, because there's more money in it, and to a corporation there is no greater good.