The source data isn't necessarily biased or wrong. When information talks about, say, Dr. John Smit the surgeon and refers to him as "he", that isn't bias.
It is Google's selection of data, its training dataset, that carries the bias.
That said, given the apocalyptic errors it can make with translations, I would think this gender bias is a very minor point.
A few years back, Google Translate would ignore the word "not" when translating from English to German, so "do not open the case" would be translated as "das Gehäuse öffnen" (open the case). For example:
"Do not open the case, high voltage inside" = "Gehäuse öffnen, Starkstrom drinnen" (Open the case, high voltage inside)
"Do not open the case, no user serviceable parts inside" = "Gehäuse öffnen, nichts drinnen" (Open the case, nothing inside)
I actually submitted the correct translations via the feedback form, and it has improved since. But I would think such errors should take priority, as they can be downright dangerous.