Nuke it from Orbit
An ML algo with a non-public training dataset is several orders of magnitude worse than a piece of closed-source code, because with code you (mostly) have to explicitly include the bias (`if ethnicity <> 'white' goto stopandsearch`). With ML the bias is implicitly generated by problems with the training dataset, **as well as** any explicit bias in the algo spec.
Failing a legal mandate to expose the training dataset, there should be a standardised test dataset, with performance against expected norms documented and signed off before production use. This is what happens when tech outruns the legislation.
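A minimal sketch of what that sign-off gate could look like: run the model over a standardised test set and refuse deployment if any group's outcome rate strays too far from the overall norm. Everything here (function names, the `max_disparity` threshold, the toy model) is an illustrative assumption, not an existing standard or library.

```python
# Hypothetical pre-production sign-off check: evaluate the model on a
# standardised test dataset and compare per-group outcome rates to an
# expected norm. Names and threshold are illustrative assumptions.

def positive_rate(predictions):
    """Fraction of positive (e.g. 'stop and search') decisions."""
    return sum(predictions) / len(predictions)

def sign_off(model, test_set, max_disparity=0.05):
    """test_set: list of (features, group_label) pairs.
    Passes only if no group's positive rate deviates from the
    overall positive rate by more than max_disparity."""
    by_group = {}
    all_preds = []
    for features, group in test_set:
        pred = model(features)
        by_group.setdefault(group, []).append(pred)
        all_preds.append(pred)
    overall = positive_rate(all_preds)
    report = {g: positive_rate(p) for g, p in by_group.items()}
    passed = all(abs(r - overall) <= max_disparity for r in report.values())
    return passed, report

# A deliberately biased toy model, to show the gate catching it:
biased_model = lambda f: 1 if f["ethnicity"] != "white" else 0
test_set = ([({"ethnicity": "white"}, "white"),
             ({"ethnicity": "other"}, "other")] * 10)
ok, report = sign_off(biased_model, test_set)
print(ok, report)  # False — per-group disparity exceeds the norm
```

The point isn't the specific metric (a real regime would mandate which ones); it's that the check is mechanical, documented, and blocks release when the numbers don't meet the published norm.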