... is not always possible, even with the best intentions.
... with simple statistics, only the STEMmers will understand
... with complex statistics, only statisticians will understand
... with AI, nobody will understand.
Someone can tell you the architecture of their AI, and even all the weights of the trained network, but that doesn't tell you why it makes any particular decision. Perhaps we have to wait until AI is conscious enough to explain itself. I'm not hopeful, though: as my late father used to say, 95% of human rationality is spent constructing convincing explanations for decisions people have already made on gut feel.