"OK, everyone can call me naive and shoot me down in flames, but is there any reason why an AI CAN'T tell you why it has made a particular decision?"
In every situation, it can. BUT that does not mean you would understand the answer. Suppose the decision rests on some complex mathematical relationship with a changing natural phenomenon. By the time you get around to asking "but why?", the natural phenomenon that formed part of the input to the decision is gone, never to be reproduced, and the mathematical equation from which the result was derived might take you four years of college-level mathematics to understand.
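The irreproducible-input point can be sketched in a few lines of Python. This is purely illustrative: the `sensor_reading` and `decide` names are hypothetical, and the "phenomenon" is just a random draw standing in for a transient real-world measurement.

```python
import random

def sensor_reading():
    # Stand-in for a transient natural phenomenon: a value that
    # exists only at the moment of the decision and is never stored.
    return random.gauss(0.0, 1.0)

def decide(threshold=0.5):
    # The "why" of this decision is just this arithmetic, but the
    # input value is discarded as soon as the function returns.
    x = sensor_reading()
    return x > threshold

decision = decide()
# The system can truthfully explain "I decided because x > 0.5",
# yet x was never recorded, so the explanation cannot be checked
# or the decision reproduced.
```

Real systems replace the one-line rule with millions of learned parameters, which only makes the explanation harder to follow, not any less true.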
YES. AI can tell you why it made the decision. But you will not understand, nor will you be able to check, the answer.