It didn't decide to do anything. The input photo is all the data collected about you. The output photo might be a single pixel describing your credit rating. And the filter is the entirety of the program.
One of us isn't getting this, and I don't think it's me!
To derive a description of my credit rating from all the data about me, the program/filter/macro/neural net/AI must have followed a finite number of steps of sequence, selection and iteration.
All I'm asking is why people think those steps cannot be logged and output - i.e. why the AI cannot explain how it arrived at an outcome.
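To make the point concrete, here's a minimal sketch of what such a log could look like for a toy one-neuron "credit score" model. The inputs, weights, and the clamp at the end are all hypothetical, invented just for illustration; a real model would have vastly more of these steps, but they are the same kinds of steps.

```python
# Toy "credit rating" calculation that logs every arithmetic step.
# All inputs and weights are made-up illustrative values.
inputs = {"income": 0.6, "debt": 0.3, "age": 0.4}
weights = {"income": 0.8, "debt": -0.5, "age": 0.1}

log = []
total = 0.0
for name, x in inputs.items():          # iteration
    contribution = weights[name] * x    # sequence of multiply-adds
    total += contribution
    log.append(f"{name}: {weights[name]} * {x} = {contribution:+.2f}")

score = max(0.0, total)                 # selection (clamp at zero)
log.append(f"score = max(0, {total:.2f}) = {score:.2f}")

for line in log:
    print(line)
```

Every step the program took is in the log. Whether a log of millions of such steps counts as an "explanation" a human can use is a separate question from whether it can be produced at all.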