This is why we have CS courses
This sort of response was covered as part of CompSci courses back in the 1980s, for bleep's sake: a key part of any learning mechanism is that you have no idea what or how it is going to learn[1]; you just keep your fingers crossed that it will actually manage to learn *something*. Back in the '80s, when machine resources were far scarcer than today, it was often painfully obvious that the system had gone down an inefficient route and couldn't achieve any of your goals for it, so you scrubbed it and tried again; nowadays you just fling more cycles and memory at it.
Genetic algorithms will happily recreate the vagus nerve's recurrent laryngeal branch, with its ridiculous loop down and back up again. Any signal into a system may be given "unexpected" importance, higher or lower than *you* expected (because you think you are looking at the forest, but the model can't even see the trees, just the leaves): a system that is "punished" will try to optimise away the punishment, and simply ignoring that input is a perfectly good way to do it.
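For anyone who wants that in ten-odd lines: here is a deliberately contrived little toy of mine (nothing from the article, all the names and numbers are made up), assuming the punishment only reaches the fitness function through a sensor gain that the genome itself controls. The point is only that evolution's cheapest move is to turn that gain down to zero, not to stop the punished behaviour.

    # Toy sketch (hypothetical, contrived on purpose): a genetic algorithm
    # where the "punishment" is felt only through an evolved pain_gain.
    # Evolution quietly switches the punishment channel off.
    import random

    random.seed(1)
    POP, GENS, MUT = 60, 200, 0.1

    def fitness(genome):
        work_rate, pain_gain = genome
        reward = work_rate              # doing more of the behaviour pays off
        punishment = 0.5 * work_rate    # the behaviour we *wanted* to discourage
        felt = punishment * pain_gain   # ...but it only hurts via the evolved gain
        return reward - felt

    def mutate(genome):
        return [max(0.0, g + random.gauss(0.0, MUT)) for g in genome]

    pop = [[random.random(), random.random()] for _ in range(POP)]
    for gen in range(GENS):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:POP // 2]      # truncation selection
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]

    best = max(pop, key=fitness)
    print(f"work_rate={best[0]:.2f}  pain_gain={best[1]:.2f}")
    # Typical outcome: work_rate keeps climbing while pain_gain collapses
    # toward 0 -- the punishment input has simply been optimised away.

Obviously a real setup is messier, but the failure mode is the same shape: the learner fixes the *signal*, not the behaviour.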
The reported stories about AI programs used to be specifically about these weird results, such as the evolved analogue circuits that "ought not to work" because the program had optimised some weird arrangement of parts that took advantage of an oscillation that human designers work hard to get rid of.[2]
In other words, do the bleeping background reading before trying to build a system![3]
In other other words, blasted whippersnappers, get off the hole where my lawn used to be!
[1] Or you would just, you know, program it directly.
[2] And then how the humans' approach turned out to be the useful one, as it allows building blocks to be created and assembled into bigger systems, whereas the "clever" evolved design had all the parts interacting with each other and couldn't scale up.
[3] Wasn't there a time when scholarship in the military was a thing of pride? The design, success, and failure of everything from strategy to tactics to ordnance? From the importance of land surveys to waterproof boots?