"The idea that correlation is sufficient, that it gives you some kind of meaningful causal structure, is not true."
It's also not relevant, for two reasons.
First, and most importantly, we don't have any meaningful causal structure explaining how WE are conscious and intelligent (and by "we" I mean the 0.000001% of humanity I have any reason to consider conscious and intelligent). Nobody - not any of these philosophers or neuroscientists or "experts" - can tell me HOW my own brain produces what I perceive as my own consciousness, so "causal structure" is absent for humans too. Just saying "brains done it" isn't an answer.
Second, I don't need a causal structure, or any understanding of WHY something works, to build something that DOES work. Others have pointed out the distinction between science and engineering, and AGI doesn't need to be a product of science. Yes, boffins, by all means sit around for the next 200 years discussing why brains do what they do and what it all means. But don't expect the engineers to wait for you to reach a conclusion before they build a machine that behaves, to all intents and purposes, with human-level intelligence, because they don't need to.
Chess is the perfect example: we still don't know IN DETAIL how Magnus Carlsen analyses a chess position. Sure, you can spout platitudes about pattern recognition and so on, but how does his actual brain do that? We have no idea. And nobody NEEDS any idea to build a machine that can beat him. Rinse and repeat for Go, and for the bar exam. The machines are getting smarter faster than the world is getting more complicated.
Finally, while there's no accepted definition of what NATURAL general intelligence really is, that doesn't matter either. All people care about is "does this tool work?" Right now, the tools don't work in a whole host of applications. But they're only going one way.