Re: Reconsider?
What you propose would be the most accurate way to reproduce how the human brain works. It's also horrendously inefficient. It's even worse than you calculate: it would simulate 1 second of the activity of 100% of the brain while taking 40 minutes of wall-clock time. So the power needed to simulate it in real time would go up by another factor of 2400 (there are 2400 seconds in 40 minutes).
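To spell that factor out, here's a minimal sketch; the 40-minutes-per-second figure is the one from the calculation above, while the 20 MW baseline is just a hypothetical placeholder for whatever such a machine actually draws:

```python
# Minimal sketch of the real-time scaling argument. The 40-minutes-per-
# second figure is from the calculation above; the 20 MW baseline is a
# hypothetical placeholder, not a measured number.
sim_wall_clock_s = 40 * 60    # 40 minutes of wall-clock time...
simulated_time_s = 1          # ...per 1 second of simulated brain activity

slowdown = sim_wall_clock_s / simulated_time_s
print(f"Slowdown factor: {slowdown:.0f}x")               # 2400x

# Running in real time on the same architecture needs roughly 'slowdown'
# times the compute, and hence roughly 'slowdown' times the power.
baseline_power_mw = 20                                   # hypothetical
realtime_power_mw = baseline_power_mw * slowdown
print(f"Real-time power: ~{realtime_power_mw:,.0f} MW")  # ~48,000 MW
```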
As said before, simulating a modern-day CPU on a supercomputer (not even physically simulating it, just simulating the RTL/Verilog description running the code) is horrendously slow and power hungry, and there too you get on the order of seconds of work done per hour (my very rough estimate).
https://aiimpacts.org/brain-performance-in-flops/ gives some good insights. For equivalent calculation power (and IMO it tries to account for the overhead of distributing the results calculated by artificial synapses through artificial neurons), it still offers only a fairly rough estimate:
"Drexler 2018
Drexler looks at multiple comparisons between narrow AI tasks and neural tasks, and finds that they suggest the ‘basic functional capacity’ of the human brain is less than one petaFLOPS (10^15).
Conversion from brain performance in TEPS
Among a small number of computers we compared, FLOPS and TEPS seem to vary proportionally, at a rate of around 1.7 GTEPS/TFLOP. We also estimate that the human brain performs around 0.18 – 6.4 * 10^14 TEPS. Thus if the FLOPS:TEPS ratio in brains is similar to that in computers, a brain would perform around 0.9 – 33.7 * 10^16 FLOPS. We have not investigated how similar this ratio is likely to be."
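For anyone who wants to check the arithmetic in that quote, here it is spelled out; every input below is taken from the quote itself, and the small mismatch with their 0.9 – 33.7 * 10^16 range is presumably rounding on their side:

```python
# Spelling out the quoted TEPS -> FLOPS conversion; all inputs come from
# the AI Impacts quote above, nothing here is my own measurement.
gteps_per_tflops = 1.7                             # 1.7 GTEPS per TFLOPS
flops_per_teps = 1e12 / (gteps_per_tflops * 1e9)   # ~588 FLOPS per TEPS

brain_teps_low, brain_teps_high = 0.18e14, 6.4e14  # estimated brain TEPS

print(f"Low:  {brain_teps_low  * flops_per_teps:.2e} FLOPS")  # ~1.06e+16
print(f"High: {brain_teps_high * flops_per_teps:.2e} FLOPS")  # ~3.76e+17
```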
As said in the post above, the $70,000 GB200 (with an excessive profit margin for Nvidia due to lack of competition) delivers 4 * 10^16 FP4 FLOPS. That's bang in the middle of the estimated 0.9 – 33.7 * 10^16 FLOPS range, at around 1 kW. The architecture of a GPU / AI accelerator would likely need additional overhead to achieve "brain intelligence functionality", but that's still way below nuclear-power-plant levels. In fact, a modern-day electric car could power it for days while also providing it with mobility and concealment. Few would suspect it's a rogue AI sneaking away.
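A quick sanity check of that comparison; the 4 * 10^16 FP4 FLOPS and ~1 kW figures are the ones claimed in this thread, not official specs, and the 75 kWh pack is a hypothetical mid-size EV battery:

```python
# Sanity check of the GB200 comparison. The 4e16 FP4 FLOPS and ~1 kW
# figures are the ones claimed in this thread, not official specs; the
# 75 kWh battery is a hypothetical mid-size EV pack.
gb200_flops = 4e16
brain_flops_low, brain_flops_high = 0.9e16, 33.7e16

in_range = brain_flops_low <= gb200_flops <= brain_flops_high
print(f"GB200 within the brain estimate range: {in_range}")   # True

chip_power_kw = 1.0
ev_battery_kwh = 75                                 # hypothetical EV pack
runtime_days = ev_battery_kwh / chip_power_kw / 24  # hours -> days
print(f"Runtime on one EV charge: ~{runtime_days:.1f} days")  # ~3.1 days
```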
Luckily, scientists and mega-corps and whoever else haven't yet figured out how to convert all that raw calculation power into actual human-like intelligence. But stating that the needed power and infrastructure would be so huge that it would not realistically be feasible, and that it would be very easy to locate and shut down due to its massive need for equipment and power... that's a bit too optimistic IMO.
My estimate is that if we knew today how to program and code it, we could get AGI on today's hardware, and it would be surprisingly frugal in terms of hardware and power.