The mad dash to secure and deploy AI infrastructure is forcing datacenter operators to reevaluate the way they build and run their facilities. In your typical datacenter, cold air is pulled through a rack full of compute, networking, and storage systems. At the back, the heated air is then captured and ejected by the facility's …
Charging a car at 56kW is a short-term activity (1hr tops), while a 15kW GPU in a DC will be running 24×7. So that 15kW GPU will use more electricity than the car on a daily basis, and of course, it's not just *one* 15kW GPU - it's an entire DC full of the things.
Even worse, a lot of this so-called "AI" computing appears to be just a huge waste of time and energy with no discernible useful output - unlike the car, which at least serves an obvious purpose in getting people from A->B (although that activity might be unnecessary).
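A quick back-of-envelope check of the claim above, using the 15 kW and 56 kW figures from the thread (one full charge per day for the car is my assumption, and a generous one):

```python
# Daily energy: one 15 kW GPU node running 24x7 vs one EV fast-charged daily.
GPU_KW = 15            # continuous draw of one DC GPU node (from the thread)
CAR_CHARGE_KW = 56     # fast-charge rate (from the thread)
CHARGE_HOURS = 1       # "1hr tops"

gpu_kwh_per_day = GPU_KW * 24                    # runs around the clock
car_kwh_per_day = CAR_CHARGE_KW * CHARGE_HOURS   # assumes a charge every day

print(f"GPU: {gpu_kwh_per_day} kWh/day, car: {car_kwh_per_day} kWh/day")
print(f"ratio: {gpu_kwh_per_day / car_kwh_per_day:.1f}x")
```

Even granting the car a daily fast charge, the always-on GPU node draws several times more energy per day.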
Charging a car at 56kW is a short-term activity (1hr tops), while a 15kW GPU in a DC will be running 24×7
When looked at in isolation, that is correct. But with millions of people charging their electric cars at different times, your assertion no longer holds: you would have formed a base load equivalent to N cars charging 24×7.
it's not just *one* 15kW GPU - it's an entire DC full of the things.
Yes, just as it won't be just one electric car.
Even worse, a lot of this so-called "AI" computing appears to be just a huge waste of time and energy with no discernible useful output - unlike the car, which at least serves an obvious purpose in getting people from A->B (although that activity might be unnecessary).
Apples and oranges. Saying it's a huge waste is a very ignorant comment. I am using AI daily and it has been the most useful tool I have encountered in my life.
There are a lot more computers¹ than there are vehicles, and computing power consumption is routinely close to 24×7 while vehicle power consumption isn't (possibly equivalent to 8×7), so even if every vehicle on the planet went electric, I don't think total vehicle power consumption would ever reach that of computing.
Note that I didn't say *all* "AI" is a waste. But the likes of ChatGPT aren't genuinely doing anything constructive, are they?
¹ I suspect there are probably more computers than just about any other single electrical/electronic device, other than smartphones (though those are really computers too).
yikes...
This Canadian man spent over $130,000 on an electric vehicle to be a “responsible citizen”... During a disastrous family road trip to Chicago, Bala had to have the truck towed, then rented a gas-powered vehicle to complete the trip.
Bala called EVs the “biggest scam of modern times.”
First, between the truck, a charging station for his home, a charging station for work, and an updated electric panel for his home, Bala spent $130,000 to go “green”...
During the 1,400-mile family road trip to Chicago, Bala and his family were beset with all sorts of problems.
Fox Business reported that “fast charging stations — which only charge EVs up to 90% — cost more than gas for the same mileage. On the family’s first stop in Fargo, North Dakota, it took two hours and $56 to charge his vehicle from 10% to 90%. The charge was good for another 215 miles.”
Two hours and $56 to charge a vehicle to go 215 miles do not make for a fun road trip.
It was only downhill from there.
Multiple charging stations malfunctioned when the vehicle had only 12 miles of charge left.
“This sheer helplessness was mind-boggling,” Bala explained. “My kids and wife were really worried and stressed at this point. By now it was late afternoon. We were really stuck, hungry, and heartbroken.”
That’s when Bala had the vehicle towed to a Ford dealership and rented a gas-powered car...These are the stories that the “green” activists refuse to share. https://tinyurl.com/4dsyetb7
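For what it's worth, the quoted road-trip figures pin down the EV's fast-charge cost per mile; the gas-truck comparison numbers (20 mpg, $3.50/gal) are my own assumptions, not from the story:

```python
# Cost per mile from the figures quoted in the article above.
ev_cost = 56.0    # dollars for the 10%->90% fast charge in Fargo
ev_range = 215.0  # miles that charge was good for

gas_mpg = 20.0    # assumed truck fuel economy
gas_price = 3.50  # assumed dollars per gallon

ev_per_mile = ev_cost / ev_range
gas_per_mile = gas_price / gas_mpg

print(f"EV fast charge: ${ev_per_mile:.2f}/mile, gas: ${gas_per_mile:.2f}/mile")
```

Under those assumptions the fast-charge cost per mile does come out higher than gas, which matches the Fox Business claim quoted above.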
A car is larger than a GPU, in case you haven't noticed.
Also, charging a battery does not dissipate the power; it stores it (well, the vast majority of it), so there is only a modest amount of dissipation, while the GPU dissipates all of that power and turns it into heat.
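Roughly, assuming a charger/battery round-trip efficiency of about 92% (my assumption; real losses vary, and at 56 kW the losses during the charge itself can be kilowatts rather than hundreds of watts), the daily heat comparison still comes out heavily one-sided:

```python
# Heat actually dissipated: charging losses vs a GPU node's full draw.
CHARGE_KW = 56        # fast-charge rate (from the thread)
CHARGE_HOURS = 1
CHARGER_EFF = 0.92    # assumed AC-to-battery efficiency

GPU_KW = 15           # one DC GPU node (from the thread)

car_heat_kwh = CHARGE_KW * CHARGE_HOURS * (1 - CHARGER_EFF)  # only the losses become heat
gpu_heat_kwh = GPU_KW * 24                                   # all of it becomes heat

print(f"car charging heat: {car_heat_kwh:.1f} kWh/day, GPU heat: {gpu_heat_kwh} kWh/day")
```

The battery stores most of the charging energy for later use, while the GPU converts every watt-hour it draws into heat the facility must then remove.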
Is this another Musk-level brain fart?
It's a datacenter. It uses ungodly amounts of power, generates equally ungodly amounts of heat, and that heat needs evacuating. Oh, and on the side, it may be somewhat useful.
Building it with the latest in-house doo-dad doesn't change that situation.
Call me when you've invented a datacenter that doesn't need cooling. THAT will be first-of-its-kind.
Comparing the power consumption of multiple DCs crammed full of GPUs drawing 15kW or so apiece with the power required to charge electric cars is not what this is about. Yes, energy consumption is an issue, but history shows that over time increases in density are offset by improvements in energy efficiency. ENIAC drew 175kW in 1945 and was royally outperformed by the Sinclair ZX80 35 years later.
Right now the issue is DC thermal management. Cooling requirements are reaching the point where operators struggle to shift the heat away from the computational hardware, and cooling is close to becoming more expensive than said hardware. On the other hand, capturing the waste heat and recovering the energy from it is becoming a more and more feasible prospect.
At some point there will be legal mandates for energy recovery for Data Centres. I'm aware that there are already schemes where this heat is piped out to the local municipality to heat local homes.
However, building facilities that are more efficient at recycling the energy is going to be a challenge in terms of both design and cost. This would indicate the need for more liquid cooling to be able to capture the exhaust heat. What's not going to decrease is our appetite for compute power.
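To put the district-heating idea in very rough numbers — everything here except the 15 kW per-GPU figure from the thread is my assumption (hall size, capture fraction, per-home heat demand):

```python
# Rough sizing of waste-heat recovery from one GPU hall for district heating.
GPU_KW = 15                 # per GPU node (from the thread)
N_GPUS = 1000               # a modest AI hall (assumed)
CAPTURE = 0.7               # fraction of waste heat a liquid loop might recover (assumed)
HOME_KWH_PER_YEAR = 10_000  # typical annual space-heating demand per home (assumed)

waste_kwh_year = GPU_KW * N_GPUS * 24 * 365 * CAPTURE
homes_heated = waste_kwh_year / HOME_KWH_PER_YEAR

print(f"recoverable heat: {waste_kwh_year / 1e6:.0f} GWh/yr, ~{homes_heated:.0f} homes")
```

Even with a conservative capture fraction, a single hall's waste heat could plausibly warm thousands of homes, which is why liquid cooling (which delivers the heat at usable temperatures) keeps coming up in these discussions.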