What are they doing with it?
Matter is neither created nor destroyed.
So what are these datacentres doing with all the water they are "using"?
Concerns over the environmental impact of datacenters in the US state of Virginia are being raised again amid claims their water consumption has stepped up by almost two-thirds since 2019, and AI could make it worse. Virginia is described as the datacenter capital of the world, particularly Northern Virginia – as reported by …
Yes, the water cycle still exists, but depending on where the water comes from, the aquifer might not be able to sustain that kind of usage. In many places the aquifer is not refilling as quickly as it is being depleted.
Plus they're wasting treatment chemicals and the energy used to purify water to drinking standard, only for it to go to anything other than drinking.
Solutions to this would include using a closed-loop cooler, switching to so-called "grey water" systems (water that is filtered but not potable), and/or using the heated "waste" water for something else after it's cooled all those chips, so it is at least used twice before discharge.
<blockquote>So what are these datacentres doing with all the water they are "using"?</blockquote>
My guess is that they don't run the warm water through a radiator or cooling tower to cool it off and then recirculate it. IOW, they only use water as an occasional backup cooling mechanism and it comes out of the tap, through the computers or heat-exchangers, and then down the drain.
Half the plumbing, half the installation expense! Water's free, right?
Algae - like all plants - respire at night, removing oxygen from the water. Combined with high water temperatures, that can mean oxygen levels in the water are low enough by dawn to kill fish and anything else that relies on it. If the warm water causes algal blooms, then in addition to the reduction in oxygen, the resulting loss of sunlight can kill other plants and animals. And if there is a large amount of algae, they also consume oxygen as they rot when they die off, exacerbating the above situations.
TLDR - sudden increases in algae in rivers and lakes can fuck them up bigly.
Algal bloom deoxygenation is actually rather interesting and complex. It's less to do with them not being a net contributor of dissolved oxygen and more to do with their subsequent decomposition, as well as them depriving more efficient oxygenators of sunlight.
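As a rough illustration of the temperature side of this, here's a small sketch using approximate dissolved-oxygen saturation values for fresh water (figures rounded from standard solubility tables; the warm-discharge scenario is my own assumption, not from the thread):

```python
# Approximate dissolved-oxygen saturation of fresh water at sea level,
# rounded from standard solubility tables (mg/L). Warmer water simply
# holds less oxygen, before algal respiration or decomposition take any.
DO_SATURATION_MG_L = {
    5: 12.8,
    10: 11.3,
    15: 10.1,
    20: 9.1,
    25: 8.3,
    30: 7.6,
}

# Hypothetical example: a stretch of river warmed from 15 C to 25 C by
# heated discharge loses a chunk of its oxygen headroom before any
# night-time respiration or decomposition is accounted for.
before, after = DO_SATURATION_MG_L[15], DO_SATURATION_MG_L[25]
print(f"Saturation at 15 C: {before} mg/L, at 25 C: {after} mg/L "
      f"({100 * (before - after) / before:.0f}% less headroom)")
```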
They use it to operate chillers, which evaporate some of the water in cooling towers to expel heat - that evaporated water needs to be replaced.
It's just unfortunate that, due to the lack of appropriate pipe infrastructure in cities (except in places like Arizona where water is truly scarce), there's no alternative to supplying them with fully treated water. They don't need water that pure; many chillers can operate on grey water or untreated river water. So even in places where drought is almost never a concern, enough datacenters in one place can be a problem simply because of the additional load they place on your treatment plant to treat and supply more water.
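A rough back-of-the-envelope sketch (my numbers, not from the article) of why evaporative chillers draw so much make-up water: rejecting heat by evaporation costs roughly the latent heat of vaporisation, about 2.26 MJ per kg of water, plus some extra lost to blowdown.

```python
# Rough estimate of evaporative make-up water for a datacenter heat load.
# Assumptions (mine, not from the article): all heat is rejected by
# evaporation, latent heat of vaporisation ~2.26 MJ/kg, and ~20% extra
# water is drawn to cover blowdown and drift.

LATENT_HEAT_MJ_PER_KG = 2.26   # energy removed per kg of water evaporated
BLOWDOWN_FACTOR = 1.2          # assumed extra draw for blowdown/drift

def makeup_water_litres_per_hour(heat_load_mw: float) -> float:
    """Litres of make-up water per hour to reject heat_load_mw by evaporation."""
    mj_per_hour = heat_load_mw * 3600            # MW -> MJ/h
    kg_evaporated = mj_per_hour / LATENT_HEAT_MJ_PER_KG
    return kg_evaporated * BLOWDOWN_FACTOR       # 1 kg of water ~ 1 litre

if __name__ == "__main__":
    for mw in (1, 10, 50):
        print(f"{mw:>3} MW -> ~{makeup_water_litres_per_hour(mw):,.0f} L/hour")
```

At 1 MW that works out to roughly 1,900 litres an hour, which is why the make-up supply becomes an issue even where drought rarely is.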
The water use mentioned corresponds to 5,677 acre-feet of water per year. (The acre-foot is a commonly used agricultural unit in the US.)
The total amount of water used for irrigation in the US each year is on the order of 90 million acre-feet. Not much of that is in Virginia, though.
Maybe datacenter water usage is a serious local issue.
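For anyone who doesn't think in acre-feet, a quick conversion sketch; the 5,677 acre-feet figure is from the post above, and the ~325,851 US gallons per acre-foot conversion is just the standard definition of the unit.

```python
# Put the quoted 5,677 acre-feet/year figure into more familiar units.
GALLONS_PER_ACRE_FOOT = 325_851      # US gallons in one acre-foot
LITRES_PER_US_GALLON = 3.785

acre_feet_per_year = 5_677
gallons = acre_feet_per_year * GALLONS_PER_ACRE_FOOT
litres = gallons * LITRES_PER_US_GALLON

print(f"{acre_feet_per_year} acre-feet/year ≈ {gallons / 1e9:.2f} billion US gallons")
print(f"                     ≈ {litres / 1e9:.1f} billion litres per year")
```

Call it roughly 1.85 billion US gallons a year: a rounding error against national irrigation use, but a lot to ask of one locality's supply and treatment works.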
Normal water cooling doesn't use much water at all; you just use it to transfer the heat away from the hot bits efficiently to somewhere you can easily cool it, like a radiator.
Do datacentres not do that? Are they just piping it straight back out again, because it's easier and cheaper? Or perhaps they *do* in fact reuse it, but the increase in datacentre water cooling just means more use in general.
More to come - power densities are going up and up, and water cooling seems to be the only way to deal with it - cf. Nvidia Blackwell racks that use a mind-boggling 120 kW.
https://www.theregister.com/2024/03/21/nvidia_dgx_gb200_nvk72/
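To put 120 kW per rack in perspective, here's a rough sketch (my assumptions: single-phase water cooling with a ~10°C temperature rise across the rack) of the coolant flow a closed loop would have to move, versus the water actually consumed if the same heat were rejected by evaporation:

```python
# Coolant flow needed to carry 120 kW out of one rack in a closed loop,
# versus water evaporated if the same heat is rejected in a cooling tower.
# Assumptions (mine): 10 degC coolant temperature rise, water specific
# heat 4186 J/(kg.K), latent heat of vaporisation 2.26 MJ/kg.

HEAT_LOAD_W = 120_000        # one Blackwell-class rack, per the article
DELTA_T_K = 10               # assumed coolant temperature rise
CP_WATER = 4186              # J/(kg.K)
LATENT_HEAT = 2.26e6         # J/kg

closed_loop_kg_per_s = HEAT_LOAD_W / (CP_WATER * DELTA_T_K)
evaporated_litres_per_hour = HEAT_LOAD_W / LATENT_HEAT * 3600

print(f"Closed loop: ~{closed_loop_kg_per_s:.1f} kg/s recirculated (nothing consumed)")
print(f"Evaporative: ~{evaporated_litres_per_hour:.0f} L/hour actually lost")
```

Roughly 3 litres a second has to circulate through one rack, but in a closed loop it's the same 3 litres going round and round; only evaporative rejection actually consumes water.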
<blockquote>Do datacentres not do that? Are they just piping it straight back out again, because it's easier and cheaper? Or perhaps they *do* in fact reuse it, but the increase in datacentre water cooling just means more use in general.</blockquote>
Sounds like it, or evaporative cooling that leads to loss. So there are multiple possibilities. If water's being dumped down the drains, there's the question of how it could be recycled; DCs will be charged for water used, and may be charged for waste water too. Or whether DCs should build cooling ponds and recirculate that water, but that needs land for the ponds. For whatever reason, it seems that evaporating is cheaper than using closed-loop cooling systems. If those were used, less water would be needed.
Stating a percentage of power used by data centers is a useless figure without further information. Sure, data centers consume a lot of power and use a lot of water, and we always see scary data about that. I question how much is offset by reducing usage elsewhere, making it a zero-sum game. We never see that data or analysis.
How much power is NOT being used when...
- A local on-prem data center is migrated to a mega cloud-data center?
- Could the case be made that the economies of scale make the mega cloud-DC more efficient than individual on-prem data centers?
- Data centers are the evolution of manual human hands-on processes. How much more efficient are DCs versus humans at desks (using floor space, air-conditioning, flushing toilets, utilities, etc)?
Besides, what else are we going to do? Stop, no more data centers, we are done. Societal progress is now halted. No more new apps on your phone, no more instant communications. No more recommendations on the best sushi restaurant near me.....
<blockquote>Stating a percentage of power used by data centers is a useless figure without further information. Sure, data centers consume a lot of power and use a lot of water, and we always see scary data about that. I question how much is offset by reducing usage elsewhere, making it a zero-sum game. We never see that data or analysis.</blockquote>
I think it'd be a very complicated analysis, plus an attempt to normalise the elements. So how much energy & water is being used for, say, crypto mining vs, say, gold mining, especially when gold has far more utility than just its raw monetary value. Or there's the AI tulip. How much of that is useful work, and how much is just converting energy & water into hot air?
It may be possible to compare the costs of an AI chatbot vs a human, but it's harder to calculate for AI replacing humans. So a business may need 1,000 chatbots man.. personing support lines because their product sucks and needs a lot of support. But if AI can be used to develop a better product, it won't need as many bots or humans. If it can optimise production, that may mean fewer resources needed to make stuff, and less waste.