So they are suggesting 1 rack of servers uses far more than 1kW of power just for the fans? So much more that the energy cost of the liquid cooling apparatus is covered by whatever is in excess of 1kW?
Iceotope: No need to switch servers to swap air-cooled for liquid-cooled
Liquid cooling specialist Iceotope claims its latest system allows customers to easily convert existing air-cooled servers to use its liquid cooling with just a few minor modifications. Iceotope’s Ku:l Data Center chassis-level cooling technology has been developed in partnership with Intel and HPE, the company said, when it …
COMMENTS
-
-
Thursday 30th June 2022 20:12 GMT Anonymous Coward
This is just the IT electricity savings. The savings at facility level can become huge because you don't require air conditioning any more. You can take 40C water to the IT racks and cool it with a basic radiator outside, even in high ambient temperatures. Unlike with air-cooled servers, the data center doesn't need to condition the facility air (e.g. with adiabatic coolers), so it uses zero water (WUE = 0).
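The facility-level point above can be put into rough numbers. This is a back-of-envelope sketch with assumed figures (the IT load and PUE values below are illustrative, not from the article):

```python
# Rough facility-level comparison. All figures are assumptions for illustration.
it_load_kw = 1000.0   # assumed IT load of the data hall

pue_air = 1.5         # assumed PUE for an air-cooled facility with chillers
pue_liquid = 1.1      # assumed PUE for warm-water liquid cooling with dry coolers

facility_air_kw = it_load_kw * pue_air        # total draw including cooling overhead
facility_liquid_kw = it_load_kw * pue_liquid
savings_kw = facility_air_kw - facility_liquid_kw

print(f"Air-cooled facility draw:    {facility_air_kw:.0f} kW")
print(f"Liquid-cooled facility draw: {facility_liquid_kw:.0f} kW")
print(f"Facility-level savings:      {savings_kw:.0f} kW")
```

With these assumed figures the cooling overhead drops from 500 kW to 100 kW, which is why the commenter calls the facility-level savings "huge" compared to the per-rack fan savings alone.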
-
-
Thursday 30th June 2022 17:03 GMT Anonymous Coward
Not the fans, the cooling
In a closed environment the air-cooled systems were ~5% less efficient, so going with the alternate cooling method saved about 1kW per full rack versus the alternative. It ain't a lot but it ain't nothing neither, and these savings stack up across a big data center. Not bad for a drop-in mod to a case that isn't optimized for liquid cooling. Better to design an optimized chassis that uses the same guts, though. Still, it makes sense as a way to get their foot in the door.
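The arithmetic behind the ~5% / 1kW figure is easy to check. Assuming a rack drawing around 20 kW (an assumption; the comment doesn't give the rack load), the numbers line up:

```python
# Per-rack savings implied by the comment's ~5% figure.
rack_power_kw = 20.0        # assumed IT draw of one full rack
inefficiency = 0.05         # ~5% overhead attributed to air cooling (from the comment)

savings_per_rack_kw = rack_power_kw * inefficiency  # matches the 1 kW cited

racks = 500                 # assumed size of a "big data center"
total_savings_kw = savings_per_rack_kw * racks

print(f"{savings_per_rack_kw:.1f} kW per rack, {total_savings_kw:.0f} kW across the floor")
```

So a 5% gain on a ~20 kW rack is the cited 1 kW, and across hundreds of racks it becomes a meaningful chunk of facility power.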
Air-cooled computer systems are notoriously prone to leaking exhaust air out of all their cracks and inhaling their own hot exhaust, which kills the thermal efficiency.
Better to use liquid cooling and reject that heat into a closed system where you can dump it outside the cooled zone. The laws of thermodynamics make re-cooling hot exhaust air a sucker's game.
-
-
Thursday 30th June 2022 20:12 GMT Anonymous Coward
Re: And how much extra effort
Zero extra effort to replace a DIMM, CPU, GPU, or otherwise. This is not the same as most commonly known tank immersion cooling where servers require a lift to remove the chassis and a trolley to service. In this model, you just pull the server out on rails, and remove the lid to access the server. It's exactly the same as air cooled servers. I am talking from experience using this system.
-
Thursday 30th June 2022 22:46 GMT Anonymous Coward
Re: And how much extra effort
I haven't swapped out anything in a server for years, only front or rear swappable drives. You must be doing something wrong at purchase or install time. Some of the servers have been in the same rack for 7/8 years.
Also check server spec sheets or ASHRAE guidelines: systems have been 40-degree capable for years, and DCs haven't needed to be so cool for a long time.
-
-
-
Thursday 30th June 2022 21:08 GMT Anonymous Coward
Re: Water and Electricity in a Datacenter
Do you remember that episode of Tomorrow's World where they shoved a CRT TV into a tank of liquid, and it still worked?
It was some inert liquid that was said to be about to revolutionise the cooling industry. I wonder what happened to that?
Googling, I find: https://www.3m.co.uk/3M/en_GB/novec-uk/applications/immersion-cooling/
and a recent video demonstration: https://www.youtube.com/watch?v=YyKIZPuepl8
-
Thursday 30th June 2022 21:41 GMT martinusher
Re: Water and Electricity in a Datacenter
Deionized (pure) water will work fine. We used it in the early 1980s for cooling high-power transmitter amplifiers. The devices were 'hot' -- at around 10kV -- so the water was required to both cool and insulate.
I'd guess any magic would be to systematically decontaminate the coolant as it circulates, and to use a two-stage heat exchanger to keep the volume of primary coolant relatively small.
The advantage of liquid cooling is that you won't get hot spots, so the equipment is likely to be a lot more reliable than with traditional air cooling. No fan noise is a bonus.
-
-
-
-
Friday 1st July 2022 05:27 GMT Pascal Monett
You can deactivate that in the BIOS. Of course, you need to be sure your cooling system works.
I did liquid cooling way back when I had my first AMD Athlon XP 1600+. I bought an aquarium pump (because silence), the special CPU connector, a humongous radiator and the tubing and miscellaneous connectors that were necessary.
I set it all up, turned the pump on, turned the PC on, and got the No Fan warning. Hunting around in the BIOS, I found that you could disable that warning. Restarted the PC and, from that point on, got the most silent computing experience of my life, with all the performance as a bonus.
Liquid cooling today is widespread. It's supported on motherboards by default (though not for DIMMs), graphics cards use it, and liquid cooling modules for all versions of CPUs are commonplace.
It's a bit more noisy than it used to be, but it's still better than having an air-cooled system.
-
Wednesday 27th July 2022 19:16 GMT Anonymous Coward
Hi,
The system pumps fresh, cool dielectric to each component on the server as required, in a precise, engineered way that keeps each component within its design specification. Each server is individually flashed with new firmware to remove errors related to fan RPM. Worth mentioning that the pumps provide similar data to fans (RPM etc.), so the health of the system is measured very precisely and can all be monitored remotely. I am talking from experience designing and using this system.
-