Iceotope: No need to switch servers to swap air-cooled for liquid-cooled

Liquid cooling specialist Iceotope claims its latest system allows customers to easily convert existing air-cooled servers to use its liquid cooling with just a few minor modifications. Iceotope’s Ku:l Data Center chassis-level cooling technology has been developed in partnership with Intel and HPE, the company said, when it …

  1. Brian 3

    So they are suggesting one rack of servers uses far more than 1kW of power just for the fans? So much more that the energy cost of the liquid cooling apparatus is covered by whatever is in excess of 1kW?

    1. Anonymous Coward
      Anonymous Coward

      This is just the IT electricity savings. The saving at facility level can become huge because you don't require air conditioning any more. You can take 40°C water to the IT racks and cool it with a basic radiator outdoors, even at high ambient temperatures. Unlike with air-cooled servers, the data center doesn't need to condition the air to the facility (e.g. with adiabatic coolers), so it uses zero water (WUE = 0).
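
      A rough back-of-the-envelope sketch of that facility-level point, using assumed figures (the IT load and PUE numbers below are illustrative guesses, not from the article or the comment):

          # Rough illustration of the facility-level saving (all figures are assumptions).
          # PUE = total facility power / IT power, so overhead power = IT_load * (PUE - 1).

          it_load_kw = 1000.0    # assumed IT load of a small data hall, in kW
          pue_air = 1.5          # assumed PUE with conventional air conditioning
          pue_liquid = 1.1       # assumed PUE with warm-water liquid cooling and dry coolers

          overhead_air_kw = it_load_kw * (pue_air - 1.0)        # cooling and other overhead
          overhead_liquid_kw = it_load_kw * (pue_liquid - 1.0)

          print(f"Assumed overhead, air cooled:    {overhead_air_kw:.0f} kW")
          print(f"Assumed overhead, liquid cooled: {overhead_liquid_kw:.0f} kW")
          print(f"Illustrative facility saving:    {overhead_air_kw - overhead_liquid_kw:.0f} kW")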

  2. Anonymous Coward
    Anonymous Coward

    Not the fans, the cooling

    In a closed environment the air-cooled systems were ~5% less efficient, so by going with the alternative cooling method the savings for a full rack came to about 1kW (see the rough arithmetic sketched after this comment). It ain't a lot but it ain't nothing neither, and these things stack up across a big data center. Not bad for a drop-in mod for a case that is not optimized for liquid cooling. Better to design an optimized chassis that uses the same guts, though. Still, it makes sense to get their foot in the door.

    Air-cooled computer systems are notoriously prone to leaking exhaust air out of all their cracks and inhaling their own hot exhaust, which kills the thermal efficiency.

    Better to use liquid cooling and reject that heat into a closed system where you can dump it outside the cooled zone. The laws of thermodynamics make re-cooling hot exhaust air a sucker's game.
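
    A minimal sketch of the arithmetic referred to above, with an assumed rack load (the 20 kW figure is a guess for illustration; only the ~5% fan overhead comes from the comment):

        rack_it_load_kw = 20.0        # assumed IT load of a fully populated rack, in kW
        fan_overhead_fraction = 0.05  # ~5% of server power spent on internal fans, per the comment

        fan_power_kw = rack_it_load_kw * fan_overhead_fraction
        print(f"Assumed fan overhead per rack: {fan_power_kw:.1f} kW")  # roughly the 1 kW quoted above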

  3. DS999 Silver badge

    And how much extra effort

    Is required for normal maintenance like replacing a bad DIMM? That's the gotcha with immersion cooling.

    1. Anonymous Coward
      Anonymous Coward

      Re: And how much extra effort

      Zero extra effort to replace a DIMM, CPU, GPU, or otherwise. This is not the same as the more commonly known tank immersion cooling, where servers require a lift to remove the chassis and a trolley to service them. In this model, you just pull the server out on rails and remove the lid to access it. It's exactly the same as with air-cooled servers. I am talking from experience using this system.

      1. DS999 Silver badge

        Re: And how much extra effort

        Is there a video showing this sort of thing somewhere?

    2. Anonymous Coward
      Anonymous Coward

      Re: And how much extra effort

      I haven't swapped out anything in a server for years, other than front- or rear-swappable drives. You must be doing something wrong at purchase or install time. Some of the servers have been in the same rack for 7-8 years.

      Also, check server spec sheets or the ASHRAE guidelines: systems have been 40°C-capable for years, and DCs haven't needed to be kept so cool for a long time.

      1. DS999 Silver badge

        Re: And how much extra effort

        Either you are super lucky, or you don't track corrected ECC errors, or if you do, you don't believe those are a reason to replace a DIMM.

        1. TheWeetabix

          Re: And how much extra effort

          @_@ do you have a lot of ECC errors on a regular basis?

          1. DS999 Silver badge

            Re: And how much extra effort

            If you have enough servers with enough RAM in them you always have ECC errors.
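
            For anyone wanting to check their own fleet, a minimal sketch of reading corrected-error counts from the Linux EDAC sysfs interface (this assumes a standard EDAC-enabled kernel; it is not specific to any vendor tooling):

                import glob

                # Read corrected ECC error counts per memory controller from the
                # standard EDAC sysfs layout (assumes an EDAC-enabled kernel).
                for path in sorted(glob.glob("/sys/devices/system/edac/mc/mc*/ce_count")):
                    with open(path) as f:
                        count = int(f.read().strip())
                    controller = path.split("/")[-2]
                    status = "corrected ECC errors logged" if count else "no corrected errors"
                    print(f"{controller}: {count} ({status})")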

      2. Arbuthnot the Magnificent

        Re: And how much extra effort

        You don't work in HPC, then; I'm forever dealing with failed memory and mainboards.

  4. Clausewitz4.0 Bronze badge
    Devil

    Water and Electricity in a Datacenter

    Call me old school, but I am not a fan of mixing water, electricity and datacenter in the same sentence/room.

    1. Anonymous Coward
      Anonymous Coward

      Re: Water and Electricity in a Datacenter

      Do you remember that episode of Tomorrow's World where they shoved a CRT TV into a tank of liquid, and it still worked?

      It was some inert liquid that they said would revolutionise the cooling industry. I wonder what happened to that?

      Googling, I find: https://www.3m.co.uk/3M/en_GB/novec-uk/applications/immersion-cooling/

      and a recent video demonstration: https://www.youtube.com/watch?v=YyKIZPuepl8

      1. martinusher Silver badge

        Re: Water and Electricity in a Datacenter

        De-ionized (pure) water will work fine. We used it in the early 1980s for cooling high power transmitter amplifiers. The devices were 'hot' -- at around 10kV -- so the water was required to both cool and insulate.

        I'd guess any magic would be to systematically decontaminate the coolant as it circulated and to use a two-stage heat exchanger that would keep the volume of primary coolant relatively small.

        The advantage of liquid cooling is that you won't get hot spots, so the equipment is likely to be a lot more reliable than with traditional air cooling. No fan noise is a bonus.
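
        As a rough sense of scale, a minimal sketch of the coolant flow needed to carry a given heat load, from Q = m_dot * c_p * delta_T (the 10 kW chassis load and 10 °C temperature rise are assumptions for illustration):

            heat_load_kw = 10.0   # assumed heat to remove from one chassis, in kW
            delta_t_c = 10.0      # assumed coolant temperature rise across the loop, in deg C
            cp_water = 4.18       # specific heat of water, kJ/(kg*K)

            mass_flow_kg_s = heat_load_kw / (cp_water * delta_t_c)
            litres_per_min = mass_flow_kg_s * 60.0  # roughly 1 litre per kg for water

            print(f"Required flow: {mass_flow_kg_s:.2f} kg/s (~{litres_per_min:.0f} L/min)")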

  5. Ribfeast

    I'm guessing that if it simply replaces the fans and mounts onto the CPU itself, wouldn't everything else in the case still overheat? RAID controllers, RAM, drives, etc.

    Plus wouldn't the system throw up faults as it is seeing 0 RPM from the fans?

    1. Pascal Monett Silver badge

      You can deactivate that in the BIOS. Of course, you need to be sure your cooling system works.

      I did liquid cooling way back when I had my first AMD Athlon XP 1600+. I bought an aquarium pump (because silence), the special CPU connector, a humongous radiator and the tubing and miscellaneous connectors that were necessary.

      I set it all up, turned the pump on, turned the PC on, and got the No Fan warning. Hunting around in the BIOS, I found that you could disable that warning. Restarted the PC and, from that point on, I got the most silent computing experience of my life, and all the performance as a bonus.

      Liquid cooling today is widespread. It's on motherboards by default (though not for DIMMs), graphics cards all use it, and liquid cooling modules for all versions of CPUs are commonplace.

      It's a bit more noisy than it used to be, but it's still better than having an air-cooled system.

    2. Anonymous Coward
      Anonymous Coward

      Hi,

      The system pumps fresh, cool dielectric to each component on the server as required, in a precise, engineered way, to keep the component within design specification. Each server is individually flashed with new firmware to remove errors related to fan RPM. Worth also mentioning that the pumps used provide similar data to fans (RPM, etc.). The health of the system is measured very precisely and can all be monitored remotely. I am talking from experience designing and using the system.
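
      A minimal sketch of how that kind of remote tach monitoring might look from the outside, using ipmitool against a BMC (the host, credentials, and sensor output shown are hypothetical; real sensor labels vary by vendor and by how a retrofit reports its pumps):

          import subprocess

          def read_tach_sensors(host, user, password):
              # Query all "Fan"-type sensor records from the BMC; pumps that report
              # tach data the same way as fans will show up here too (vendor-dependent).
              out = subprocess.run(
                  ["ipmitool", "-I", "lanplus", "-H", host, "-U", user, "-P", password,
                   "sdr", "type", "Fan"],
                  capture_output=True, text=True, check=True,
              ).stdout
              readings = {}
              for line in out.splitlines():
                  # Typical line format: "FAN1 | 41h | ok | 29.1 | 4200 RPM"
                  fields = [f.strip() for f in line.split("|")]
                  if len(fields) >= 5:
                      readings[fields[0]] = fields[4]
              return readings

          # Hypothetical usage:
          # print(read_tach_sensors("10.0.0.42", "admin", "secret"))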
