As liquid cooling takes off in the datacenter, fortune favors the brave

Hype around liquid and immersion cooling has reached a fever pitch in recent months, and it appears that the colocation datacenter market is ready to get in on the action. Recently, Zachary Smith, Equinix's global head of edge infrastructure services, told The Register that the outfit would like to offer customers more liquid …

  1. Gene Cash Silver badge

    isn't a niche tech suited only for exotic supercomputing applications

    Well, no. It's also a tech suited for keeping your computer from being a horribly noisy monster. I've watercooled my PC (and installed SSDs) for the last 4 years to avoid going deaf.

    D5 pumps are incredibly quiet, especially compared to a Seagate drive.

    I've only been down once to replace a failed pump. I had a spare, of course.

  2. captain veg Silver badge

    I'm not against...

    So far as I'm concerned data centres can make as much heat as they like, so long as it's used productively. Which mostly means heating people's homes.

    Data centres simply chucking out heat into the atmosphere is unconscionable. Without qualification as to where these 600W parts are chucking their entropy, I can't judge their worth.

    -A.

    1. Palebushman
      Boffin

      Re: I'm not against...

      You are absobloodylutely on the money, captain veg! That sort of system has been used to great effect by huge power stations and similar industries, so why not the 'Cloud' farmers too?

    2. MachDiamond Silver badge

      Re: I'm not against...

      "Which mostly means heating people's homes."

      That would mean a vast amount of infrastructure to pipe that heat to those homes. It would be simpler to locate a business nearby that uses a lot of heat in their processes and sell to them.

      1. captain veg Silver badge

        Re: I'm not against...

        Insulated pipes are being buried right now under the road that goes past here, which I assume come from the CHP plant they built a few years ago. It's hardly rocket science.

        I suppose it helps that in this part of the world most people live in apartments.

        -A.

        1. Dimmer Silver badge

          Re: I'm not against...

          When your datacenter is in an office building, it is easy to vent the hot aisle into the building when it is cold out, and vent it outside when it is hot. We do. It is so nice when you just need to run the intake fans to cool the datacenter. Pay close attention to the humidity, though.
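
          A minimal sketch of the sort of decision logic described above, assuming made-up setpoints and sensor names rather than any real building-management product: vent the hot aisle into the building when it is cold outside, vent it outdoors otherwise, and rely on intake-fan cooling only when the outside air is cool and dry enough.

          ```python
          # Hypothetical air-side "free cooling" plan: all setpoints and field
          # names below are illustrative assumptions, not a standard.
          from dataclasses import dataclass

          @dataclass
          class Conditions:
              outdoor_temp_c: float    # outside air temperature
              outdoor_rh_pct: float    # outside relative humidity
              hot_aisle_temp_c: float  # hot-aisle exhaust temperature

          def damper_plan(c: Conditions,
                          heat_building_below_c: float = 12.0,
                          intake_cooling_below_c: float = 20.0,
                          max_intake_rh_pct: float = 70.0) -> dict:
              """Return a simple damper/fan plan for the three cases above."""
              plan = {"exhaust_to": "outside", "run_intake_fans": False, "run_crac": True}

              if c.outdoor_temp_c < heat_building_below_c:
                  # Cold outside: vent the hot aisle into the office space instead.
                  plan["exhaust_to"] = "building"

              if (c.outdoor_temp_c < intake_cooling_below_c
                      and c.outdoor_rh_pct <= max_intake_rh_pct):
                  # Mild and dry enough: intake fans alone can cool the data centre.
                  plan["run_intake_fans"] = True
                  plan["run_crac"] = False  # skip mechanical cooling entirely

              return plan

          print(damper_plan(Conditions(outdoor_temp_c=5, outdoor_rh_pct=60, hot_aisle_temp_c=38)))
          # {'exhaust_to': 'building', 'run_intake_fans': True, 'run_crac': False}
          ```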

          If you really want to save energy, get rid of spinning rust. Massive savings.

          If we can just stop the manufacturers from requiring us to buy their overpriced last-generation SSDs, and let us fit the current generation we actually want, it will be cost effective as well.

          1. captain veg Silver badge

            Re: I'm not against...

            I swapped out spinning rust some time ago, and can't imagine a scenario in which I would recommend anyone do otherwise.

            That's a different thing from using your excess heat for productive purposes.

            -A.

      2. Paul Hovnanian Silver badge

        Re: I'm not against...

        "Which mostly means heating people's homes."

        Not just homes. With a bit of forethought, the waste heat can be put to use in a lot of other places: greenhouses, for instance, or habitats for warmth-loving species like manatees. While that example is just an unforeseen byproduct of a power plant's output, with a bit of planning a much more suitable environment could have been engineered. As it is, it keeps the poor things from freezing to death as the local ocean currents cool off.

    3. Anonymous Coward
      Anonymous Coward

      Re: I'm not against...

      Agreed, apart from the output being mostly low-grade heat, and therefore difficult to do much useful with.

      I've looked at various strategies for heat recovery, and it is not at all straightforward no matter the heat source. The Crossrail Learning Legacy has a great white paper on a system proposed and designed for heat recovery at train braking points. It wasn't built, but it is a very enlightening discussion of why heat recovery is often not very cost effective.

      1. captain veg Silver badge

        Re: I'm not against...

        Indulge me.

        Did your analysis look at simply piping low-grade heat into people's homes?

        -A.

  3. WolfFan

    Zachary Smith?

    Would that be _Doctor_ Zachary Smith?

    All together now… Danger, Will Robinson, danger!

    Sigh. Showing my age.

  4. Neil Barnes Silver badge
    Coat

    To be honest

    I'm looking forward to the day when scuba qualifications will be an essential part of the data engineer's CV.

  5. NeilPost

    Drains ??

    Mixing electrical systems and plumbing… what's the worst that can happen!?

    Having had AC units spring a leak in a server room, the plumbing for liquid cooling needs to be bulletproof, or some thought needs to go into what to do when failures/leaks happen. Some floor drains like an abattoir? Though in a multi-floor DC this would be a prohibitive cost, and also a prohibitive risk.

    Also fail-safe: with this you can't shut down non-essential kit and merely chuck the server room door open and put some fans/portable AC in. On failure your only option is OFF.

    I love the liquid cooling failure scene in Danny Boyle's sci-fi film Sunshine.

    1. Potty Professor
      Boffin

      Re: Drains ??

      I worked on the design of the cooling system for the Type 45 Destroyer, now called the Duncan Class. It consisted of de-ionised water circulating through aluminium heatsinks for the IGBTs, air circulating around the outside of the devices, and Midel circulating through the heavy current copper busbars and transformer windings. All three systems dumped their heat into a salt water secondary system that could either be discharged overboard for normal running, or stored aboard to reduce the thermal wake if running silent. I don't remember the thermal loads, but I have them somewhere in my archives, must look them out sometime.

  6. mikepren
    Holmes

    the more they change

    Takes me back to 1983-ish and the IBM 3033/3081.

    The local IBM CE was given a mounted fire axe after he used it to open a hole in a locked outside door when the water cooling sprang a catastrophic leak.

  7. Lopan

    I worked with a Cray-1 supercomputer long ago. It was Freon cooled (liquid). Liquid cooling is not all that new.

  8. MachDiamond Silver badge

    Heat is a marketable product

    Waste heat can be a revenue stream if a data center plans well and has a partner next door. To make liquid cooling work, motherboards need to be configured so heat-generating components can be thermally coupled to a heat exchanger. The connectors used to hook up each server in a rack can be specific to the data center if they like, provided the heat spreader carries a standard pipe thread the connector can be screwed into. There are lots of different self-closing liquid connectors that could be used. I have one type on my magnetizer to cool the magnetizing coil with a glycol-based coolant, and others on my hydraulic pump so I can plug in different tools. I might get a drip, but both sides seal when disconnected.

    Limiting the number of connections and using properly tested connectors and hoses will be important. Go cheap and suffer the consequences. A few backed-up pumps will likely consume much less power than hundreds of box fans and blowers.
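
    A back-of-the-envelope check on that last claim, using assumed round numbers rather than measurements (and ignoring the dry-cooler fans a liquid loop would still need outside):

    ```python
    # Assumed figures: a D5-class circulation pump draws roughly 20-40 W, while a
    # 1U server commonly carries several 10-20 W high-speed fans, with CRAH
    # blowers on top for the room. None of these are measured values.
    racks = 20
    servers_per_rack = 40
    fans_per_server = 6
    watts_per_fan = 15            # assumed average for high-RPM 40 mm server fans
    crah_blower_watts = 4 * 3000  # assumed: four air handlers at ~3 kW each

    fan_total_w = racks * servers_per_rack * fans_per_server * watts_per_fan
    air_total_w = fan_total_w + crah_blower_watts

    pumps = 2 * racks             # assumed: one pump per rack plus a redundant spare
    watts_per_pump = 35
    pump_total_w = pumps * watts_per_pump

    print(f"air moving power : {air_total_w / 1000:.1f} kW")   # 84.0 kW
    print(f"pump power       : {pump_total_w / 1000:.1f} kW")  # 1.4 kW
    ```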

    1. Roland6 Silver badge

      Re: Heat is a marketable product

      > Limiting the number of connections and using properly tested connectors and hoses will be important.

      A likely candidate would be the Speedfit system - proven over many years in millions of homes and with people skilled in installation...

      But being IT expect the need to reinvent the wheel and gold-plate it...

      1. MachDiamond Silver badge

        Re: Heat is a marketable product

        "But being IT expect the need to reinvent the wheel and gold-plate it..."

        That's the take I had on the article. I don't see that having any sort of standard for the couplings really matters if the cooling loops on the server are provided with bog-standard pipe thread ports, so whatever the designer chooses just screws in. All of my pneumatic tools came without fittings and I just screwed in the part that matches my air hoses. Having a choice can mean lots of options.

  9. Dr Dan Holdsworth
    Boffin

    As I see it, the biggest technical hurdle is not the actual cooling, but the liquid handling. I think that this could be simplified by running conventional heat pipes from the heat-producing chips to a dedicated heat exchanger block at the back of each machine.

    That design would mean only two plumbing attachments per machine, and those could be much more tolerant of higher pressures of coolant, plus you lose the multiple coolant connectors inside the machine chassis.

    1. Dimmer Silver badge

      two plumbing attachments per machine

      Most servers have two or more power supplies and more than one fan. For me to even consider it, it must have enough redundancy that I can service it without shutting down the server. Redundant from the outdoors to the processor.

      I am interested if they can get past that.

      1. Roland6 Silver badge

        Re: two plumbing attachments per machine

        >Redundant from the outdoors to the processor.

        I think you will find those dual power supplies are only redundant outside of the chassis - the motherboard/CPU only has one power line...

        Hence designing a liquid cooling system with the same level of redundancy would be relatively simple.

        But then, if your server blades are correctly configured, pulling one out will simply result in the load being transferred to another physical server in another cabinet/datacenter. We were doing this with online transaction processing systems 20-plus years ago; depending on what they were doing, users saw at most a "hiccup" in service.

  10. ricardian

    Microsoft used the cool seawater around Orkney in a two year test 2018-20 of a submerged cylinder full of servers. https://www.bbc.co.uk/news/technology-54146718

    1. Ashto5

      Ocean warming

      Cool that’s what the ocean. Weds right now a giant heater dropped in it.

      MS may not have thought this through.

      1. J. R. Hartley

        Re: Ocean warming

        @Ashto5

        In your enthusiasm, you've typed that too quick, haven't you.

    2. MachDiamond Silver badge

      "Microsoft used the cool seawater around Orkney in a two year test 2018-20 of a submerged cylinder full of servers. "

      That's just chucking the heat overboard. It's much more impressive to have ideas that use that energy source for something useful.

  11. Paul 87

    There must also come a point where it makes sense to fit a power generation or building heating system into the cooling loop, particularly if you don't just balance the waste heat across the datacentre but combine it with the entire building (with emergency area isolation!).

    After all, we can use heat energy in a variety of ways, and convert it back to electricity, especially if there's a fluid medium involved.

    1. MachDiamond Silver badge

      "After all, we can use heat energy in a variety of ways, and convert it back to electricity, especially if there's a fluid medium involved."

      The temperature differential between the cooling output of the data center and ambient wouldn't allow very efficient power generation. This is why most nuclear reactors that use water pressurize the system: it raises the boiling point of the water so there is a greater temperature differential. All of that was worked out way back at the dawn of steam. I had a nice technical discussion about the sorts of efficiencies that might be possible using RTGs on the Moon with an engineer from Teledyne at a JPL open house years ago. I was very interested in a system based on RTGs, partly for the co-gen possibilities.
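
      A rough illustration of that point, using the Carnot limit with assumed temperatures; real heat engines recover only a fraction of these upper bounds:

      ```python
      # Carnot limit: the best possible fraction of heat convertible to work
      # for a given hot-side and cold-side temperature (in kelvin).
      def carnot_limit(t_hot_c: float, t_cold_c: float) -> float:
          t_hot_k = t_hot_c + 273.15
          t_cold_k = t_cold_c + 273.15
          return 1.0 - t_cold_k / t_hot_k

      # Data-centre liquid-cooling return water vs. a mild ambient day (assumed values).
      print(f"DC waste heat 45C vs 25C ambient : {carnot_limit(45, 25):.1%}")   # ~6%

      # Pressurised-water reactor secondary steam vs. the same ambient (assumed values).
      print(f"PWR steam    285C vs 25C ambient : {carnot_limit(285, 25):.1%}")  # ~47%
      ```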

  12. Anonymous Coward
    Anonymous Coward

    There were demos in Taiwan recently of immersion cooling using 3M Novec.

    It's a material with some interesting and useful properties; however, filling equipment with a fluid that carries significant chemical and/or biological risks won't be for everyone.

    Mineral oil immersion is probably the least hazardous route, apart from making it much harder to service equipment.

    With energy costs as they are, there is a strong motive for data centres to cut the LVAC bill, so I would expect this to become the norm.

    1. oldgreyguy

      It would be hard to keep the oil out of the floppy drive

  13. bernmeister

    Horses for Circuits

    Immersion coolant can take many forms. I have experience of immersion cooling of PSUs. Very successful: just remove the fans, wash off the heatsink compound and dump the supplies in coolant. Not really that simple, but that's the principle. Some components are not compatible with the coolant and need replacing with a different type. Eventually an extremely effective system was the result. The supplies were used to power Bitcoin mining systems, so cost effectiveness was essential.

  14. Henry Wertz 1 Gold badge

    Leaks

    I think an important consideration, if this is done at scale, is what happens when there are water leaks. I'd want to arrange lines and coolers so that if they leak, they drip onto the floor instead of into switches, power distribution, etc., and you'd want some way to handle liquid that does get on the floor (which might just be letting it evaporate if it's under a rack; you'd just have to make sure there's no cabling running where it could end up immersed). I'd also like to make sure, if the worst happened and there was an internal leak, that the server or power system would blow a fuse rather than burn up (if it's not possible to use non-conductive coolant). And even if the coolant is non-conductive, it'd be good to have enough monitoring in the system to quickly isolate leaks; it'd be a rude surprise to find out that some server is running fine because the coolant is non-conductive, but a leak has been dripping on it for months and it's half full of liquid.
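
    A minimal sketch of the kind of per-rack leak monitoring being suggested, assuming hypothetical flow and drip sensors and placeholder actuation calls rather than any vendor's API:

    ```python
    # Compare coolant flow into and out of each rack; treat a persistent
    # mismatch (or a tripped drip sensor) as a leak that should isolate the
    # loop and cut power. Thresholds and function names are illustrative.
    LEAK_LPM_THRESHOLD = 0.2   # assumed: flag mismatches above 0.2 litres/minute
    STRIKES_BEFORE_TRIP = 3    # require a few consecutive bad readings

    def check_rack(rack_id: str, supply_lpm: float, return_lpm: float,
                   drip_sensor_wet: bool, strikes: dict) -> None:
        mismatch = supply_lpm - return_lpm
        if drip_sensor_wet or mismatch > LEAK_LPM_THRESHOLD:
            strikes[rack_id] = strikes.get(rack_id, 0) + 1
        else:
            strikes[rack_id] = 0

        if strikes[rack_id] >= STRIKES_BEFORE_TRIP:
            close_isolation_valves(rack_id)   # hypothetical valve actuator call
            cut_rack_power(rack_id)           # hypothetical PDU call
            alert_operators(f"Suspected coolant leak in {rack_id}")

    # Placeholder actuators for the sketch.
    def close_isolation_valves(rack_id): print(f"[{rack_id}] valves closed")
    def cut_rack_power(rack_id): print(f"[{rack_id}] power cut")
    def alert_operators(msg): print("ALERT:", msg)

    strikes = {}
    for reading in [(10.0, 9.7, False), (10.0, 9.6, False), (10.0, 9.5, True)]:
        check_rack("rack-07", *reading, strikes=strikes)   # trips on the third reading
    ```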

    I haven't heard of people having big issues with liquid cooling on gaming PCs. But, that said, there's a big difference between doing it on one system, where the gamer carefully puts it together and probably has one of those cases with the clear side panel so they can look in and see what's going on, and a datacenter where you'd have hundreds and hundreds of machines in racks you can't casually glance inside to see how they look.

  15. Ali Dodd

    Have rear door water cooled DC here

    We have a rear-door water-cooled DC here, and I work in a middling UK university. It's great; the DC is much quieter. Every rack has a hulking rad on it to pull the heat out of the kit. It was put in over 10 years ago, so nothing new; it was thought to be the new big thing in DC cooling yonks ago.

  16. stevebp

    What message is Equinix sending us here?

    Equinix is a massive beast - with a huge amount of sunk cost in traditional cooling techniques in its DCs.

    Putting in immersion cooling is disruptive to its business model because it requires de-installation of traditional cooling and installation of pipework (not actually an obstacle, despite what it says), but it will be costly and Equinix won't have the skills in-house to support it right now.

    https://exuvi8.co.uk/ has designed a proof of concept DC that takes all different types of cooling technologies and demonstrates how they can be implemented side by side in a colo environment. So standardisation may be lacking, but the business model isn't - it's a reluctance to do something different that seems to hold them back.

    What is strange is that the potential gains from a far lower spend on cooling (from lower pPUEs to the removal of large mechanical systems and the freeing up of space for future whitespace use) do not appear to be attractive enough for Equinix. But maybe its competitors will think differently.

    The stronger focus on sustainability reporting encompassed in the EU's CSRD will compel Equinix to think more carefully about how it is perceived when it has to publish its energy consumption and emissions.

  17. prof_peter

    Quoting the folks who designed our 10MW data center, the main reason liquid cooling might be a good idea is that it gives you "high quality heat".

    For obvious reasons it's hard to run air cooling much higher than the 100°F or so that we run hot aisles in our data center, and in my experience even that sucks quite a bit. If you cool your racks like we do, with chilled water from a central A/C system running to heat exchangers in your racks, the return "hot" water is probably going to be something like room temperature, and depending on your climate, for much of the year you're going to need quite a bit of A/C to dump that heat into the outside air. (Some of you might recognize chilled water to distributed heat exchangers as a common office air conditioning setup. Those of you who don't are the ones who never had one of those exchangers go bad right above your colleague's cubicle.)

    In other words, what big air-cooled data centers are doing is generating heat in a server, blowing it out into a (typically enclosed) air space, and then using fans and something that's basically the opposite of a radiator to suck the heat back into a pipe full of water so they can get it back to the central A/C system and get rid of it.

    With liquid cooling you get rid of that horribly clunky intermediate step, and more importantly you can run the water a lot hotter than you can run the air in a room used by humans. If the water coming from your racks is 70C or so, it's really easy to get rid of the heat - all you need to do is pump the water around to an outside radiator or evaporative cooler. (well, with a heat exchanger or two and some other complicated but non-power-sucking stuff, but we don't need to get into that)

    Basically you want the heat to run "downhill" - if you can extract heat in a form that's significantly hotter than the great outdoors, then all you need is a bunch of pumps to get rid of it.
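
    A hedged sketch of that "downhill" argument, with an assumed dry-cooler approach temperature used purely for illustration:

    ```python
    # With 70C return water you can dump heat through a dry cooler whenever the
    # outside air is meaningfully cooler than the water, so mechanical chillers
    # are rarely needed. The approach temperature below is an assumption.
    APPROACH_C = 10.0  # assumed: dry cooler wants water ~10C above ambient to work well

    def cooling_mode(loop_return_c: float, outdoor_c: float) -> str:
        """Pick the cheaper way to reject heat for a given hour."""
        if loop_return_c - outdoor_c >= APPROACH_C:
            return "dry cooler only (pumps + outdoor fans)"
        return "mechanical chiller assist"

    # Liquid-cooled loop at 70C: even a 35C day leaves plenty of headroom.
    print("70C loop, 35C day:", cooling_mode(70, 35))
    # Air-cooled hot aisle returning ~24C chilled water: most days need the chiller.
    print("24C loop, 20C day:", cooling_mode(24, 20))
    ```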

    And that, in a nutshell, is the argument for datacenter water cooling - it's much more efficient at a whole-facility scale, as it gets rid of almost all your air conditioning and turns it into mere water pumping. It might also let you put more watts into a rack because you get rid of that blowing-air-around step in the cooling chain, but that's probably a secondary benefit. Finally, since your existing data center cooling probably involves pumping chilled water around already, you might be able to convert over incrementally, or mix air and water cooling.
