SC09: Mineral oil computing - The coming wave?

OK, sure: liquid can hold and transfer way, way more heat than air – we all know that. But is dropping an entire rack of servers into what looks like an enormous deep fryer the right solution? The answer from the folks at Green Revolution Cooling is a resounding, “Yes!” They claim that using their mineral oil submersion …

COMMENTS

This topic is closed for new posts.
  1. Rob 115
    Coffee/keyboard

    Hi my name is Green... no wait...

    Haha! Couldn't help but think that poor Christiaan there has been standing around too long and has had too much coffee

  2. Anonymous Coward
    Anonymous Coward

    Savings?

    Save cents on electricity, pay pounds on maintenance, when downtime is hours just to change a card because you need to drain the container first, right? Also, how is the heat removed from the oil? You still need a cooling system, with fans, etc., don't you? Lastly, how do plastics like the bath? Is the warranty still valid if you return an oil-soaked part? Many questions and no answers.

  3. Adam Azarchs

    Sounds messy

    If they're literally submerging motherboards in oil, I don't want to be the one who has to swap out a bad DIMM.

  4. phoenix
    Alert

    Nope

    So we need to tack on the cost of bunding to protect the environment from spillage - a bloody great oil-tight wall around the servers. What is the flashpoint of this oil if the coolers pack up and it starts to boil?

    Could use demineralised water (shouldn't carry a current), then you would only get wet feet with spillage. Damned CFCs letting us down.

    1. rciafardone
      Boffin

      Water is not good enough.

      The problem with water is that it will get "dirty" over time and become conductive. It also constantly evaporates at room temperature, so you would need sealed containers to prevent the constant loss, not to mention the increase in humidity. Also, it freezes at 0°C and boils at 100°C. Oil is much more resilient to both environmental extremes, low and high.

  5. Captain TickTock
    Boffin

    Disk Drives?

    I suppose they're ok if oil gets in through the breather holes??

  6. GRCMark

    Company Answers

    Hi everyone, I work for the company and thought I'd answer the questions. The video obviously wasn't done by our company, so it didn't contain an answer to every question people would ask.

    Our system pumps colder coolant into the rack, where flow is directed at each server individually; hot coolant is removed and circulated to a heat exchanger. Our software monitors temperatures at multiple server and rack locations and varies coolant flow and temperature in accordance with cooling needs, to maximize performance and minimize cooling energy. Server fans are replaced by coolant flow, which removes mechanical parts (which break) and eliminates a large component of server power consumption. So: less power and fewer things to break, as mechanical parts (fans and hard drives) and power supplies account for nearly all breakages on a server.
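
    (Purely illustrative, and not GRC's actual control software: a minimal Python sketch of the kind of temperature-driven flow control described above. Every name, threshold and gain below is made up.)

        # Illustrative sketch only; not GRC's software. All names,
        # thresholds and gains here are hypothetical.
        TARGET_TEMP_C = 45.0              # desired board temperature
        MIN_FLOW, MAX_FLOW = 0.2, 1.0     # pump duty-cycle limits
        GAIN = 0.05                       # flow change per degree C of error

        def next_flow(current_flow, sensor_temps_c):
            """Pick the next pump duty cycle from the hottest sensor reading."""
            error = max(sensor_temps_c) - TARGET_TEMP_C   # positive if too hot
            new_flow = current_flow + GAIN * error        # open up flow when hot
            return max(MIN_FLOW, min(MAX_FLOW, new_flow))

        # Example: three sensors on one rack, one server running warm
        print(next_flow(0.5, [41.2, 44.8, 51.3]))         # flow rises above 0.5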

    Other questions:

    Compatibility: The hard plastics are great, although cords (power and Ethernet) are custom made to be compatible.

    Maintenance: Time to remove a server and replace a DIMM is 60 seconds. We demonstrated this literally a hundred times at the conference. The server is easily vertically removed and the very light fluid quickly drains off. This is totally different than other products where each server is individually sealed in a box. You’d be shocked how easy it is. This system was designed for ease of use in a data center environment.

    Net savings: We greatly simplify the data center, in terms of both capital equipment and energy. Net savings are nearly $2,000 for a 300-watt server after taking into account changes in maintenance procedures. No CRACs, no chillers, no fans, no server fans, and much less power infrastructure. It tends to cost as much to build the data center (~$10/watt) as it costs to buy the servers. And obviously less energy, as cooling energy is nearly eliminated while server power is reduced by nearly 20%.
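
    (Again purely illustrative: a back-of-the-envelope Python tally of the energy side of that figure. The electricity price, PUE values and service life are assumptions, not GRC's numbers; the remainder of the claimed ~$2,000 would presumably come from the avoided cooling plant within that ~$10/watt build cost.)

        # Back-of-the-envelope only; the assumptions below (electricity
        # price, PUE, service life) are illustrative, not the company's.
        server_watts = 300
        kwh_price    = 0.10                 # $/kWh, assumed
        years        = 3                    # assumed service life
        hours        = years * 365 * 24

        air_pue      = 1.6                  # assumed typical air-cooled PUE
        oil_pue      = 1.05                 # cooling energy nearly eliminated
        fan_cut      = 0.20                 # ~20% server power saved without fans

        air_kwh = server_watts * air_pue * hours / 1000
        oil_kwh = server_watts * (1 - fan_cut) * oil_pue * hours / 1000

        energy_saving = (air_kwh - oil_kwh) * kwh_price
        print(f"Energy saving over {years} years: ${energy_saving:,.0f}")
        # Avoided cooling capex would make up the rest of the claimed savings.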

    Flashpoint: The oil's flashpoint is roughly 100°C higher than the point at which the computers would shut off from overheating. The MSDS flammability rating is a "1", which is quite low. We have run test servers at 60-65°C with forced convection and had no trouble, thanks to the superior cooling ability.

    Environmental: Mineral oil is drinkable (it's often used as a horse laxative) and is the key ingredient in baby oil.

    Hard drives: Three modifications are made: server fans are removed, hard drives are protected, and the thermal grease is replaced with a more robust solution. Time to modify a computer is under two minutes. Any server from any OEM can be used.

    Anything else, please just ask. And yes, Christiaan did have too much coffee. Visit www.grcooling.com for more information.

  7. danolds
    Alert

    Horse laxative?

    I wish I'd known about the horse laxative angle when I was writing the story - I would definitely have included it in the text and maybe even added some video of a horse to drive the point home. As an IT analyst and tech industry consultant, I would strongly advise staying away from any messaging that explicitly mentions "horse laxative"...

  8. Anonymous Coward
    Alert

    Not Really New

    So we have to tip all our racks 90 degrees? That is going to take up a lot of floor space.

    Wait, I see. In order to save all this money on cooling, we have to replace all our gear with submersible servers.

    BTW -

    Immersing equipment has been done for years in military and communications applications (not to mention large power-line transformers).

    1. GRCMark

      Response to Coward

      Coward-

      We can use any off-the-rack server with 60 seconds of mods, including removing the fans and fitting our protective covering over the hard drives. So you're not replacing equipment. However, your cooling equipment will all be obsolete.

      Second, we save space. For one, there are no hot/cold aisles, no CRACs or CRAHs, no raised floors. Think about it: a data center is mostly empty space. Our racks are placed back to back, while standard racks need an aisle between each row for air circulation.

      For another, you can fill your rack with whatever power density you want. Net/net, you should save space.

      And you're right, using dielectric fluid submersion to cool and protect electronics has been done before. However, we have made novel changes to make it work cheaply for computer servers in a data center.
