
Today we rent servers in "the cloud". If liquid-cooled servers become common, does that mean we'd be renting servers in "the puddle"?
An enterprise immersion cooling company has received a $28 million investment it plans to use to sink itself into additional customer datacenters. Green Revolution Cooling, one of the better known datacenter immersion cooling firms in the market, said the cash influx from South Korean company SK Lubricants would be spent on …
I recently worked for a decade in a fairly large data center (large considering it was owned by a small/medium-sized financial firm)...Lots of screaming fans from 1U & 2U Xeon- and Epyc-based servers...A few years ago we "stuck our toe in the water" and tried out GRC immersion cooling. Yep, that data hall was much quieter, but as a data center tech one has to balance that against the slippery/slimy/clothes-staining nature of gear covered in mineral oil (said oil will never come fully off, no matter how long you let a server drain.)
One issue I continue to wonder about for my comrades left behind in the "server pool" is how these oil-soaked servers will ever be recycled. Who will want them?
I had no issues getting batches of air-cooled servers recycled for a modest profit, once a year.
...Not to mention oil wicking out of the vats via the twinax cables to the switches. Hmmm.
This would be more attractive for cloud companies, which after the initial installation of systems never touch them again: fully lights-out operation. They simply allow things to "fail in place". They are most concerned with getting the maximum performance per rack, so anything that helps equalize the heat of all components in the rack would be very useful.
The biggest problem I see is weight. A rack filled with mineral oil - essentially a 42U aquarium - would be extremely heavy. Obviously a no-go for a raised floor, but you probably wouldn't want that anyway, as a leak would be an even bigger problem than on a slab. You could build a system to pipe the coolant out the top of the rack, cycle cooled coolant in, and let convection equalize it. You could run the whole thing at 60C or higher, still well within the limits for the chips but greatly reducing your cooling bill.
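For a rough sense of "extremely heavy", here's a back-of-envelope sketch of the oil mass alone. The rack dimensions, fill fraction, and oil density are my assumptions for illustration, not GRC specs:

```python
# Back-of-envelope oil weight for a 42U "aquarium".
# All figures below are assumptions, not vendor specs.

RACK_WIDTH_M = 0.6          # assumed interior width of a 19" rack enclosure
RACK_DEPTH_M = 1.0          # assumed vat depth
U_HEIGHT_M = 0.04445        # 1U = 44.45 mm
RACK_UNITS = 42
OIL_DENSITY_KG_M3 = 850     # mineral oil is roughly 0.85 g/cm^3

def oil_mass_kg(fill_fraction=0.7):
    """Oil mass for the vat, assuming servers displace ~30% of the volume."""
    volume_m3 = RACK_WIDTH_M * RACK_DEPTH_M * U_HEIGHT_M * RACK_UNITS
    return volume_m3 * fill_fraction * OIL_DENSITY_KG_M3

print(round(oil_mass_kg()))  # oil alone, before servers and the vat itself
```

Even with servers displacing a chunk of the volume, that's well over half a tonne of oil before you count the hardware and the vat, which is why floor loading comes up immediately.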
As for disposal, if things are being recycled the oil is not an issue, you can use whatever degreaser you want. If you are selling the hardware, you'd need to find a degreaser that won't damage anything.
"The biggest problem I see is weight". We did have the 52U vats (104 servers per vat, using 2U / 4-node beasties) on a raised floor (I don't remember if we had to reinforce the floor or just keep the vats a ways apart from each other.)
Of course, one of the key selling points of the GRC systems is that if you are starting from scratch, you just put the vats on a concrete slab.
"You can use whatever degreaser you want". The labor here would probably eat up any value left in a 5 year old server that you wanted to send to a server "liquid-dator".
---
One interesting point GRC made about server longevity: the servers would spend their life at a constant temperature, which means no thermal expansion/contraction cycles as load increases and decreases. In the GRC vats, the speed of coolant flow through the tanks was sped up or slowed down to match load (and thus hold a constant, somewhat high temperature, iirc.)
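The flow-matching scheme described above amounts to a control loop: speed the pump up when the coolant runs warm, slow it down when it runs cool, holding the setpoint. A minimal proportional-control sketch; the setpoint, gain, and speed limits are all illustrative assumptions, not GRC's actual values:

```python
# Hypothetical sketch of flow-rate control holding a vat at constant temp.
SETPOINT_C = 45.0   # assumed target coolant temperature
KP = 0.08           # proportional gain (illustrative)

def pump_speed(current_temp_c, min_speed=0.2, max_speed=1.0):
    """Return pump duty cycle in [min_speed, max_speed].

    Above the setpoint the pump speeds up proportionally to the
    temperature error; below it the pump idles at minimum flow.
    """
    error = current_temp_c - SETPOINT_C
    speed = min_speed + KP * error
    return max(min_speed, min(max_speed, speed))
```

A real system would add integral action and sensor filtering, but the shape is the same: flow tracks load, so the silicon sees one steady temperature instead of thermal cycles.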
One negative: the list of vendors willing to warranty gear bathing in mineral oil is limited.
"You can use whatever degreaser you want". The labor here would probably eat up any value left in a 5 year old server that you wanted to send to a server "liquid-dator".
YOU wouldn't be doing that. You'd sell the greasy equipment to someone who would probably put it on a ship to Asia, where low-cost labor would degrease it and then recycle it.
"Not to mention oil wicking out of the vats via the twinax cables to the switches"
Twinax? I thought you were crazy at first, until Google showed me that the word has been recycled by the fiber-optic guys. Couldn't they have picked a less nightmare-inducing name for such a thing?
This sounded interesting until I clicked on the first link, which was to a 3-page El Reg story about the future of cooling data centers from 2015. Many of the same points were made 7 years ago, and I assume the challenges and costs were sufficient to delay adoption then as now.
Once over the hurdle of the initial investment, liquid cooling seems to have a lot to recommend it. But I don't see the development of a new dielectric material by SK Lubricants moving the needle much.
The real news is the $28m investment in GRC, which since 2009 had gained only $7m in Series B funding to go with its $430k of seed money.
I know there's a long tradition of oil-filled heatsinks, etc, but do the random collection of circuit boards, chips etc. in a PC all survive immersion in mineral oil? I thought I'd read something a while back about the oil killing some components after a while or making the boards swell/separate?
iirc the only issue with the oil itself with regard to server components was that one could not use traditional thermal paste to mate the heat sinks to the processors. Some sort of metal foil was used, I think. And yep, we used SSDs (no hard drives).
---
On the physical side, there were some access issues that made dealing with the servers different from traditional air-cooled racks: no access to the front of a server unless it was unracked ("unvatted"?)...The servers were hung face down (vertically) from the top horizontal vat rails. So if a server only had a power button on the front, we had to make sure the BIOS was set for the server to power up automatically when it was plugged in.
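On boxes with a BMC, the same power-on-with-AC behavior can usually be set remotely via the chassis power-restore policy, without touching the BIOS screen. A sketch using ipmitool; HOSTNAME/USER/PASSWORD are placeholders for your BMC address and credentials:

```shell
# Set the chassis power-restore policy so the server powers up
# as soon as AC is applied (no front-panel button needed).
ipmitool -I lanplus -H HOSTNAME -U USER -P PASSWORD chassis policy always-on

# Verify: chassis status reports the current power restore policy.
ipmitool -I lanplus -H HOSTNAME -U USER -P PASSWORD chassis status
```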
Switch placement was a bit of an issue (no one wanted to put those in the oil)...That eventually led me to do some slow-oil-leakage triage via a trip to Home Depot's garden department.
---
Overall I don't think cooling this way is bad...It's just a bit different, and the ROI spreadsheet must have looked OK after a couple of years, because we added more vats.
> Switch placement was a bit of an issue (no one wanted to put those in the oil)
BIG kilovolt, megawatt switches are often oil-filled; otherwise the arc would never go out. It also keeps air away: no air, no oxidation, no fire.
I agree the 30-cent plastic switches in our toys may be dubious in oil.
I imagine you'd have much smaller heatsinks on the CPUs, because fluid is already far better at shedding heat than air, even air moving at a furious pace thanks to an overzealous fan.
This isn't anything new; some models of Cray supercomputers (the Cray-2, for one) were immersed in Fluorinert, a substance similar to AC refrigerant, back in the 80s.