Intel details coral-shaped immersion cooler that bubbles like Mentos in Coke

Intel on Tuesday detailed a novel immersion cooling heat sink design inspired by coral reefs, which it's working on thanks to a $1.7 million award from the US Department of Energy (DoE). The x86 giant scored that cash under the DoE's COOLERCHIPS initiative, which aims to encourage tech that reduces the amount of power expended …

  1. ColinPa

    What goes around ...

    I remember mainframes being water cooled, and heating the site "for free". When they became more efficient and no longer needed water cooling, we had to install a big gas pipeline and boiler to heat our site.

  2. m4r35n357

    No elegance, no class

    Ludicrous "engineering" in the face of a hard physical limit. Yeah, just turn it up to 11 . . .

    1. Anonymous Coward

      Re: No elegance, no class

      Hey, if it works...

      Seems like the real answer is to decrease the workload. I would think dropping the collection and processing of user data for advertising, and outlawing kleptocurrencies, would significantly decrease the total computing power needed on the planet.

    2. Kevin McMurtrie Silver badge

      Re: No elegance, no class

      If you know how to compute without generating heat, the world would like to know.

      The amount of power lost putting some electrons into a FET gate and then removing them is very close to nothing. But, like Depeche Mode said, everything counts in large amounts. A GPU can have billions of FETs, and a large number of them are switching billions of times a second. It gets warm.
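(A back-of-envelope version of that, using the standard dynamic-power formula P ≈ α·C·V²·f summed over all gates. Every constant here is an illustrative assumption, not a figure from any datasheet:)

```python
# Back-of-envelope dynamic switching power, P ~= alpha * C * V^2 * f per gate,
# summed over the whole chip. Every constant is an illustrative assumption,
# not a figure from any datasheet.
C_GATE = 1e-15   # effective switched capacitance per gate, farads (assumed ~1 fF)
VDD = 1.0        # supply voltage, volts (assumed)
F_CLK = 2e9      # clock frequency, Hz (assumed 2 GHz)
ALPHA = 0.1      # activity factor: fraction of gates toggling each cycle (assumed)
N_GATES = 10e9   # gate count (assumed: ten billion)

p_per_gate = ALPHA * C_GATE * VDD ** 2 * F_CLK  # watts dissipated per gate
p_total = p_per_gate * N_GATES                  # watts for the whole chip
print(f"{p_per_gate * 1e9:.0f} nW per gate -> {p_total:.0f} W per chip")
```

With these made-up but plausible numbers, a couple of hundred nanowatts per gate sums to a couple of kilowatts per chip, which is roughly the scale the article's cooler is targeting.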

      Analog computers are far more efficient but dynamic programming remains tricky.

      1. m4r35n357

        Re: No elegance, no class

        Obviously not. But I do have engineering experience of distributed circuit effects (including GaAs MMICs) at high frequencies, parasitic resistance etc. Perhaps treat it as a real constraint and not come over all "King Canute" when faced with the uncomfortable realities. Some genius is bound to try Peltier cooling these chips at some point . . . !

        1. Anonymous Coward

          Re: No elegance, no class

          Peltier cooling is extremely inefficient compared to something like Intel's proposal. It's also quite limited in how much heat it can remove, and the parts get much more expensive at large cooling capacities. Intel's idea is pretty clever: use a liquid with a boiling point just below the target temperature, and the evaporation of the liquid absorbs far more heat than just flowing a liquid across it, much less trying to air-cool. (The phase change happens at the boiling temperature, so the heat source's surface wouldn't exceed that.) For added effect, arrange the hot surface vertically (like in the photo), and cool, fresh liquid will come in from the bottom, pulled by the rising gas bubbles - natural convection. Then the liquid at the top of the tank can be pumped into a conventional chiller. Seems like this would get past the "hard physical limit" and "uncomfortable realities" you were talking about, unless I misunderstand what those vague terms refer to.
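(To put rough numbers on the phase-change advantage: compare the coolant mass flow needed to carry away 2 kW by merely warming a dielectric liquid against boiling it. The fluid properties below are ballpark assumptions for a fluorocarbon-style coolant, not values for any specific product:)

```python
# Coolant mass flow needed to remove a 2 kW heat load: single-phase (just warm
# the liquid) vs two-phase (boil it). Fluid properties are ballpark assumptions
# for a fluorocarbon-style dielectric coolant, not any specific product's data.
Q = 2000.0        # heat load, watts (the article's 2 kW figure)
CP = 1100.0       # specific heat of the liquid, J/(kg*K) (assumed)
DT = 10.0         # allowed liquid temperature rise, K (assumed)
H_VAP = 88_000.0  # latent heat of vaporization, J/kg (assumed)

single_phase_flow = Q / (CP * DT)  # kg/s needed if we only warm the liquid
two_phase_flow = Q / H_VAP         # kg/s needed if we boil it instead
print(f"single-phase: {single_phase_flow:.3f} kg/s, "
      f"two-phase: {two_phase_flow:.3f} kg/s, "
      f"ratio: {single_phase_flow / two_phase_flow:.1f}x")
```

Under these assumptions, boiling moves the same heat with roughly an eighth of the pumped mass flow, which is why the phase change matters.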

          (An example of a good application of Peltier cooling is keeping samples cool while viewing them under a microscope. Conventional cooling hardware is physically too bulky for that.)

      2. PRR Bronze badge

        Re: No elegance, no class

        Analog computers are far more efficient

        Citation needed.

        I have never seen an analog computer come remotely close to the quantum limit of computing. In general, if the standing current is not large compared to signal, the self-noise disturbs the accuracy.

        Or: a digital hearing aid drawing 2 milliwatts does more sound correction than a 200 WATT rack of op-amps. (I used to design PA sound processors, and have that hearing aid in my ear.)

        1. Kevin McMurtrie Silver badge

          Re: No elegance, no class

          I don't know who built a board crammed with op-amps but I can say that it's not a normal thing to do. They're bulky, expensive, and add too much signal latency.

          There are companies building analog chips for battery powered AI. Last I checked, the tech was linking flash or memristor elements with simple analog circuits to produce low precision computations with an extremely high power efficiency.

  3. juul

    If this is the true reason Intel got $1.7 million, then they should not have gotten a dime.

    [quote] The x86 giant scored that cash under the DoE's COOLERCHIPS initiative, which aims to encourage tech that reduces the amount of power expended on datacenter cooling to less than five percent of the energy expended on IT itself. [/quote]

    They still need the same amount of cooling. They've just focused the cooling on the liquid.

    1. Anonymous Coward

      Correct, the processors still need the same amount of cooling. The goal is to use less energy in the cooling process. Phase changes take lots of heat, which typically makes a system like this more efficient than convective cooling with a single phase fluid.

      As a bonus, more efficient cooling means a nominal decrease in the total waste heat to remove from the entire system (less heat generated by the cooling system itself).
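(For scale, the COOLERCHIPS five-percent target quoted above works out like this; the 1 MW IT load is a purely hypothetical figure:)

```python
# The DoE COOLERCHIPS target: cooling energy below 5% of IT energy. The 1 MW
# IT load is a purely hypothetical figure chosen for scale.
IT_LOAD_KW = 1000.0     # hypothetical data-center IT load, kW (assumed)
TARGET_FRACTION = 0.05  # the <5% target from the article
cooling_budget_kw = IT_LOAD_KW * TARGET_FRACTION             # max cooling power
partial_pue = (IT_LOAD_KW + cooling_budget_kw) / IT_LOAD_KW  # facility overhead ratio
print(f"cooling budget: {cooling_budget_kw:.0f} kW, partial PUE: {partial_pue:.2f}")
```

That is, a 1 MW hall would get at most 50 kW for cooling, a cooling-only overhead ratio of about 1.05.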

  4. martinusher Silver badge

    Intel Invents Boiling!

    It's the phase change. It requires significant amounts of energy. So I guess that "Intel Invents Boiling" is the headline.

    All Intel needs is a suitable fluid to immerse their chips in and we're all set. A fluid like the sort that's used in A/C systems. The sort of fluid that's relatively expensive, carefully recycled because even the 'non ozone hole' sorts aren't good for the environment, and, as a sort of bonus, somewhat hygroscopic.

    Based on my experience with computing at home all we really need to cut energy use is cut the software cruft that infests our machines -- snoopware, advertising, overweight design and so on. Chief power offenders on my system are Google Chrome and (of course) Windows. Cycle hogs that demand ever increasing processor power (and multiple cores because people can't design code that doesn't get stuck in busy/wait loops.....)

  5. DJV Silver badge

    Maybe...

    ...future Intel chips will come with a washing temperature label like clothes do.

    I wonder how many will ignore the "Do not iron" graphic on the label?

  6. Pascal Monett Silver badge

    Intel, Nvidia, HP, Raytheon

    Intel: full year 2022 results? $63.1 billion

    Nvidia 2022: $26.91 billion

    HP 2022: $63 billion

    Raytheon: 2022 income at $5.2 billion (for some reason, Raytheon prefers to officially talk about its sales and share price instead of its revenue)

    So, tell me why these companies needed a few piddling millions of government money to do that research.

    That money should have been awarded to startups that had a good idea (okay, have to find some, but still).

    1. John Brown (no body) Silver badge

      Re: Intel, Nvidia, HP, Raytheon

      "So, tell me why these companies needed a few piddling millions of government money to do that research."

      When you are big enough/rich enough, you never spend your own money. That's how you get to stay big/rich :-)

  7. Hurn

    Why 2KW?

    "cope with two kilowatt chips"

    Does this mean Intel is planning on introducing 2KW chips, but realized they'll first need a cooler / cooling environment, and managed to get the US DoE to fork out $ for the research?

    Maybe someone else already has a 1 KW thermal solution, so now, they're going for 2?

    Guess it's not a matter of "if," but "when"

    Hmm.. change out the working fluid for something that boils at a much lower temp, and maybe this thing can work for quantum computers, too?
