Liquid Cooling in the Data Center: State of the Industry

[Image: 42U rack filled with GPU-accelerated servers]

Paul Rubens over at Enterprise Networking Planet wrote an article last week that looks at the current state of liquid cooling technologies available to data centers. The piece, Liquid Cooling Gaining in Popularity Again, presents some key reasons that liquid cooling is gaining traction in the data center:

…thanks to spiraling energy costs, corporate green initiatives and new high-tech coolants, the concept of liquid cooling in the data center is enjoying a renaissance.

While these can be factors in the decision to switch to liquid cooling, we find customers turning to our liquid cooling solution for other reasons. For example, the CarnotJet system offers:

  • Lower upfront costs and lower build-out costs than air cooling
  • Reduction of cooling power use by 90-95%, server power use by 5-25%, and overall data center power use by 40-45% (see the worked example after this list)
  • The ability to cool the densest hardware available today, including GPUs, with support for up to 100 kW of compute per 42U rack
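
To see how those component savings can add up to an overall reduction in the 40-45% range, here is a rough back-of-envelope sketch in Python. The baseline figures (a 100 kW IT load, a PUE of 1.8, and the split of overhead between cooling and everything else) are illustrative assumptions, not published GRC numbers:

    # Back-of-envelope composition of the component savings listed above.
    # All baseline figures are illustrative assumptions.
    it_load_kw     = 100.0   # assumed server (IT) power, including internal fans
    cooling_kw     = 70.0    # assumed cooling power in the air-cooled baseline
    other_kw       = 10.0    # assumed distribution losses, lighting, etc.
    baseline_total = it_load_kw + cooling_kw + other_kw   # 180 kW, i.e. PUE 1.8

    cooling_reduction = 0.925  # midpoint of the 90-95% cooling-power reduction
    server_reduction  = 0.15   # midpoint of the 5-25% server-power reduction (fans)

    immersion_total = (it_load_kw * (1 - server_reduction)
                       + cooling_kw * (1 - cooling_reduction)
                       + other_kw)

    overall_saving = 1 - immersion_total / baseline_total
    print(f"Baseline total:  {baseline_total:.0f} kW")
    print(f"Immersion total: {immersion_total:.1f} kW")
    print(f"Overall reduction: {overall_saving:.0%}")   # roughly 44%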

The article continues with a description of current dielectric fluids in use:

These liquids do not conduct electricity so they can be in direct contact with electronics without causing any damage, and since they are many times better than air at capturing and transporting heat they offer the prospect of dramatically more energy-efficient cooling than is possible using the conventional chilled air approach.

It’s no secret that GreenDEF™ dielectric fluid coolant is highly refined mineral oil. Unlike Fluorinert and other coolants of the past, GreenDEF is inexpensive, and because it is essentially baby oil, it is safe for both people and the environment. There’s also no risk of evaporation or spoilage, which allows for our easy-access, open-system design.

Rubens does a nice job discussing the benefits of submersion cooling in the article:

The potential benefits of liquid cooling are significant. For starters, a suitably designed system can capture almost all of the heat generated by a server’s components, so there is no need to power internal fans to assist airflow. That in itself can reduce server power consumption by about 30 percent. But the main savings come from reduced air cooling costs. Since heat from the servers is captured by the coolant and removed without warming the air around the server racks, there is little or no need for computer room air conditioning (CRAC) equipment. And since the electricity needed to power CRAC equipment, chilling plants and other cooling equipment may account for as much as 30 percent of data center running costs, the potential savings are enormous.

Removing server fans is an important energy-saving practice in our submersion cooling system. In a liquid environment, fans are obsolete! As GRC Founder Christiaan Best said in an interview at the Texas Advanced Computing Center, submersion cooling replaces managed air flow with managed liquid flow. And as Team Texas discovered at the Student Cluster Competition during SC’11, removing server fans can provide significant energy savings, typically 5-25%.
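
Where does that range come from? Internal fans draw a meaningful share of a server’s total power. The short sketch below illustrates the arithmetic with hypothetical fan and server wattages; the specific numbers are assumptions for illustration, not measurements from Team Texas’s hardware:

    # Illustrative only: hypothetical fan and server wattages, not measured data.
    fans_per_server = 6
    watts_per_fan   = 12.0    # assumed draw of one high-RPM chassis fan under load
    server_power_w  = 450.0   # assumed total draw of a dual-socket 1U server

    fan_power_w  = fans_per_server * watts_per_fan   # 72 W
    fan_fraction = fan_power_w / server_power_w      # ~16%, within the 5-25% range
    print(f"Fan power: {fan_power_w:.0f} W ({fan_fraction:.0%} of server draw)")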

And because liquid so efficiently captures server heat and makes it accessible, it is also possible to reuse that heat. In November, Green Revolution Cooling announced heat recapture for data centers at KTH (Royal Institute of Technology) in Stockholm, Sweden. Working with GRC’s technology, researchers at KTH have achieved 50°C water output using commodity server hardware, a resource that can be useful year round. 
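
For a sense of scale, the sketch below estimates how much 50°C water a fully loaded immersion rack could supply to a heat-reuse loop. The rack load and the loop temperatures are illustrative assumptions, not figures from the KTH installation:

    # Rough estimate of reusable heat from one immersion-cooled rack.
    # The rack load and water-loop temperatures are illustrative assumptions.
    rack_load_kw    = 100.0          # assumed IT load; nearly all of it leaves as heat
    c_p             = 4186.0         # specific heat of water, J/(kg*K)
    t_in_c, t_out_c = 30.0, 50.0     # assumed water-loop inlet/outlet temperatures

    heat_w    = rack_load_kw * 1000.0
    flow_kg_s = heat_w / (c_p * (t_out_c - t_in_c))   # ~1.2 kg/s of water
    litres_hr = flow_kg_s * 3600.0                    # ~4,300 litres of 50°C water per hour
    print(f"{flow_kg_s:.2f} kg/s of water heated to {t_out_c:.0f}°C "
          f"({litres_hr:.0f} litres per hour)")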

The article offers a paragraph about Green Revolution Cooling, specifically:

Green Revolution Cooling, a Texas-based company, offers probably the most radical departure from traditional data center cooling. Its CarnotJet system is based on the concept of dunking: literally placing an entire server rack into a tank of its GreenDEF coolant.

It may seem like a radical departure from the norm, but the dramatically reduced costs and increased performance are a welcome change.
