GRC's Cold Hard Facts Series - Liquid-Immersion Cooling Myths Debunked: Part 1

Submitted by Dhruv Varma, GRC Director of Product Marketing
On: May 20, 2019

The rise of applications such as AI, IoT, AR, and microservices has many data center operators scrambling. They're looking for ways to break through barriers around rack density, power availability, location options, and data center design as a whole. Liquid-immersion cooling has long been seen as a highly effective solution to all of these challenges. But, as with any non-traditional approach, myths abound. In this first installment of a multi-part series, we'll separate fact from fiction and explain just how effective data center immersion cooling can be. You'll learn how it can slash costs, speed implementation, improve performance, and future-proof your data center.

Myth #1: Liquid-immersion cooling is a new, unproven technology

Our solutions have been battle-tested and proven to work. Perfected over the course of a decade, our liquid-immersion cooling systems have been installed by some of the world's biggest operators across a breadth of industries. The list includes three of the top 10 telecoms, top supercomputing facilities, one of the web's largest search engines, and government agencies such as the DoD, the Air Force, and the NSA.

First deployed in 2010, GRC's ICEraQ™ micro-modular, rack-based liquid cooling system is now used in cutting-edge data centers around the world, racking up millions of hours of reliable run-time.

Myth #2: You can't use your preferred servers

Nothing could be further from the truth. GRC's ICEraQ and ICEtank™ systems support rack-based servers from all major OEMs, including Dell, Supermicro, HP, Cisco, and many others. These servers require only fan removal before submersion.

If you're considering a "white box" approach and building your own servers, or if you simply want to maximize space, power, and budget, our systems also easily accommodate servers designed for immersion (SDI). These can include premium components from brands like NVIDIA®, Supermicro, and others.

Learn More About Servers for Immersion Cooling

Myth #3: Immersion cooling is only for high-density applications

Yes, liquid-immersion cooling solutions like those offered by GRC can cool over 100 kW per rack, making them ideal for high-density applications. But immersion cooling is also one of the best ways to solve an array of troublesome (but all too common) data center challenges. It's truly multi-talented.

If you're out of power, or trying to overcome the rising cost of power, this highly energy-efficient technology enables you to cut cooling energy by up to 95%, so you get more compute from your power envelope. Out of space? With its higher rack densities, liquid-immersion cooling lets you pack more computing power into your existing data center footprint. And because it requires none of the infrastructure of a traditional data center (raised floors, chillers, and so on), it lets you occupy spaces within your building that otherwise couldn't be used as a data center.
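The "up to 95%" figure is easiest to appreciate with a little arithmetic. The sketch below is purely illustrative: the 1 MW IT load and the 500 kW of traditional cooling overhead are round-number assumptions of ours, not GRC-published data.

```python
# Illustrative only: the IT load and traditional cooling overhead are
# assumed round numbers, not measured or GRC-published figures.
IT_LOAD_KW = 1000.0         # assumed IT (server) load
TRAD_COOLING_KW = 500.0     # assumed cooling overhead for an air-cooled site

# Apply the "cut cooling energy by up to 95%" claim to the assumed baseline.
immersion_cooling_kw = TRAD_COOLING_KW * (1 - 0.95)

trad_total = IT_LOAD_KW + TRAD_COOLING_KW
imm_total = IT_LOAD_KW + immersion_cooling_kw

print(f"Traditional: {trad_total:.0f} kW total draw "
      f"(cooling-only PUE ~{trad_total / IT_LOAD_KW:.2f})")
print(f"Immersion:   {imm_total:.0f} kW total draw "
      f"(cooling-only PUE ~{imm_total / IT_LOAD_KW:.3f})")
print(f"Power freed for more compute: {trad_total - imm_total:.0f} kW")
```

Under these assumptions, the same 1.5 MW utility feed that supported 1 MW of air-cooled IT load could support roughly 1.46 MW of immersion-cooled IT load.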

GRC also offers a modular “data center in a box” solution. These empower you to easily and quickly deploy data center capacity virtually anywhere – in a parking lot, for example, or in harsh environments where other systems could never operate.

Liquid-immersion cooling is also a great choice if you need to expand but want to avoid high capital expenditures. Our customers report significant CAPEX savings at rack densities as low as 15 kW.

Liquid-Immersion Cooling Capacity: 100+ kW/rack
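To see what that 100+ kW/rack figure means for floor space, here's a hypothetical comparison. The 1 MW load and the 10 kW/rack air-cooled density are assumptions chosen for illustration; only the 100 kW/rack figure comes from the text above.

```python
import math

# Hypothetical footprint comparison. The 1 MW load and 10 kW/rack
# air-cooled density are assumptions; 100 kW/rack is the figure cited above.
IT_LOAD_KW = 1000
AIR_KW_PER_RACK = 10
IMMERSION_KW_PER_RACK = 100

air_racks = math.ceil(IT_LOAD_KW / AIR_KW_PER_RACK)
immersion_racks = math.ceil(IT_LOAD_KW / IMMERSION_KW_PER_RACK)

print(f"Air-cooled: {air_racks} racks; immersion-cooled: {immersion_racks} racks")
print(f"Footprint reduction: {air_racks // immersion_racks}x fewer racks")
```

Fewer racks means less floor space, less cabling, and fewer containment structures for the same compute.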

Myth #4: Immersion cooling is expensive

Our ICEraQ micro-modular system and ICEtank modular liquid-cooled "data centers in a box" have helped many customers build out data centers for up to 50% less than traditional alternatives.

Estimate Your Savings with Our TCO Calculator

What's more, these installations continue to deliver significant cost savings on energy and maintenance, resulting in a lower Total Cost of Ownership (TCO) and an excellent long-term return on investment.

There's also some confusion between single- and two-phase immersion cooling; the latter is more complicated, and hence more expensive. For the record, GRC helped pioneer the simpler (and less costly) single-phase immersion cooling approach for data centers.

Myth #5: The coolant is hazardous or messy

GRC systems use a clear, odorless, non-toxic coolant called ElectroSafe™. ElectroSafe is a high-performance synthetic blend with 1,200 times the heat capacity of air by volume, offering exceptional, low-cost cooling for data center servers.
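The "1,200x the heat capacity of air by volume" figure can be put in perspective with a back-of-the-envelope calculation. Air's volumetric heat capacity (about 1.2 kJ per cubic meter per kelvin at room temperature) is a textbook constant; the coolant value below is derived from the stated ratio, not taken from a datasheet.

```python
# Back-of-the-envelope look at the 1,200x volumetric heat capacity claim.
# The air value is a textbook constant; the coolant value is derived from
# the stated ratio rather than from a product datasheet.
AIR_VOL_HEAT_CAP = 1.2                            # kJ/(m^3*K), air at ~20 C
coolant_vol_heat_cap = AIR_VOL_HEAT_CAP * 1200    # per the 1,200x figure

heat_kj = 3600.0    # 1 kWh of server heat dumped into 1 m^3 of each medium
air_rise = heat_kj / AIR_VOL_HEAT_CAP
coolant_rise = heat_kj / coolant_vol_heat_cap

print(f"1 kWh into 1 m^3 of air:     ~{air_rise:.0f} K temperature rise")
print(f"1 kWh into 1 m^3 of coolant: ~{coolant_rise:.1f} K temperature rise")
```

This is why a quiet, slow-moving bath of coolant can absorb heat that would otherwise require enormous volumes of fast-moving air.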

The coolant does not readily evaporate or ignite, does not react with water, and remains very stable, even when exposed to flame.

Read More About ElectroSafe Coolant

Myth #6: Immersion-cooled servers are hard to maintain

Liquid immersion is definitely a different approach to data center cooling. Different, but better.

GRC's single-phase systems are engineered from the ground up for operational simplicity. The open-rack design enables quick access to server equipment, and movable server support bars allow hot-swapping of components right over the rack. Service rails create a convenient, waist-high workbench, so you'll never need to lift the heaviest (densest) server from the bottom of a rack in order to work on it.

Watch: Servicing an Immersion-Cooled Server

But that's not the only way liquid-immersion cooling contributes to a better work environment. It's also virtually silent, so you can hold a conversation and problem-solve right at the rack. And there are no hot and cold aisles to deal with.

Finally, because our specially formulated ElectroSafe coolant is very similar to mineral oil (the main component in baby oil, eye drops, etc.), it's easy to handle and work with. It never needs replacing and will last the lifetime of the data center, or even longer.

Got an opinion? Let's hear it!

Myths die hard. But the truth lives forever. If you’ve taken a stand on liquid-immersion cooling, we’d love to have you share it. Just post a reply in the Comments section below. Or send us an email at info@grcooling.com. A GRC associate will reach out and continue the conversation.

Now, check out the next installment in this series: Cold Hard Facts of Immersion Cooling – Part 2