The Future of HPC and Immersion Cooling: What’s Ahead?

High-performance computing (HPC) has been pivotal in accelerating scientific research and facilitating innovations. In fact, recent projections show the HPC market is expected to grow to $49.9 billion by 2027.

But what exactly is HPC? And what does it take to maintain it?

HPC is the use of advanced computing techniques to process big data and solve complex calculations at fast speeds. In recent years, high-performance computing has been used for climate modeling, drug discovery, and even genomics. It’s also been helpful in virtual prototyping for engineering industries and risk analysis for the financial sector.

But HPC demands intensive processing power. It can quickly generate heat, which degrades performance and reduces hardware reliability. So, it’s crucial to implement an efficient cooling system. In this guide, we’ll discuss the benefits of immersion cooling systems for HPC and the future of the industry.

The Development of Immersion Cooling in HPC

The use of immersion cooling dates back to the 1800s, though at the time the technique was used only for the thermal management of electrical systems like transformers. The first recorded application to computing came in 1966, when Oktay Sevgin of IBM used dielectric fluids to cool computer hardware. In 1968, two other IBM engineers, Richard Chu and John Seely, patented an immersion cooling system for modularly packaged components.

At this time, high-performance computing was in its infancy, but it was growing quickly. As supercomputers advanced, they required more maintenance and better cooling to keep them running as efficiently as possible. But air cooling and conventional liquid cooling couldn’t dissipate the heat from densely packed electronic components. To address this, Cray Research, Inc. founder Seymour R. Cray Jr. conceptualized the immersion cooling technique for high-density electronic assemblies.

HPC data center

Using Cray’s approach, the electronic components of an HPC system are submerged in a dielectric fluid with high thermal conductivity. This allows for rapid heat dissipation and eliminates the need for air circulation. It also lets operators run much higher hardware densities without sacrificing computational power or efficiency.

This innovation has marked a significant shift in the HPC industry and established new benchmarks for performance and reliability. It also paved the way for advancements in cooling technologies across industries.

Today, companies like Green Revolution Cooling (GRC) have become torchbearers for immersion cooling innovation. For example, GRC rebooted the open bath immersion cooling concept in 2009 to handle higher power densities and operate in a cleaner, quieter, more cost-effective way.

Benefits of Immersion Cooling for HPC

Immersion cooling offers many benefits for high-performance computing environments. It’s an increasingly attractive solution for organizations that want to maximize performance, efficiency, and sustainability.

Let’s explore some of these benefits further.

  • Enhanced Thermal Efficiency: Liquid has a much higher heat capacity and thermal conductivity than air, so it can absorb and dissipate heat more efficiently. This keeps hardware components within optimal operating temperatures and prevents thermal throttling.
  • Improved Energy Consumption: Because immersion cooling doesn’t rely on fans and air conditioning units, it minimizes energy consumption and overall operational costs. Immersion cooling systems can achieve a power usage effectiveness (PUE) of around 1.2, compared with the PUE of 2.0 typical of air-cooled facilities. In other words, cooling and other overhead add roughly 20% to the IT load rather than doubling it (see the quick calculation after this list).
  • Minimized Noise Pollution: With no server fans or air handlers, immersion systems run nearly silently. This reduction in noise levels offers a quieter and more comfortable working environment for data center staff.
  • Longer Hardware Lifespan: Using immersion cooling for HPC enables you to maintain stable operating temperatures for your hardware. This prevents thermal stress and reduces wear and tear on hardware components like CPUs, GPUs, and memory modules. As a result, you can reduce maintenance costs and enhance the longevity of your HPC infrastructure.
  • High Scalability: Immersion cooling architectures can accommodate higher hardware densities within the same physical footprint. This scalability uses both resources and space efficiently and facilitates seamless growth without compromising performance.
  • Reduced Carbon Footprint: Since it maximizes HPC efficiency while lowering electricity consumption, immersion cooling can significantly reduce the carbon footprint of your infrastructure. Organizations that adopt this kind of cooling system can demonstrate their commitment to sustainability and environmental stewardship.
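
To make the PUE comparison concrete, here is a minimal sketch of the arithmetic. The 1 MW IT load is a hypothetical assumption; the PUE figures are the ones cited above.

```python
# Hypothetical comparison of facility overhead at PUE 2.0 (air) vs. 1.2 (immersion).
# PUE = total facility power / IT power, so overhead = (PUE - 1) * IT power.

IT_LOAD_KW = 1_000  # assumed 1 MW of IT equipment

def overhead_kw(pue: float, it_load_kw: float) -> float:
    """Cooling and other non-IT power implied by a given PUE."""
    return (pue - 1.0) * it_load_kw

air_overhead = overhead_kw(2.0, IT_LOAD_KW)        # 1,000 kW of overhead
immersion_overhead = overhead_kw(1.2, IT_LOAD_KW)  # 200 kW of overhead

savings_pct = 100 * (air_overhead - immersion_overhead) / air_overhead
print(f"Overhead drops from {air_overhead:.0f} kW to {immersion_overhead:.0f} kW "
      f"(~{savings_pct:.0f}% less non-IT energy).")
```
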
what is HPC

Market Trends and Growth

Immersion cooling technology has come a long way since its inception, emerging as a transformative thermal management solution in data center and HPC environments. In addition, the use of this method has extended beyond traditional HPC settings. With its increasing popularity, experts project the immersion cooling market to increase at a compound annual growth rate (CAGR) of 24.2% from 2022 to 2030. This would amount to a market value of $1.7 billion.

Recent data attributes this growth to several factors, including the expansion of Internet of Things (IoT) technology, the rise of quantum computing applications, and the popularity of cryptocurrency. As the world embraces these technologies, we can also expect immersion cooling to be at the forefront of ongoing sustainability efforts within the tech industry.

HPC Cooling Challenges and Considerations

Switching to an immersion cooling system for your HPC infrastructure can pose several challenges. After all, this kind of system requires a different approach to hardware design and deployment. Organizations that intend to adopt it will need to understand its deployment and maintenance requirements.

One of the primary perceived hurdles in adopting immersion cooling for HPC is the upfront investment. That is, you must either retrofit existing data center infrastructure or build new facilities designed for immersion cooling. Acquiring specialized equipment—such as immersion tanks, pumps, and fluid management systems—does come at a cost.

Upon further inspection, however, this need not be the challenge it seems. The cost factor mainly affects businesses retrofitting infrastructure that previously depended on air cooling. If you’re building a new data center, it can actually prove less expensive to install immersion cooling from the get-go. And even if the cost of retrofitting seems daunting upfront, the savings over time from reduced energy consumption will more than offset the expenditure.

Perhaps of more salience are the technical considerations. First of all, the system you choose should be compatible with your existing infrastructure. It should accommodate your HPC’s different components, such as servers, GPUs, CPUs, and storage devices. This can help minimize implementation risks and ensure long-term interoperability.

Compatibility aside, you should also look into the integration process with your existing data center infrastructure. Try to evaluate how much time and effort it would take to manage power distribution and networking, among other things. This can help you calculate overall implementation costs more accurately.

Due to the increasing use of big data, HPC infrastructures are under pressure to keep up with the demand. So, the immersion cooling system you invest in must also be scalable enough to continually adapt to the changing workload demands.

Immersion Cooling—A Solution For Sustainable HPC

The use of immersion cooling in HPC environments signifies a transformative shift toward enhanced performance and sustainability. As it becomes a more popular choice among data centers, it can encourage innovation among industry stakeholders and drive advancements in cooling technology. Looking ahead, this may lead to the development of novel cooling architectures and thermal management best practices.

Stay ahead of the curve by investing in immersion cooling technology. Contact GRC today to discuss your data center cooling needs.


Exploring Advanced Liquid Cooling: Immersion vs. Direct-to-Chip Cooling

Our digital world is expanding faster than ever, and data centers work hard to process all our Netflix binges and online transactions. Behind the scenes, there’s a crucial player that helps these data centers run smoothly: cooling systems. In general, data center operators have two options for keeping their equipment cool. Air cooling has long been the go-to method, but liquid cooling is gaining ground.

When it comes to cooling capacity, liquid cooling far outperforms air cooling and is also highly efficient and cost-effective. In this article, we’ll look at two types of liquid cooling: immersion and conductive (direct-to-chip).

liquid cooling

Liquid Cooling: The Basics

Data centers are packed with racks on racks of servers, each one computing data at lightning speed. All that processing creates lots of heat. Liquid cooling exposes some or all of the servers to a coolant running through a network of pipes. When the liquid reaches the hot spots, it soaks up that excess heat.

The now-warm liquid then makes its way back to a heat exchanger, which transfers heat away from the liquid (typically to a water loop connected to a heat rejection system). With the heat dissipated, the liquid flows back to the equipment to repeat the cycle.

Liquid conducts heat far better than air, which makes this a much more efficient way to absorb excess heat: liquid cooling typically consumes 10–50% less energy than conventional air cooling. But as we’ve noted, liquid cooling comes in several forms. Direct-to-chip (DTC) cooling—one of the most common methods—circulates liquid coolant through channels or cold plates that come into direct contact with the hot components, such as central and graphics processing units (CPUs and GPUs).
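
As a rough illustration of why liquid is such an effective heat carrier, the sketch below uses the basic heat balance Q = ṁ · cp · ΔT to estimate the coolant flow needed to remove a rack’s heat. The rack power, fluid properties, and temperature rise are illustrative assumptions, not specifications for any particular system.

```python
# Illustrative heat balance: flow rate needed to remove a given heat load.
# Q = mass_flow * specific_heat * delta_T  =>  mass_flow = Q / (cp * delta_T)

RACK_HEAT_W = 50_000       # assumed 50 kW rack
CP_COOLANT = 1900.0        # J/(kg*K), typical of a mineral-oil-class dielectric fluid (assumed)
DENSITY_COOLANT = 850.0    # kg/m^3 (assumed)
DELTA_T = 10.0             # K temperature rise across the rack (assumed)

mass_flow = RACK_HEAT_W / (CP_COOLANT * DELTA_T)           # kg/s
volume_flow_lpm = mass_flow / DENSITY_COOLANT * 1000 * 60  # liters per minute

print(f"~{mass_flow:.1f} kg/s of coolant (roughly {volume_flow_lpm:.0f} L/min) "
      f"carries away {RACK_HEAT_W / 1000:.0f} kW at a {DELTA_T:.0f} K rise.")
```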

On the other hand, immersion cooling fully immerses the IT equipment in a non-conductive liquid coolant. The coolant absorbs heat from all parts, not just specific components, and carries it away to a heat exchanger located outside the immersion tank. The thermal energy is transferred to a separate cooling system, such as a chilled water loop. After this, the coolant returns to the immersion tank, and the cycle repeats. However, it’s important to note that while this technique cools the hottest immersed components quite effectively, other components may still need supplemental air cooling.

Both methods are an efficient way to keep servers cool right at the source. But let’s explore the differences between these two methods to better determine which might be best for your needs.

liquid immersion cooling

Comparative Analysis: Immersion vs. Direct-to-Chip Cooling

Both liquid immersion and direct-to-chip cooling aim to tackle the heat generated by high-performance processors. But they have distinct features that make them unique. Here’s what you should consider when choosing the right solution for your data center:

Thermal Resistance

While both methods are dramatic improvements over conventional chilled air, direct-to-chip cooling dissipates slightly more heat than liquid immersion cooling. This is because the fluid circulated through a direct-to-chip system is typically cooler than the fluid in immersion cooling systems, providing lower thermal resistance and transferring more heat away from components at the chip level. However, direct-to-chip cooling doesn’t cool other components, such as the hard disks. You would need an additional means of cooling for the rest of the equipment—typically chilled air—negating many of the potential gains in operational efficiency.
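
One way to see the thermal-resistance point is the simple relationship T_chip ≈ T_coolant + R_th × P. The sketch below plugs in illustrative numbers; the coolant temperatures, thermal resistances, and chip power are assumptions chosen only to show the trade-off, not measured values for either technology.

```python
# Illustrative chip temperature: T_chip = T_coolant + R_th * power
# Lower thermal resistance and/or cooler fluid => lower chip temperature.

CHIP_POWER_W = 400.0  # assumed CPU/GPU package power

scenarios = {
    # name: (coolant temperature in C, thermal resistance in K/W) -- illustrative values only
    "direct-to-chip cold plate": (30.0, 0.05),
    "single-phase immersion":    (40.0, 0.07),
}

for name, (t_coolant, r_th) in scenarios.items():
    t_chip = t_coolant + r_th * CHIP_POWER_W
    print(f"{name}: ~{t_chip:.0f} C at {CHIP_POWER_W:.0f} W")
```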

Cost

If your organization already has chilled-air infrastructure in its data centers, the initial setup costs for immersion cooling need to be considered, as you will need to invest in immersion tanks and coolant circulation systems. In the long term, though, the investment pays off because it reduces energy consumption significantly. And if you are designing and building a data center from the ground up, immersion’s simpler infrastructure requirements (no air handlers, chillers, in-row cooling, and so on) can dramatically reduce capital expenses. Eliminating fans and insulating your IT equipment from dust, static electricity, condensation, and vibration can also extend the lifespan of your equipment.

Direct-to-chip cooling requires significant upfront investment in specialized equipment and infrastructure as well—coolant distribution units, cooling infrastructure to circulate fluid to each individual CPU or GPU, and so on. But, like immersion, it uses energy efficiently and reduces electricity consumption, delivering significant cost savings in the long term. However, because direct-to-chip only cools individual components, it doesn’t provide as dramatic a reduction in energy consumption, since you still need supplemental cooling for the rest of your equipment.

Scalability

Both solutions are highly scalable and capable of cooling high-density deployments. Immersion tanks can easily incorporate additional servers or racks, allowing you to add capacity as needed without greatly increasing your compute footprint. Direct-to-chip cooling can also be extended to additional individual processors, allowing you to leverage existing air-cooled data center space and increase compute capacity without a significant increase in floor space.

Maintenance

Immersion cooling systems require regular maintenance to check the integrity of the coolant and prevent contamination. You may also need to adjust the fluid levels periodically, such as when servers are added to or removed from an existing deployment.

Direct-to-chip cooling doesn’t need external coolant tanks. But this method also requires regular inspections and maintenance to ensure optimal performance.

Safety

The liquids used in immersion and direct-to-chip cooling are non-reactive and non-toxic. However, both methods do carry the risk of leakage—though the risk is greater with the direct-to-chip method. It has several touchpoints with your hardware, and a leak at any point can cause catastrophic damage and a complete shutdown.

cooling system for IT equipment

Future Outlook and Trends in Liquid Cooling

The demand for high-performance computing is rising, which calls for better solutions to keep data centers cool. With the integration of AI, next-generation AI data centers are poised for even bigger growth. There’s an insatiable appetite for AI because it can handle intensive computing workloads, and with that, the processing load will only increase.

GPUs are also becoming exponentially more powerful, with escalating energy demands to match. For example, Nvidia’s latest GPU system enables “supercharged generative AI workloads.” It’s projected to consume a maximum of 10.2 kW, roughly 160% of the previous generation’s power draw.

Fortunately, liquid cooling systems can meet this challenge. While the technology is still in its early-adopter phase, experts project liquid cooling will grow into a $1.6 billion industry by 2027. In other words, data centers are eager to make the switch to a greener solution to support their growth.

Several key players are already taking note of this disruptive technology. To return to Nvidia’s news above, the company made the pivot early and realized the gains—it launched its first liquid-cooled AI system in 2022 and has since said it will use liquid to cool its GPUs going forward. Now it’s time for other data centers to follow suit. In the coming years, growing technologies such as cloud computing, machine learning, and media streaming will also turn to liquid cooling to keep their applications running.

Stay Cool With Liquid Immersion Cooling

Data centers need to scale their technologies rapidly. And while this growth increases the heat load tremendously, Green Revolution Cooling (GRC) can meet the demand. We’re the pioneers of single-phase immersion cooling technologies. Our solution is vetted by IT giants such as Dell and Intel, and used by an international network of major clients.

Contact GRC today to learn how your data center can slash costs and grow its computing power easily and efficiently.


The Benefits of Immersion Cooling for Colocation Providers

Colocation data centers are feeling the heat, quite literally. Every large company that provides an online service or product relies on data centers to house their servers and IT infrastructure. They store and process millions of gigabytes of data, which heats up the equipment.

It may surprise you to learn that cooling colocation data centers requires as much electricity as running the actual servers. And as the demand for computing power skyrockets, these data centers will need to invest even more to keep their processors from heating up.

That’s where immersion cooling can help. This involves submerging IT hardware directly into a non-conductive liquid coolant, such as mineral oil or dielectric fluid. It’s much more energy-efficient and can cut down your costs significantly.

Let’s look at six reasons immersion cooling is a game-changer for colocation providers.

Energy Efficiency and Cost Savings

The demand for immersion cooling for colocation data centers is growing rapidly. Its market is projected to grow by $1.6 billion by 2030 for a very big reason: it slashes energy consumption drastically. Air cooling units can account for up to 40% of operational costs.

To put it in perspective, some estimates suggest that for every watt of electricity consumed by IT equipment in a data center, an almost equal wattage is required for cooling purposes.

colocation

Immersion cooling can address this significant energy consumption. Liquid-based cooling solutions absorb heat faster than air coolers. They also allow for targeted cooling, which is nearly impossible to achieve with air cooling systems.

There might be some upfront costs to get everything set up, but the return on investment (ROI) is worth it. In fact, immersion cooling can reduce energy consumption by up to 50%. You’ll have lower energy bills, reduced maintenance costs, and even potential tax incentives for going green.
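
For a rough sense of that ROI, the sketch below estimates a simple payback period. Every input (annual energy spend and upfront cost) is a placeholder assumption you would replace with your own figures; only the “up to 50%” energy reduction comes from the paragraph above, and it is treated as a best case.

```python
# Simple payback estimate for an immersion cooling deployment (inputs are placeholder assumptions).

ANNUAL_ENERGY_COST = 2_000_000   # $/year electricity spend for the facility (assumed)
ENERGY_REDUCTION = 0.50          # "up to 50%" energy reduction cited above; treat as a best case
UPFRONT_COST = 1_500_000         # $ for tanks, coolant, and plumbing (assumed)

annual_savings = ANNUAL_ENERGY_COST * ENERGY_REDUCTION
payback_years = UPFRONT_COST / annual_savings

print(f"Estimated savings: ${annual_savings:,.0f}/year; simple payback in ~{payback_years:.1f} years")
# Actual savings depend on workload, climate, and utility rates; tax incentives and
# reduced maintenance costs would shorten the payback further.
```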

Scalability and Flexibility

Growing businesses can easily expand server capacity with immersion cooling. Traditional air-based cooling systems might struggle to keep up with increased heat loads, but immersion cooling is quite flexible. Just add the new servers into the cooling liquid, and you’re good to go.

Colocation data centers can also mix and match different hardware configurations without worrying about cooling constraints. With immersion cooling, you have the freedom to customize your setup however you like. You can accommodate everything from high-density compute nodes to storage servers with ease.

What’s more, immersion cooling doesn’t rely on bulky air conditioning units or complex airflow ducts. So, you’re free to rearrange your racks and optimize your space as you see fit.

Reliability and Performance

When your clients count on you to keep their mission-critical applications up and running 24/7, there’s no room for mistakes or mishaps. Whether they’re running eCommerce websites, streaming services, or financial transactions, any downtime can be costly. Today, users expect lightning-fast speeds and seamless experiences. Anything less just won’t cut it.

Colocation centers need a rock-solid infrastructure that can guarantee reliable performance. That’s another benefit of immersion cooling.

It keeps operating temperatures optimal and ensures your hardware is running at peak performance levels. Traditional air-based cooling methods can struggle to keep up with fluctuating heat loads. But immersion cooling provides consistent cooling across all your servers, even during peak demand.

It also extends the lifespan of your hardware by minimizing thermal stress that can cause premature hardware failures.

Space Optimization

Massive air cooling units can take up valuable floor space. Plus, air-based systems require extensive ductwork for airflow management, which can be a logistical challenge. But immersion cooling tanks reduce the occupied space by two-thirds.

GRC cooling system in a colocation data center

These tanks can be installed anywhere with a reliable power source. The coolant in the tank is non-reactive and makes direct contact with the server components. This process dissipates heat efficiently and quickly.

Not only does immersion cooling require less space, but it also allows for a denser rack structure. Air-cooled units need plenty of room for airflow within servers. This isn’t an issue with immersion tanks. You can insert multiple server components in the tank to keep them cool.

Environmental Considerations

Traditional air-based cooling methods consume a lot of electricity. Immersion cooling, on the other hand, is green and can reduce your carbon emissions. It has a closed-loop system that recirculates the cooling liquid without waste.

This isn’t just good for the planet; it’s also good for business. As more and more companies prioritize sustainability, having an environment-friendly colocation provider can be a major selling point. Additionally, with tightening environmental regulations around the world, investing in green technology like immersion cooling is a smart move for the long term.

Security and Compliance

Security and compliance are non-negotiable for data centers. Clients trust you with sensitive information, such as financial records and customer information. Any breach could spell disaster.

Immersion cooling reduces the risk of physical breaches and unauthorized access to your servers. And because the cooling liquid is non-conductive and non-flammable, you can trust your hardware is safe from accidents and mishaps.

Immersion cooling also ensures your systems are up and running 24/7 to meet the stringent uptime requirements laid out in service-level agreements (SLAs). There are also fewer moving parts, which simplifies maintenance procedures. Overall, it’s easier to stay on top of your security and compliance obligations.

Invest in Colocation Immersion Cooling

Immersion cooling offers a myriad of benefits, making it a true game-changer for colocation data centers. It’s scalable, energy efficient, and aligns perfectly with the evolving needs of customers.

With this innovative technology, you can unlock new opportunities for growth, attract environmentally conscious clients, and future-proof your operations against rising energy costs. It’s an investment your customers, your bottom line, and the planet will appreciate.

For over a decade, Green Revolution Cooling (GRC) has been setting new standards for data center cooling and efficiency worldwide. We provide patented cooling technology that’s cost-effective, future-proof, scalable, agile, resilient, and efficient.

Are you ready to scale your data center to new heights? Contact the GRC team today to learn more about how we can help you meet your cooling needs.


AI and Data Centers: What Planning the Data Center of Tomorrow Looks Like Today

Artificial intelligence (AI) and data centers are converging as one of the technology industry’s key focal points. As resource-intensive AI technologies grow, data centers must adapt and find effective ways to manage the increased demand. Of course, this creates significant challenges, particularly for cooling needs.

Data centers are an increasingly important element of IT infrastructure. With commerce migrating to online channels at ever-rising rates, data centers play a central role in economic health and growth. As such, data center operators face pressing needs for efficient and cost-effective cooling solutions to avoid costly downtime, service interruptions, and potential damage to critical equipment.

These needs will only grow as artificial intelligence technologies continue to develop and integrate into the broader global economic framework. For context, consider these facts about AI and data centers:

  • AI’s processing demands draw three times the electricity required by conventional computing.
  • Data center operators face triple the typical energy costs to meet the energy requirements of AI deployments.
  • The rack density requirements of AI deployments can be 15 times greater than what standard cooling systems are designed to handle.

Given the challenges created by these factors, data center operators need effective and practical solutions. This is precisely what immersion cooling offers.

AI and Data Centers: Why Immersion Cooling Matters

Immersion technologies cool computing equipment far more efficiently than conventional methods like forced-air and water-based cooling. They also demand far less physical space and consume far fewer resources. As a result, data centers can cool more equipment at a lower cost, so they can meet the AI-related needs of today as well as tomorrow.

AI and data centers

At the same time, AI and data centers require careful integrational design, planning, and future-proofing. Data center operators must also consider cost effectiveness and the potential for a positive return on investment (ROI) as they plan, build, and choose technological systems for the data centers of the future.

Let’s look at the central considerations involved in data center operations in the age of artificial intelligence.

Current Landscape of AI and Data Centers

Artificial intelligence has been on the tech industry’s radar for decades. Until recently, it was considered a developing technology that would mature at some point in the future. That changed quickly and in dramatic fashion in late 2022, with the arrival of generative AI tools like ChatGPT and Stable Diffusion.

The incredible capabilities of emerging generative AI systems sparked a flurry of interest in artificial intelligence and machine learning (ML). Businesses across industries quickly developed plans to integrate AI into their operations. Investors flocked to both established and emerging players in the AI space, making them flush with investment capital. As a result of these influences, AI has exploded into the cultural, technological, and economic mainstream.

Emerging Use Cases for AI Technologies

AI is no longer relegated to the realm of developmental speculation. It’s now a fully functional aspect of contemporary computing, with a novel and fast-evolving set of exciting use cases. These broadly include:

Generative AI

Generative AI turns user-submitted prompts into original output. It creates text, images, music and sounds, video, and other forms of media. Commercial applications extend to many areas, such as:

  • Visual design
  • Content creation
  • Augmented reality (AR)
  • Virtual reality (VR)
  • Digital gaming

While generative AI still has its limitations, the technology can create surprisingly high-quality output. This is especially apparent in a class of technologies known as generative adversarial networks (GANs), which pit a generator model against a discriminator that critiques its output, allowing the system to refine and self-correct the content it generates.

Edge Computing

Edge computing is an IT architectural model in which a client or user’s data gets processed around peripheral points, known as “edges,” in the wider computing network. It’s emerging as an increasingly attractive option with respect to data centers, especially when paired with artificial intelligence technologies.

AI-powered processing algorithms make smart, efficient decisions about their use of edge computing resources. Among other areas, the convergence of AI and edge computing stands to impact the Internet of Things (IoT), mobile computing, and autonomous vehicle technologies.

Automation and Robotics

Businesses have moved quickly to integrate AI into their service channels. For instance, AI-powered chatbots offer an effective, cost-controlled way to answer customers’ questions. They can guide shoppers to appropriate products and services and help resolve issues through basic troubleshooting.

AI-powered automation and robotics tools are also reshaping the global manufacturing industry. Robots with integrated AI capabilities can quickly adapt and respond to new environments and working conditions. This makes them far more versatile and capable than legacy industrial robotics technologies.

Personalized Medicine

AI and ML models have the unique ability to analyze and draw insights from enormous quantities of data. Both can carry out these functions quickly and accurately, which opens a world of new possibilities in personalized medical treatments.

AI and data centers

Advanced computing systems can help doctors and healthcare providers diagnose conditions and diseases, select medications and treatments, monitor patient progress, and model health outcomes. Because AI-powered healthcare tools can also analyze patient data at massive scale, they stand to have a major impact in the public health arena.

Cybersecurity and Information Security

Cybercrime has a stunning economic impact, with one estimate showing that it cost the global economy $8 trillion in 2023 alone. That massive number is poised to continue rising. So, cybersecurity providers need powerful and effective new solutions to combat soaring crime rates.

Algorithms powered by AI technologies can detect suspicious activities and signals of an impending cyberattack with unmatched precision and speed. They can also mount effective responses to active threats and adapt to shifts and changes in cybercriminal activities while an attack is underway. When deployed alongside capable human cybersecurity professionals, AI creates a daunting buffer that can prevent damaging attacks outright.

At the same time, cybercriminals are expected to use AI technologies to make their scams and attacks more sophisticated. As such, AI-powered cybersecurity tools could be in a unique position to help human personnel identify and address these dangers and threats.

AI and Data Centers: Key Impacts

From an operational perspective, one of the main impacts of AI and data centers relates to computational requirements. AI deployments are extremely power-intensive. According to a recent study, the current wave of generative AI technologies uses 10 to 15 times as much energy as standard central processing units (CPUs).

Data processing requirements vary depending on the nature of the application, but they well exceed those associated with standard computing. In fact, one report found that AI’s data center density requirements were six to 12 times greater than established averages.

To meet the energy requirements and data processing demands of AI technologies, data centers will require both advanced and powerful computer hardware and energy-efficient management systems.

Additional data center impacts of AI relate to:

  • Decentralization: AI deployments demand low-latency data processing in real time. This has vaulted edge computing into the spotlight. It’s also activated nearby server networks and IoT-connected devices to manage data processing needs. These decentralized processing models are likely to become the norm, forcing IT infrastructure providers to reimagine their architecture.
  • Cybersecurity: IT infrastructure operators and data centers will both require AI-driven cybersecurity plans. AI holds the potential to power novel security capabilities, but bad actors can also exploit the tech for their own means. Industry observers believe AI will mark a major new cybersecurity battlefield in the years to come.
  • Network Automation and Optimization: AI’s forthcoming integration into network monitoring and management will automate many tasks and help optimize IT resources. But it will also contribute to data centers’ soaring energy needs and processing requirements.

As data centers plan for an AI-enriched computing future, immersion cooling has emerged as a powerful solution to several of the challenges operators currently face.

Immersion Cooling and AI

With respect to AI and data centers, immersion cooling offers three important and direct benefits. First, it helps power dramatic increases in rack density. So, data centers can pack far more processing power into the available space.

data centers

Second, immersion cooling offers performance advantages to the powerful, high-efficiency hardware and computing components required for AI applications. Third, immersion systems use far less energy than legacy forced-air and water-cooling technologies. These cooling methods are also impractical for AI because of their resource needs and spatial demands.

AI and Data Centers: How Immersion Cooling Supports Increased Rack Density

Liquid-based immersion systems deliver cooling directly to server racks, allowing data center operators to make much better use of available space. By comparison, forced-air cooling systems and other legacy cooling technologies are space-intensive. This negatively impacts rack density because it limits the amount of room available for hosting servers.

Green Revolution Cooling (GRC) made headlines in 2021 when we deployed our proprietary immersion cooling technologies to achieve extreme rack density. At the time, high-performance data centers posted densities of approximately 30 kilowatts per rack. GRC demonstrated modules capable of generating densities of 200 to 368 kilowatts per rack. That’s 12 times better than other high-performance technologies at the time.

Immersion Cooling Supports Higher Performance

Immersion cooling also offers multiple hardware performance benefits. Because it dissipates heat more efficiently than traditional air cooling, immersion cooling enables hardware to operate at lower temperatures. It also eliminates hot spots. Both of these features facilitate faster, more responsive, and more reliable computing functions.

The uniform nature of the cooling delivered by immersion systems has similar effects. Temperatures remain consistent throughout the hardware. This boosts overall performance while reducing the risk of a performance impairment known as thermal throttling.

Thermal throttling occurs when a CPU approaches its temperature limit: the unit’s internal clocking mechanism slows down to prevent further heating and reduce the risk of overheating, which in turn forces the CPU to run at lower speeds. Immersion cooling reduces the likelihood of thermal throttling, so components can operate at peak speeds.
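
As an illustration of what throttle monitoring can look like in practice, the sketch below polls the standard Linux thermal zone interface and flags sensors approaching a limit. The warning threshold and polling interval are arbitrary assumptions; a production deployment would rely on your hardware vendor’s management tooling instead.

```python
# Minimal sketch: watch Linux thermal zones and warn when a sensor nears a throttle threshold.
import glob
import time

THROTTLE_WARN_C = 85.0   # assumed warning threshold, in degrees C
POLL_SECONDS = 10        # assumed polling interval

def read_zone_temps():
    """Return {zone_path: temperature_in_C} for all thermal zones the kernel exposes."""
    temps = {}
    for path in glob.glob("/sys/class/thermal/thermal_zone*/temp"):
        try:
            with open(path) as f:
                temps[path] = int(f.read().strip()) / 1000.0  # values are reported in millidegrees
        except (OSError, ValueError):
            pass  # sensor unreadable; skip it
    return temps

while True:
    for zone, temp_c in read_zone_temps().items():
        if temp_c >= THROTTLE_WARN_C:
            print(f"WARNING: {zone} at {temp_c:.1f} C -- thermal throttling likely")
    time.sleep(POLL_SECONDS)
```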

Energy Savings and Immersion Cooling: Making AI Data Centers More Sustainable

AI technologies consume enormous quantities of energy. As they scale up and become more deeply integrated into the computing mainstream, data center energy requirements will rise in tandem. This creates both cost challenges and sustainability impacts.

Of the available data center cooling systems, immersion technologies hold the greatest potential to save the most energy. In fact, immersion cooling reduces energy consumption in multiple ways:

  • Immersion-cooled data centers don’t require the large, energy-intensive air conditioning and fan systems used in air cooling.
  • The liquid coolants used in immersion systems have much higher heat capacities than air. This enables them to absorb and remove heat far more efficiently and with less energy.
  • Immersion cooling minimizes energy waste by generating a lower overall power usage effectiveness (PUE) ratio.
  • Complex thermal management systems aren’t as urgently required in data centers that use immersion cooling, thanks to their ability to achieve stable and uniform operating temperatures.

AI data centers

Finally, immersion cooling facilitates higher ambient temperature operations without putting the safety or performance of cooled components at risk. This allows data centers to reduce their overall cooling-related energy expenditures, saving both money and resources.

Technological Requirements for AI Data Centers

Immersion cooling is one of two main types of liquid cooling relevant to AI and data centers. Direct-to-chip (DTC) cooling is the other. Data center operators should understand the technical differences between the two.

Types of Immersion Cooling

Immersion cooling uses two main models: single-phase (or one-phase) and dual-phase (or two-phase) cooling. Single-phase cooling is more common. It uses specially engineered coolant fluids, which remain in their liquid state throughout the cooling process. This fluid absorbs the heat generated by computing components and circulates it to an external system. The heat is then dissipated, eliminated, or sequestered for reuse.

Several distinct advantages have led to single-phase immersion cooling’s dominance. Single-phase systems have simpler designs, which enhances their reliability. They are also energy-efficient, easier to implement, and compatible with a wide range of IT hardware.

Direct-to-Chip Cooling

Also known as liquid cooling or water cooling, DTC cooling delivers specialized coolant fluids directly to a hardware unit’s heat-generating chip components. The fluid absorbs heat and transfers it away from the computing components to a cooling block or heat sink.

DTC cooling has two considerable drawbacks. First, it carries a risk of leaking coolant directly into sensitive electronic components, which can damage or destroy hardware. Second, it has only limited heat dissipation coverage because of its sole focus on cooling a narrow and specific set of components. So, it has significant limitations in high-density data centers, which require the uniform cooling of multiple components to operate at peak efficiency.

Beyond choosing a cooling technology, AI and data centers have additional technical requirements. Major examples include data processing and storage needs, plus the logistical challenges associated with meeting user demands for AI technologies.

AI and Demand for Data Storage and Processing

One of the most disruptive aspects of AI and data centers relates to demand volatility. The global data center industry has already had significant problems creating accurate capacity projections and planning models over the past decade.

Industry analysts expect AI to intensify this volatility to a significant degree. After all, emerging applications have already shown an ability to draw in large user volumes in very short periods of time.

AI technologies

Predictive analytics, which are powered by AI technologies, can help data center operators plan to meet storage and processing requirements. While precise capacity requirements remain unclear as AI technologies continue to emerge, data center designers should assume they will be significantly higher than present levels. Data centers currently considered to have extreme density profiles could become the norm in relatively short order.

Logistical Requirements for AI and Data Center Integrations

For corporate enterprises, legacy approaches to data center planning typically involve building and maintaining large-scale data centers exclusively for their own needs. This model remains common in many industries, especially those subject to elevated compliance and data protection requirements. Examples of such industries include insurance, financial services, and healthcare.

Yet major businesses in these and many other industries are increasingly adopting software-as-a-service (SaaS) models. SaaS has become a dominant aspect of cloud computing, especially since the COVID-19 pandemic.

During the pandemic, a general shift occurred in which large businesses began migrating from private data centers to cloud and multi-cloud models. This approach, known as colocation, holds a strong appeal for businesses that make extensive use of AI and data centers. Specifically, colocation offers:

  • Comprehensive network connectivity
  • Low-latency and ready access to advanced, high-performance computing networks
  • Reduced data transfer times
  • Excellent scalability

As AI becomes more deeply entrenched in everyday computing, it will also force data centers to adopt more advanced and specialized forms of hardware. These computing systems are larger and more powerful than their conventional counterparts, and they also generate far more heat. As such, immersion cooling is rapidly emerging as an essential part of future-proofed, AI-compatible data centers.

Planning and Design Considerations

Immersion cooling offers unique advantages in relation to AI and data centers. For one thing, it liberates designers from the need to accommodate the space-intensive infrastructure demanded by forced-air cooling and other legacy solutions. As a result, it allows for significantly more flexibility in deploying AI solutions.

Data centers have traditionally favored the use of forced-air cooling systems, so those will serve as the main point of comparison. Planning and designing a data center that uses forced-air cooling solutions limits the range of available facility options. That is, you need buildings that have the space and ventilation infrastructure to accommodate massive fan networks and air conditioners. In the absence of such an option, the data center facility would require costly and time-consuming retrofitting.

cooling technologies

In contrast, integrating immersion cooling into data centers at the design level opens up many alternative possibilities. A facility only requires a small handful of physical features to accommodate an immersion-cooled data center. This includes a water loop system, along with access to electricity and a computer networking infrastructure. In this manner, immersion cooling supports what’s known in the data center industry as deployment location flexibility.

Immersion Cooling, AI and Data Centers: Physical Layout and Space Allocation Considerations

Integrating immersion cooling systems into data centers at the design stage requires a careful analysis of multiple factors related to immersion tanks and their supporting infrastructure. These primarily extend to:

Dimensional Considerations

Designers must generate accurate estimates of the hardware sizes and quantities that will be housed in the data center. This, in turn, allows them to project the dimensions of the immersion tanks that will be required to cool them.

Tank Placement

In planning the placement of immersion tanks, designers must consider factors such as:

  • Floor space requirements and availability
  • Maintenance access
  • Proximity to power supplies

Cooling Fluid Circulation Paths

Liquid coolant circulation paths also demand careful planning to ensure heat is transferred efficiently from the computing components to the cooling fluid. The paths require optimization to deliver uniform levels of cooling to all components. Additional considerations include flow direction and strategies for managing potential hot spots.

Safety Clearances

Regulatory requirements stipulate that immersion tank placements account for safety clearances. Tanks must be positioned to allow emergency crews to access the facility and to minimize or eliminate the risk of accidental contact by site personnel.

Electricity Infrastructure

Planning must account for the placement of the power sources that will serve both the immersion tanks and the computing units. These power sources must occupy locations that are safe and accessible.

planning data centers

Sensors and Monitoring Systems

Immersion cooling systems require the precise, round-the-clock monitoring of tank conditions, coolant temperatures, and overall performance. So, designers need to consider the placement of sensors and monitoring equipment during the initial stages of facility planning.

Expansion and Scalability

Experts widely project AI and data centers to grow at dramatic and interlinked rates in the years ahead. If space permits, facility designers should also build scalability and the possibility of future expansion into their site planning.

Cost Effectiveness and ROI

Cost effectiveness and ROI drive many decisions that impact AI and data centers. First and foremost, businesses must consider the fact that AI deployments are very expensive. They need to maximize their cost effectiveness across every other aspect of the related operations.

Businesses using AI technologies to drive their revenues must address these considerations with added attention and urgency. In these cases, individual components in the AI deployment must be running at peak efficiency on a 24/7/365 basis. After all, they’ll play an ongoing and critically important role in generating income for the enterprise.

To those ends, it’s important for businesses to consider the many ways in which immersion cooling technologies support cost savings with respect to AI and data centers. These include:

Energy Efficiency and Reduced Electricity Costs

Data centers that incorporate immersion cooling as their primary heat management strategy use far less energy than their legacy forced-air counterparts. In addition to dramatically reduced electricity requirements, immersion cooling also uses energy more efficiently.

Consider the following statistics:

  • Cooling accounts for approximately 40% of the typical data center’s total energy consumption.
  • Air cooling only captures about 30% of the heat emitted by servers.
  • Immersion cooling functionally captures 100% of that heat.

What’s more, immersion cooling enables data center operators to remove the internal server fan networks used in forced-air systems. This alone accounts for an energy load reduction of 10–15%.

AI energy

A 2023 study published in Energy Informatics reported that immersion cooling holds the potential to reduce overall data center electricity consumption by up to 50%. That translates into enormous operational cost savings over the lifespan of a typical AI deployment.
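
Pulling those figures together, here is a back-of-the-envelope sketch of how they add up to a reduction in the range the study reports. The fraction of dedicated cooling load that immersion eliminates is an assumption for illustration; the other inputs come from the statistics above.

```python
# Back-of-the-envelope: combine the figures above into an overall energy reduction.
TOTAL_ENERGY = 100.0      # normalized units for a typical air-cooled facility
COOLING_SHARE = 0.40      # cooling ~40% of total consumption (cited above)
FAN_SHARE_OF_IT = 0.12    # server fans ~10-15% of the IT load (cited above); midpoint used
COOLING_CUT = 0.90        # assumed: immersion removes most of the dedicated cooling load

it_energy = TOTAL_ENERGY * (1 - COOLING_SHARE)      # 60 units of IT load
cooling_energy = TOTAL_ENERGY * COOLING_SHARE       # 40 units of cooling load

new_it = it_energy * (1 - FAN_SHARE_OF_IT)          # server fans removed
new_cooling = cooling_energy * (1 - COOLING_CUT)    # residual pumps and heat rejection

reduction = 1 - (new_it + new_cooling) / TOTAL_ENERGY
print(f"Estimated overall reduction: ~{reduction:.0%}")  # lands in the 40-50% range
```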

Superior Heat Dissipation Capacity

Compared to forced-air cooling systems, immersion technologies have much higher heat dissipation capacities. The superior efficiency can reduce or even eliminate the need for adjacent cooling solutions, which further reduces costs.

Space Optimization

Data center operators stand to generate additional savings through space optimization. Immersion cooling supports more compact server designs and layout placements. Data centers can then make smarter and more efficient use of limited floor space.

As the data center’s physical footprint diminishes, so do its operating costs.

Higher Density

Density marks one of the most pressing considerations impacting AI and data centers. The current generation of AI technologies already demands far more density than conventional computing. These density needs will only intensify as artificial intelligence advances and becomes more complex.

Immersion cooling lets data centers pack more computing power into less space. The per-unit cost of computational power falls as a result, giving immersion cooling a far superior cost profile relative to available alternatives.

Increased Lifespan of Computing Hardware

Immersion-based heat management systems allow servers and other forms of computer hardware to operate at lower temperatures for longer periods of time. This reduces the overall stress levels, extending their lifespans and helping operators delay or avoid costly replacements.

What About Capital Investment?

The powerful cost-saving benefits offered by immersion cooling come with caveats. One of the most significant examples relates to up-front capital investment requirements. While they’re much cheaper to operate, immersion cooling systems are relatively expensive to build and implement compared to forced-air strategies. However, those higher up-front investments deliver powerful returns that extend beyond the aforementioned long-term savings on operating expenses and equipment costs.

As AI and data centers proliferate, data center industry experts predict that forced-air cooling systems will become increasingly rare before growing obsolete. So, businesses and data center operators should closely consider whether it makes sense to commit significant amounts of investment capital to cooling technologies that are likely to suffer from declining levels of utility in the near-term future.

An investment in immersion cooling is also an investment in future-proofing. In many cases, it makes more sense for data centers and businesses to invest in immersion technologies now. Doing so may well help them avoid expensive retrofitting projects or facility upgrades in the future.

AI data centers

Future Trends and Innovation

Planning for the AI and data centers of the future demands a careful evaluation of emerging and evolving trends and technologies. To that end, businesses with AI-adjacent operations and data center operators need to consider impending advancements in artificial intelligence. Then, there are also upcoming immersion cooling innovations specifically designed to meet the evolving needs of AI workloads.

AI Technology: Upcoming Advancements and Their Data Center Impacts

Generative AI tools like ChatGPT and Stable Diffusion are only the tip of the iceberg when it comes to where artificial intelligence is heading. In January 2024, TechTarget published a review of how AI technologies are likely to develop during the remainder of the 2020s. The article cited strong, direct impacts on five key industries:

  • Education: Experts predict educational content will become heavily personalized and integrated into custom learning plans for individual students.
  • Finance: AI is already having a major impact on investing. Specifically, it powers automated trading algorithms and helps traders and investors select investments, manage risks, and execute strategies. As it expands, AI will likely also redefine financial planning, insurance, fraud detection, and consumer credit monitoring.
  • Healthcare: Doctors and nurses are expected to integrate AI into diagnostics, treatment planning, and patient monitoring at increasingly robust rates. Predictive AI technologies may also be used to anticipate potential health problems in individual patients. They’ll likely play an increasingly high-profile role in protecting patient data and privacy as well.
  • Law: AI technologies could displace lawyers from the labor force in huge numbers as attorneys make increasing use of artificial intelligence to conduct research, draft contracts, and plan litigation and legal arguments.
  • Transportation: AI-powered smart grids have long been poised to transform urban transportation systems. Those technologies are finally on the cusp of entering the mainstream. Artificial intelligence is also expected to power advancements in autonomous vehicle technologies, which to this point have remained mired in the middle stages of their development.

These are but a few of many illustrative examples of how AI could become increasingly integrated into everyday life at accelerating rates. This, in turn, creates multiple additional considerations for data center designers and operators.

Density Demands Will Continue Rising

In November 2023, Silicon Valley Power projected that data center loads would nearly double from current levels by 2035. Achieving much higher compute densities appears to be the only way data centers will be able to keep up with future user demands.

Immersion systems optimize space and offer scalability without the need for additional temperature control infrastructure. As such, they pose the clearest path to higher data center densities of any cooling technology currently on the market.

Retrofitting Existing Data Centers

Major tech industry players like Meta and Microsoft are already reimagining their data center operations with AI in mind. As the challenges associated with AI and data centers continue to intensify, many tech insiders believe a wave of facility retrofitting projects will sweep across the industry.

Retrofitting data centers to accommodate immersion cooling is a complex and expensive process. However, the associated capital investments could become unavoidable. Legacy cooling technologies will likely diminish to the point of impracticality as AI and data center workloads soar in the coming years.

Immersion Cooling Will Go Mainstream

Given the vast scope and fast pace of change that AI will force, immersion cooling is likely to emerge as a standard feature of next-generation data centers. Of course, the technology is in its relative infancy and is currently considered somewhat exotic. But impending innovations are poised to guide its continued growth as AI workloads continue to intensify in data centers globally.

Immersion Cooling Innovations for AI Workloads

Tech-oriented businesses continue to actively seek AI angles in an ongoing bid to capitalize on the technology’s promise. Such efforts mark a significant driver of the continued growth and evolution of AI workloads.

data center technologies

Immersion cooling technologies are also advancing to meet the data center industry’s changing needs. Some particularly notable emerging developments and innovations include:

Expansion Into Multi-Tenant Colocation Centers

Many data center industry experts believe that the relationship between edge computing, AI, and data centers will spark significant growth in immersion cooling’s adoption rates. While this trend isn’t directly related to immersion cooling technology, it does offer meaningful insight into a potentially major driver of its growth.

Edge computing is mainly used in multi-tenant colocation centers, which are typically situated in or on the periphery of the cities where tenants are based. It facilitates the processing of client data closer to its origin point. This reduces latency, improves bandwidth efficiency, and boosts the performance of applications that require fast data processing speeds. Edge computing also offers privacy, security, reliability, and resilience benefits.

Conventional forced-air cooling systems aren’t feasible for edge deployments. Edge computing facilities are typically located in urban areas, where space is at a premium. It’s difficult to accommodate the large and resource-intensive infrastructure that air-based cooling strategies require.

What’s more, dust and other particles tend to be a bigger problem in data centers located in urbanized areas. Air cooling circulates these contaminants around facilities, creating performance risks and the potential for equipment damage. Large-scale air filters offer a solution, but they’re very resource-intensive to clean and maintain.

For these reasons, industry insiders widely consider immersion cooling to be a much better option for edge deployments. With AI and data centers poised to drive huge growth in edge-enabled colocation facilities, immersion technologies could see huge growth as a result.

Forced Convection Heat Sinks

Industry innovators recently paired forced convection heat sinks with single-phase immersion cooling. In doing so, they achieved significant performance improvements. This also foretells another potential direction that liquid cooling’s ongoing advancement might take.

Legacy forced-convection heat sinks use air, which is circulated over the heat sink’s surface to accelerate the rate of heat transfer. But engineers noted that the specialized fluids used in immersion cooling can absorb up to 1,000 times as much heat as air. So, they adapted the standard forced-convection heat sink model to integrate liquid coolants.
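
To put that figure in context, the comparison below multiplies representative room-temperature property values for air and a mineral-oil-class coolant (both assumed; specific engineered fluids vary) to compare how much heat a given volume of each can absorb per degree of temperature rise. The result lands on the order of a thousand, consistent with the claim above.

```python
# Volumetric heat capacity (rho * cp): heat absorbed per unit volume per degree of temperature rise.
# Property values are representative room-temperature assumptions, not data for a specific product.

fluids = {
    # name: (density in kg/m^3, specific heat in J/(kg*K))
    "air":                (1.2, 1005.0),
    "dielectric coolant": (850.0, 1900.0),  # mineral-oil-class fluid (assumed)
}

vol_heat_capacity = {name: rho * cp for name, (rho, cp) in fluids.items()}  # J/(m^3*K)
ratio = vol_heat_capacity["dielectric coolant"] / vol_heat_capacity["air"]

for name, c in vol_heat_capacity.items():
    print(f"{name}: {c / 1000:.0f} kJ per m^3 per K")
print(f"Per unit volume, the coolant holds roughly {ratio:,.0f}x as much heat as air.")
```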

The promising results broke established performance barriers. It’s a clear sign of how system design, engineering innovation, and emerging coolant technologies can come together to take immersion cooling to new heights of utility and feasibility.

AI and Data Centers Are Evolving. Are Your Cooling Solutions?

AI deployments are poised to put immense pressure on data center infrastructure. Major improvements in rack density represent one of the only feasible solutions, but conventional cooling methods simply can’t achieve it.

Data centers need to evolve, and innovation is a pressing necessity. Immersion technology offers an ideal solution to the cooling needs of the data centers of tomorrow. It directly supports extreme data center density by optimizing space. At the same time, it supports faster, more efficient cooling and the advanced hardware performance that artificial intelligence demands.

GRC has gained widespread industry recognition as a leading force in immersion cooling technology. Our immersion systems can help dramatically improve your data center’s performance and data processing capabilities. To discuss your cooling needs in detail, contact us today.


Forecasting Data Center Immersion Cooling Technology for the Year Ahead

Artificial intelligence (AI), Internet of Things (IoT) technology, and other advancements are fueling a need for more data centers. But this need brings challenges. For one, the new data centers we add to power our digital world will also give off excessive amounts of heat. Plus, the technology needed to cool the centers can eat up a lot of resources. This is where solutions like immersion cooling are coming to the rescue.

Immersion cooling submerges computer hardware, such as server units, in a specially engineered, non-conductive dielectric liquid. The liquid absorbs excess heat to stabilize the hardware within a safe temperature range. Then, the system eliminates the heat by routing it into a heat exchanger.
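
As a rough illustration of that heat-removal loop, the sketch below estimates how much coolant must circulate to carry away a given IT load at steady state. The load, fluid properties, and allowed temperature rise are assumed example values, not figures for any specific system.

```python
# Minimal energy-balance sketch for a single-phase immersion tank: how much
# coolant must circulate to carry away a given IT load at steady state.
# Q = m_dot * c_p * dT  ->  m_dot = Q / (c_p * dT)
# All numbers below are assumed example values.

it_load_w = 100_000    # 100 kW of server heat in the tank (assumed)
cp_j_kgk = 2100        # specific heat of a generic dielectric coolant (assumed)
delta_t_k = 10         # allowed coolant temperature rise across the tank (assumed)

mass_flow_kg_s = it_load_w / (cp_j_kgk * delta_t_k)
print(f"Required coolant flow: {mass_flow_kg_s:.1f} kg/s")   # roughly 4.8 kg/s
```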

Data center operators continue to adopt immersion technologies at robust rates, thanks to their many benefits:

  • Superior Cooling Efficiency: Dielectric liquids conduct heat more efficiently than air. This enables them to remove more heat from server components and do so with greater speed and efficiency.
  • Energy Savings: Immersion systems use significantly less electricity and water than their competing counterparts. These resource savings are critically important to data center operators who must balance cooling needs with sustainable energy consumption.
  • Space Optimization: In most data centers, space is at a premium. Immersion cooling systems take up relatively little room and don’t need elaborate ductwork or extensive site retrofitting. They help reduce and control the data center’s spatial footprint while making the most of the available facilities.

Data center operators are increasingly attuned to these advantages, and immersion systems are becoming more common. Even better, the year ahead seems poised to deliver some exciting advancements.

The Current State of Immersion Cooling

Current immersion cooling technologies share several defining features. First, they use specialized dielectric coolant fluids. These fluids don’t conduct electricity and won’t damage sensitive computer components.

concept of a data center that might use immersion cooling

Hardware is placed in specially designed fluid-filled tanks or other engineered enclosures and then fully immersed in these fluids. Monitoring tools and control systems track temperatures and internal conditions. They then generate reports and alerts that technicians can use to maintain system safety.

These foundational elements form the basis of current approaches to immersion cooling. Of course, there are some variations. Single-phase immersion cooling is a widely used example.

In single-phase cooling systems, dielectric fluids absorb heat from immersed computing components. To prevent these liquid coolants from reaching their boiling point and changing phases into gas, single-phase systems deploy a heat exchanger mounted in a cooling distribution unit. This exchanger works to keep the fluid cool enough for it to retain its liquid form, which simplifies the system and improves its operational efficiency.
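
One way to picture how the cooling distribution unit keeps the fluid in its liquid phase is a simple feedback loop: measure the tank temperature and adjust circulation to hold a setpoint far below the fluid’s boiling point. The sketch below is a minimal, hypothetical proportional controller; real CDUs use more sophisticated control logic, and the setpoints, gain, and temperatures shown are illustrative assumptions only.

```python
# Minimal sketch of the control idea described above: a cooling distribution
# unit (CDU) adjusts pump speed so the tank fluid stays well below its boiling
# point and never changes phase. All setpoints and gains are illustrative.

FLUID_BOILING_C = 150.0   # assumed boiling point of the dielectric fluid
SETPOINT_C = 45.0         # target tank temperature, far below boiling (assumed)
GAIN = 5.0                # proportional gain: percent pump speed per degree C

def pump_speed_percent(tank_temp_c: float) -> float:
    """Simple proportional controller for the CDU pump."""
    error = tank_temp_c - SETPOINT_C
    speed = 30.0 + GAIN * error          # 30% baseline circulation
    return max(0.0, min(100.0, speed))   # clamp to the pump's operating range

for temp in (40.0, 45.0, 50.0, 60.0):
    print(f"Tank at {temp:.0f} C -> pump at {pump_speed_percent(temp):.0f}%")
```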

Immersion Cooling Fluids

Another variation extends to the different fluids used in immersion systems. These fluids come in two main classes: engineered fluids and oils.

Engineered fluids primarily use fluorocarbons, hydrofluoroethers, or perfluoropolyethers. In general, these fluids offer high levels of stability. They’re also compatible with a broad variety of systems and can even be customized.

Oil-based cooling fluids include synthetic, mineral, and biological products. Like engineered fluids, oil-based fluids have high heat capacities and help distribute heat evenly to prevent hot spots. However, they tend to function best within narrower temperature ranges and often show performance limitations at temperature extremes. Many are also derived from fossil fuel sources, which introduces environmental and sustainability concerns.

cooling towers with pipes of liquid
Photographer: John_T

Important Considerations for Current Immersion Cooling Technologies

While immersion cooling offers several major advantages over legacy technologies, data center operators must also consider multiple factors. For instance, immersion cooling systems require significant up-front investments. But over the long term, data center operators can recoup those costs (and then some) in the form of efficiency improvements and electricity savings. Still, you’ll need to account for the initial costs when you budget.
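
As a rough way to budget for that trade-off, operators can estimate a simple payback period from the up-front cost and the expected annual energy savings. The figures in the sketch below are placeholders, not vendor pricing; substitute your own quotes and utility rates.

```python
# Back-of-the-envelope payback estimate for an immersion cooling retrofit.
# Every figure below is an assumed placeholder, not a real quote.

upfront_cost_usd = 400_000             # assumed immersion system + retrofit cost
annual_cooling_kwh_saved = 1_500_000   # assumed annual energy savings vs. air cooling
electricity_rate_usd_kwh = 0.10        # assumed utility rate

annual_savings_usd = annual_cooling_kwh_saved * electricity_rate_usd_kwh
payback_years = upfront_cost_usd / annual_savings_usd

print(f"Annual savings: ${annual_savings_usd:,.0f}")
print(f"Simple payback: {payback_years:.1f} years")
```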

Additional factors to plan ahead for include:

  • Maintenance: The coolant fluids used in immersion systems require continuous monitoring and regular replacement. You’ll need to train technicians who understand how to test and change coolant fluids, or hire technicians who already do.
  • Availability: Immersion cooling is still a growing field. Depending on your location, it might take some work to find a local provider with the technical knowledge and expertise to manage your data center’s conversion.
  • Site Logistics: Immersion cooling systems need careful maintenance to guard against leaks. In properly managed systems, leak risks are extremely low. However, you’ll need to commit additional resources to system monitoring to ensure no adverse events occur.

Technological Advancements on the Horizon in 2024

As immersion cooling technologies continue to develop, engineers are making steady progress toward high-impact improvements and refinements. Examples include hardware innovations, directed flow technologies, novel cooling fluids, and new approaches to system design.

Artificial Intelligence

AI technologies are on the cusp of dramatically transforming computing and its capabilities. But AI has intensive hardware and data requirements. This creates some important functional limitations because it consumes data center resources in large quantities. Yet AI could soon prove to be a vital part of the solution to its own problem.

Data centers already use AI and machine learning (ML) technologies for predictive maintenance, regulatory compliance, problem detection, and dynamic forms of system monitoring. By supplementing the work of human technicians, these tools expand a facility’s management capabilities: they can better manage resources and help data centers maintain an optimal balance of safety and performance.
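
A minimal example of this kind of dynamic monitoring is flagging sensor readings that deviate sharply from recent behavior. The sketch below uses a simple rolling z-score on temperature readings; production ML systems are far more sophisticated, and the window, threshold, and readings shown are assumptions for illustration only.

```python
# Minimal sketch of the monitoring idea described above: flag coolant or
# component temperature readings that deviate sharply from recent behavior.
# A production system would use richer models; this is a rolling z-score.

from collections import deque
from statistics import mean, stdev

def make_detector(window: int = 30, threshold: float = 3.0):
    history = deque(maxlen=window)

    def check(reading_c: float) -> bool:
        """Return True if the new reading looks anomalous versus recent history."""
        anomalous = False
        if len(history) >= 10 and stdev(history) > 0:
            z = (reading_c - mean(history)) / stdev(history)
            anomalous = abs(z) > threshold
        history.append(reading_c)
        return anomalous

    return check

check = make_detector()
readings = [44.8, 45.1, 45.0, 44.9, 45.2, 45.0, 44.7, 45.1, 45.0, 44.9, 52.6]
for t in readings:                       # assumed example temperature trace (deg C)
    if check(t):
        print(f"ALERT: unexpected temperature reading {t} C")
```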

illustrated concept of a human and AI working together

These features, in turn, support the scalability of the high-density data centers that AI will more often require as it rapidly grows. As data centers increasingly integrate AI, their efficiency and performance capabilities will improve. They’ll then be better able to handle the accompanying rise in data processing and storage needs.

Directed Flow Technologies

One of the clearest examples of immersion cooling’s effectiveness relates to what’s known as directed flow. Also known as enhanced flow, directed flow uses propulsion systems like turbines or jets to force cooling fluids to move faster over the surface of computing components.

As the fluid moves more quickly, it’s able to draw heat out of the immersed computing components at a faster rate. This improves both the efficiency and the effectiveness of the cooling system.
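
The underlying physics can be sketched with a standard forced-convection scaling: in turbulent flow, the heat transfer coefficient grows roughly with velocity raised to the 0.8 power (a Dittus-Boelter-style relationship). The exact behavior depends on the fluid and geometry, so the short calculation below shows only the general trend, not measured results for any directed flow system.

```python
# Illustrative sketch of why moving the fluid faster helps. For turbulent
# forced convection, the heat transfer coefficient scales roughly with
# velocity to the 0.8 power. This is only the trend, not a design value.

def relative_heat_transfer(velocity_ratio: float) -> float:
    """Heat transfer coefficient relative to baseline, given a velocity ratio."""
    return velocity_ratio ** 0.8

for ratio in (1.0, 1.5, 2.0, 3.0):
    gain = relative_heat_transfer(ratio)
    print(f"{ratio:.1f}x flow velocity -> ~{gain:.2f}x heat transfer coefficient")
```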

Immersion cooling systems already use directed flow technologies to positive effect. As the benefits of enhanced flow become more apparent, systems will likely incorporate them at the design level with greater frequency. This, in turn, will support further data center performance improvements, especially as engineers develop more precise and efficient directed flow systems.

Novel Immersion Cooling Fluids

Inspired by a desire to produce a sustainable coolant with an environmentally friendly profile, TotalEnergies Fluids has developed a signature line of BioLife Immersion Cooling Fluids. These fluids are manufactured with 100% traceable feedstocks sourced exclusively through the reuse, recycling, and regeneration economy. As a result, BioLife Immersion Cooling Fluids deliver the same elite performance as coolants made from natural or synthetic hydrocarbons but without the associated environmental impacts.

The BioLife fluid line is fully certified under the ISCC PLUS sustainability certification program. It also offers outstanding safety and stability profiles, very low viscosity, and comprehensive computer hardware compatibility.

All BioLife Immersion Cooling Fluids are approved for use in Green Revolution Cooling (GRC) systems. Visit the GRC ElectroSafe Fluid Partners page for more information.

Technology-Specific System Design

IT vendors have started to integrate immersion cooling into data centers at the design level. They’ve also engineered new ways to convert and retrofit existing data centers to more readily adopt immersion technologies. These advancements even extend to the level of computing hardware, which has historically been designed with air-based cooling in mind.

Hardware manufacturers have long followed a standard practice of grouping computing components that tend to generate the most heat in a narrow strip of internal space. This is done to facilitate the rapid movement of forced air over those specific components, which theoretically helps keep them cooler.

However, manufacturers are increasingly reconfiguring their designs to separate those high-heat components as much as possible. This is being done under the assumption that the servers will be cooled through immersion technologies. The physical separation of the system’s hottest components generates lower overall levels of ambient heat. As a result, the immersion cooling system can more readily absorb and eliminate that heat.

Immersion Cooling Industry Trends and Projections

As a whole, the industry for immersion cooling technology is expected to continue growing. A demand for more energy-efficient cooling solutions is behind the growth.

illustration showing projected growth of immersion cooling

Growth Forecasts

As immersion cooling technology continues to advance, so does its market share. Citing a projected 2024 global market value of approximately $780 million, Mordor Intelligence projects the value of the immersion cooling industry will soar to $2.34 billion by 2029. If accurate, that increase would amount to a stunning compound annual growth rate (CAGR) of 24.42% over the five-year period.
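
Those figures are easy to sanity-check with the standard CAGR formula, as in the short calculation below (the values are taken from the projection above).

```python
# Quick check of the growth figures quoted above.
# CAGR = (end_value / start_value) ** (1 / years) - 1

start_value_usd_m = 780    # projected 2024 market value, $ millions
end_value_usd_m = 2340     # projected 2029 market value, $ millions
years = 5

cagr = (end_value_usd_m / start_value_usd_m) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.2%}")   # roughly 24-25% per year, in line with the cited 24.42%
```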

The Mordor Intelligence report cites several critical factors driving the growth:

  • The COVID-19 pandemic was a major catalyst for the initial wave of immersion cooling growth during the early 2020s.
  • Thanks to advancing technologies, immersion cooling has become more scalable, less costly, and easier to maintain.
  • Data centers have increasingly sought sustainable energy solutions that will lower their carbon footprints and diminish their reliance on and consumption of resources like water and electricity.

Mordor Intelligence believes North America will emerge as both the largest and fastest-growing market for immersion cooling technology during the remainder of the 2020s. Notably, the analytics agency cites Green Revolution Cooling (GRC) as one of the burgeoning industry’s major players.

Additional Growth Drivers and Catalysts

Industry analysts expect two notable growth drivers to power the continued adoption of immersion cooling: cryptocurrency mining and sustainability.

Cryptocurrency Mining and Blockchain Technology

With cryptocurrency values remaining both elevated and volatile, the asset class has drawn sustained interest from both investors and traders. Data-intensive mining operations work around the clock to meet the strong global demand for cryptocurrency supply.

Immersion cooling offers an excellent solution to the heat and power challenges that accompany round-the-clock mining. These benefits extend more broadly to the blockchain technologies that are becoming more common as the push for internet decentralization intensifies.

Sustainability

In 2022, Techspot estimated that data centers collectively consume more than 205 terawatt hours of electricity each year. That’s more than the annual electricity consumption of countries like Denmark, Ireland, and South Africa. According to prevailing growth trends, the global technology industry could account for as much as one-fifth of worldwide energy consumption by the dawn of the 2030s.

These eye-popping numbers have triggered alarm among the fast-rising number of technology industry players concerned with the sustainability of data center operations. Since immersion cooling can remove heat from servers up to 1,200 times more efficiently than air cooling, it has drawn obvious attention as a sustainable solution.

In fact, a 2023 study published in Energy Informatics found that immersion-cooled data centers use about 50% less energy than their conventional air-cooled counterparts. This gives immersion cooling a growth-driving sustainability advantage.

Challenges and Solutions

Technology is dynamic by nature. As such, new challenges often appear almost as soon as existing ones are meaningfully addressed. This is certainly the case with data center cooling, especially in the impending age of AI.

aerial view of large data center site

AI, ML, and advanced analytics are all highly intensive applications with respect to their data processing and storage needs. They already put an increased strain on conventional server racks, forcing them to generate more heat and consume more resources and electricity than ever.

To address these processing needs, data centers are increasingly moving toward high-density models. High-density data centers pack greater concentrations of computing power into smaller and smaller spaces. They represent a confluence of careful, innovative design and hardware engineering ingenuity.

Immersion cooling is critical to data center density, as it demands far less space than other cooling options. These cooling systems enable data centers to pack more computing power into smaller areas, boosting their density capabilities. This also makes it more viable to locate data centers closer to the urbanized areas that drive data processing demand.

Liquid immersion cooling simultaneously reduces the carbon footprints of data centers. In air-cooled data centers, cooling needs account for about 40% of overall energy consumption. This total is far lower in immersion-cooled data centers. So, it’s easy to see why immersion cooling appeals to the rapidly growing number of businesses that want to improve their sustainability profiles.

The Takeaway: Immersion Cooling Has a Promising Future

Despite any perceived challenges, immersion cooling solves many of the pressing logistical, environmental, and performance-related concerns associated with conventional cooling solutions. For instance, it uses far less electricity and fewer water resources. What’s more, its drastically reduced spatial requirements make it suitable for use in the high-density data centers of tomorrow.

Immersion cooling also offers an ideal solution to data center operators who seek a reliable, secure, and effective way to optimize performance and minimize risk. Servers and computing components cooled with immersion methods tend to maintain faster and more efficient performance for longer periods of time. This reduces downtime and helps data center operators comply more readily with the terms of their service-level agreements (SLAs).

GRC is a recognized leader in the fast-growing global immersion cooling market. We provide advanced immersion cooling systems specifically designed for rack-based, modular, and blockchain applications. Contact us today to discuss your data center cooling needs in detail.


Why Density Will Become the Most Important Metric for Data Center Cooling

Critically important computing equipment runs on a 24/7 schedule in data centers. Out of necessity, these centers consume large quantities of energy and generate a great deal of heat. Data center cooling technologies and strategies can eliminate excess heat before it can damage sensitive systems and components.

High-density data centers concentrate more computing performance into smaller spaces. Although they offer significant performance and efficiency benefits, they also create unique logistical challenges with respect to heat removal. Fortunately, high-density data center operators have multiple advanced cooling technologies to choose from. Of these, single-phase immersion cooling is an extremely attractive option, thanks to its logistical ease and sustainability.

What Are High-Density Data Centers?

In data centers, the term “density” refers to the amount of electricity consumed per square foot of internal space or per server rack. As density increases, higher volumes of computing power are concentrated into smaller areas. For data center operators, high-density facilities fill an obvious logistical need: they extract more performance from limited amounts of physical space, thereby delivering superior efficiency and cost savings. In head-to-head comparisons with air cooling, immersion cooling has been reported to cut the space occupied by roughly two-thirds.
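
In practice, density is usually quoted in kilowatts per rack or watts per square foot. The short example below shows the arithmetic using assumed figures for a hypothetical facility.

```python
# Simple illustration of the density metric described above, using assumed
# example numbers for a hypothetical facility.

it_load_kw = 1_200         # total IT load (assumed)
racks = 60                 # number of server racks (assumed)
white_space_sqft = 4_000   # raised-floor area (assumed)

print(f"Rack density:  {it_load_kw / racks:.0f} kW per rack")
print(f"Floor density: {it_load_kw / white_space_sqft * 1000:.0f} W per sq ft")
```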

Data center cooling technician
Source: Shutterstock

At the setup stage, high-density data centers require specialized configurations. Operators may, therefore, need to make substantial investments to modify or convert data center facilities for high-density applications. However, once they are up and running, such data centers offer numerous benefits:

  • They are scalable and have excellent space efficiency, giving operators competitive advantages.
  • Concentrated computing power leads to a reduced infrastructure footprint.
  • Lower overhead and infrastructure costs can generate long-term cost savings.
  • Faster query responses and quicker processing of larger data volumes lead to superior performance.

At the same time, high-density data centers have more complex cooling needs. They require carefully optimized and strategically planned cooling solutions to maintain performance and safeguard sensitive equipment and hardware.

What Data Center Cooling Challenges Do High-Density Facilities Face?

All data centers demand carefully controlled cooling solutions. Without this safeguard, the heat these centers generate can cause performance efficiency losses or damage servers, their hardware, and the data they contain. Cooling needs increase proportionally with density: as power density rises, cooling capacity needs rise alongside it. This is because higher computing densities generate greater quantities of heat, which then become concentrated within a smaller physical space.

High-density facilities also face distinct data center cooling challenges because of numerous factors, including:

  • Complexity. High-density data centers require meticulously planned designs and intricate physical layouts. The logistics involved in achieving the desired level of computing density often make traditional solutions, such as air cooling, impractical or inadequate.
  • Airflow. Equipment and servers in high-density data centers are packed into complex arrangements, which can disrupt or limit airflow. As a result, localized concentrations of ambient heat, known as “hotspots,” can easily form. This calls for specialized data center cooling strategies.
  • Costs. Although high-density data centers offer some cost benefits with regard to infrastructure and physical efficiency, these can be offset by elevated cooling costs.

Sustainability is another concern. High-density data centers consume large quantities of electricity, and those reliant on water cooling also place significant strain on local resources.

Metrics to Consider When Choosing Data Center Cooling Solutions

Because high density delivers superior performance by packing more computing power into a limited physical area, the industry continues to evolve toward ever-higher density profiles. In selecting data center cooling solutions for high-density facilities, operators must consider metrics across three key classes:

  • Quantitative metrics. Factors such as power usage effectiveness, the cooling capacity factor, and the cooling energy efficiency ratio generate hard data that can be used to compute cooling needs (see the sketch following this list).
  • Qualitative metrics. Indicators including thermal comfort index, cooling failure rates, and the localized cooling regulatory index offer performance insights for site operators. These are then used to formulate cooling strategies and optimize cooling system performance.
  • Site-specific factors. Operators must also consider a data center’s physical features, such as raised floor bypass open areas, perforated tile placements, and bypass airflow rates.
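
The sketch below illustrates how two of those quantitative metrics can be computed from facility readings. PUE is simply total facility power divided by IT power; the cooling capacity factor calculation shown follows one common formulation that compares installed cooling capacity with the heat load it must handle. All readings are assumed example values, not measurements from any real site.

```python
# Minimal sketch of two quantitative cooling metrics, using assumed readings.
# PUE: total facility power / IT equipment power.
# CCF (one common formulation): installed cooling capacity / heat load,
# with a 10% allowance for heat sources other than the IT equipment.

total_facility_kw = 1_400      # assumed total facility power draw
it_equipment_kw = 1_000        # assumed IT equipment power draw
installed_cooling_kw = 1_650   # assumed rated capacity of running cooling units

pue = total_facility_kw / it_equipment_kw
ccf = installed_cooling_kw / (it_equipment_kw * 1.1)

print(f"PUE: {pue:.2f}")   # lower is better; 1.0 is the theoretical floor
print(f"CCF: {ccf:.2f}")   # values well above 1 suggest over-provisioned cooling
```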

Effective Cooling Solutions for High-Density Data Centers

Operators of high-density facilities can choose from multiple data center cooling options capable of successfully managing site-specific cooling needs. The right solution for any given data center depends on a combination of site-specific factors, quantitative metrics, and qualitative indicators. Examples of effective solutions include:

  • Direct-to-chip cooling. Also known as cold plate cooling, direct-to-chip methods circulate a water or water-glycol coolant through a plate in direct contact with the surface of a server’s processing unit.
  • Microchannel liquid cooling. Microchannel liquid cooling also uses a water or water-glycol coolant. This is pumped through tiny tubes within a cold plate positioned directly atop the surface level of a server’s processing components.
  • Calibrated vector cooling. With a combination of liquid and air cooling, calibrated vector cooling (CVC) uses air to remove ambient heat while physically applying liquid coolants to components and equipment that generate extremely large quantities of heat.
  • Rear-door heat exchange. Both passive and active rear-door heat exchange systems use complex fan systems to draw heat out of the server racks and replace it with liquid-cooled air.
  • Immersion cooling. As its name suggests, immersion cooling involves the physical immersion of servers in a specially formulated liquid coolant. The cooling solution neutralizes heat without affecting computing performance, improving energy efficiency while eliminating the need for cooling methods based on air exchange.
Data center cooling technology
Source: Shutterstock

Advantages of Immersion Cooling

Immersion cooling holds several distinct advantages. It offers major electricity savings, with analyses finding that it reduces energy consumption by around 50% compared with air cooling. This frees up power that can be used more productively to increase compute capacity.

This also dramatically improves its sustainability profile while facilitating precisely targeted cooling without the need to reconfigure site-specific plumbing or ventilation systems.
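
A quick, illustrative way to see the capacity effect: with a fixed utility feed, the IT load a facility can support is roughly its total power divided by its PUE. The PUE values below are assumed for the comparison, not measured results for any particular cooling system.

```python
# Illustrative arithmetic: with a fixed utility feed, a lower cooling overhead
# (lower PUE) leaves more of the facility's power available for IT load.
# Both PUE figures below are assumed for the comparison.

facility_power_kw = 1_000

pue_air = 1.6         # assumed PUE with conventional air cooling
pue_immersion = 1.1   # assumed PUE with immersion cooling

it_capacity_air = facility_power_kw / pue_air
it_capacity_immersion = facility_power_kw / pue_immersion

print(f"IT capacity with air cooling:       {it_capacity_air:.0f} kW")
print(f"IT capacity with immersion cooling: {it_capacity_immersion:.0f} kW")
```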

Modular liquid immersion cooling systems let site operators make substantial improvements to the density profiles of their facilities without the need to implement major modifications or investments. Data center operators can also select technologies that deliver purpose-built immersion cooling for blockchain applications.

Connect With Sustainable Next-Generation Data Center Cooling Solutions

As a data center cooling solution, immersion cooling offers exceptional promise thanks to its practicality, cost-effective implementation profile, and sustainability advantages. It represents an effective and affordable solution to the specific cooling challenges high-density operators face as global computing needs continue to rise at exponential rates.

High-density data centers will become increasingly important as data-intensive technologies like artificial intelligence (AI) and the blockchain are integrated into the computing mainstream. Green Revolution Cooling (GRC) is an authoritative provider of high-performing immersion cooling systems that meet the growing needs of modern data centers.

Get in touch with GRC to learn more or to discuss your site-specific data center cooling needs.


What the Advancement of Immersion Cooling Will Look Like in the Coming Year

Immersion cooling is a precision technology used in data centers as a supplement or alternative to traditional air cooling. It involves immersing servers in specially engineered dielectric fluids that cool the submerged units and maintain their peak performance. As the fluids used in immersion technology do not conduct electricity, they pose no risk of damaging computer components.

For all its promise and practicality, moving to immersion cooling still presents some challenges. Though it delivers long-term cost savings by dramatically reducing electricity usage, immersion systems may require additional infrastructure investment, and changes to existing maintenance processes may be needed.

Fortunately, there are several innovations and data center advancements on the horizon that will address these and other lingering issues.

Advancements Poised to Transform Immersion Cooling in the Near Term

Three key immersion technology advancements are ready to make an impact in the near term: replacement heat sinks, alternate thermal interface materials, and colder fluids.

Replacement Heat Sinks

Data center heat sinks have traditionally been air-cooled. However, Green Revolution Cooling (GRC) has worked with multiple partners to develop alternative components that enhance the performance of air-cooled servers when they are immersed. The result is a specially engineered heat sink, designed for immersion cooling, that cuts thermal resistance in half. Testing showed a 100% performance improvement, signaling significant market potential for the novel heat sink design.
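
The arithmetic behind that improvement is straightforward: a component’s temperature rise is its power multiplied by the heat sink’s thermal resistance, so halving the resistance roughly doubles the power the sink can handle for the same allowable rise. The numbers below are assumed purely for illustration.

```python
# Why halving thermal resistance roughly doubles the heat a sink can handle:
# temperature rise = power * thermal resistance, so for a fixed allowable
# rise, maximum power scales inversely with resistance. Values are assumed.

allowable_rise_c = 40.0   # allowed temperature rise above the coolant (assumed)
r_th_original = 0.20      # original heat sink thermal resistance, C per W (assumed)
r_th_improved = 0.10      # immersion-optimized heat sink at half the resistance

print(f"Max power, original sink: {allowable_rise_c / r_th_original:.0f} W")
print(f"Max power, improved sink: {allowable_rise_c / r_th_improved:.0f} W")
```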

Alternate Thermal Interface Materials

Most immersion-cooled systems on the market use foil-based thermal interface materials (TIMs), such as indium. Although the current generation of TIMs performs well in many respects, room for performance improvement remains. This is especially true of their thermal resistance characteristics.

Innovators have already engineered alternative TIMs with superior thermal resistance and performance specifications. In one recent test, an emerging TIM displayed a 25% maximum wattage improvement during testing on high-performance Intel processors.

AI and immersion cooling
Source: Shutterstock

Colder Fluids

Immersion systems are generally capable of delivering effective cooling without the need for fluid chillers. That said, unrefrigerated fluids have limited practicality in some data centers. For instance, when a processor’s case (T-case) temperature limit sits close to the ambient fluid temperature, the component cannot be cooled effectively without a fluid chilling system.

Notably, most data centers have built-in water-chilling systems. This opens up retrofitting possibilities involving the engineered cooling fluids used in immersion systems. These systems can be configured to use the minimum amount of cooling necessary to optimize the performance of submerged computing components.

Using existing cooling infrastructure to create colder immersion fluids streamlines retrofitting projects and enhances sustainability. While this strategy has a relatively narrow set of potential use cases, it offers considerable benefits in those instances.

Integrating AI with Immersion Cooling

When combined with human supervision, AI and machine learning technologies can markedly improve data center performance, efficiency, and security. Should problems arise, they can take corrective action and issue alerts to human personnel.

For example, to ensure the performance and integrity of immersed IT equipment, the engineered fluids used in immersion cooling require close monitoring and analysis. Ongoing cooling processes can change conditions inside the immersion tank; over time, the fluid’s performance characteristics can degrade, and technicians eventually need to replace it.

AI-enabled immersion cooling systems can be configured to monitor fluid conditions, ensuring safety and performance integrity. And thanks to automation, AI monitoring tools have the power to substantially reduce the logistical complexity involved in immersion system maintenance. This, in turn, generates further efficiency benefits and cuts costs while elevating data center performance and security.

Furthermore, predictive AI technologies draw on vast troves of historical data to project future maintenance needs. This reduces the amount of human labor required for data center monitoring, generating cost savings and supporting the more efficient allocation of resources.

AI and immersion cooling
Source: Shutterstock

GRC Connects Data Center Operators With Next-Generation Immersion Cooling Technologies

Immersion cooling is an important innovation with the potential to transform data center management logistics. At the same time, data-intensive applications, including AI and blockchain technology, are primed to reshape the computing landscape. Looking to 2024 and beyond, data centers will require ever greater processing capabilities as these technologies become integrated into the computing mainstream.

The current generation of immersion cooling technologies supports high-density applications. Near-term advancements hold even greater promise, with immersion systems empowering engineers to reimagine heat sink designs, develop advanced TIMs, and use data center infrastructure in innovative ways. This is especially true when considering the additional performance advantages offered by AI and machine learning.

As an industry-leading innovator and partner to some of the world’s largest technology companies, GRC provides advanced immersion cooling solutions for data centers seeking to future-proof their operations. GRC is also an early adopter of AI technologies for data centers, further raising its profile as a key provider of next-generation technological solutions.

GRC’s product lineup includes micro-modular, rack-based immersion systems, along with cooling systems for blockchain applications, precision-engineered cooling fluids, and more. For further information, or to discuss your data center cooling needs in detail, contact GRC today.


4 Data Center Technology Advances to Look Forward to in 2024

Data center technology continues to advance at a rapid pace, promising many exciting developments on the horizon for 2024. This is good news for businesses and organizations that rely heavily on data storage, because legacy data center technologies are approaching their limits: tech experts predict that existing storage models may begin to struggle to handle data volumes as early as 2025.

Emerging data center solutions will create exciting new capabilities by improving efficiency, expanding storage capabilities, and automating processes. This article highlights four advancements in data center technology that are set to make waves in the near future.

1. AI in Data Centers: Revolutionizing Efficiency

Artificial intelligence (AI) has taken major strides in recent years. Today, businesses are utilizing it for applications ranging from predictive analytics to resource allocation and operational efficiency. With respect to data center technology, AI is a promising driver of predictive maintenance. Furthermore, it’s a tool for reducing server failure rates and improving overall performance.

Emerging AI technologies excel at applying predictive capabilities to monitor data storage and management processes, energy usage, and signs of an impending failure. It’s here that AI and machine learning (ML) tools can harvest and analyze data from sensor networks and system logs in real time. As a result, they’re able to detect emerging trends that point to a possible failure and take preemptive actions to avoid it.

AI can also apply these capabilities to boost efficiency and reduce energy use. For example, algorithms can instantly analyze workload requirements and route them to the servers and resources best positioned to handle them. Moreover, AI can dynamically evaluate the relationships between energy usage and workloads, allowing it to automatically power down servers that are not in use and thereby improve data center cooling speed and efficiency.
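
A toy version of that consolidation logic is sketched below: workloads are packed onto as few servers as possible so the remainder can be powered down. Real schedulers weigh latency, redundancy, and thermal constraints as well; this first-fit-decreasing example, with made-up workload figures, only shows the basic idea.

```python
# Minimal sketch of workload consolidation: pack workloads onto as few servers
# as possible (first-fit decreasing) so idle servers can be powered down.
# Workload sizes and server capacity below are assumed example values.

def consolidate(workloads_kw, server_capacity_kw):
    """Assign workloads to servers greedily; return the per-server loads."""
    servers = []
    for load in sorted(workloads_kw, reverse=True):
        for i, used in enumerate(servers):
            if used + load <= server_capacity_kw:
                servers[i] += load
                break
        else:
            servers.append(load)   # no existing server fits; power one up
    return servers

workloads = [2.1, 0.4, 1.7, 0.9, 3.0, 0.6, 1.2]   # assumed per-workload draw (kW)
active = consolidate(workloads, server_capacity_kw=4.0)

print(f"Servers needed: {len(active)}; the rest can be powered down")
print(f"Per-server load (kW): {[round(s, 1) for s in active]}")
```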

These cooling and efficiency-boosting capabilities offer an added performance dimension. In fact, relative to conventional workloads, AI and ML can require up to three times the power density. This means that maximizing the performance and efficiency of data center technology will fast-track AI integration into mainstream computing.

2. Edge Data Centers: Redefining Proximity and Speed

Edge data centers are located on the periphery of networks near the end users they serve. This allows them to draw on the principle of proximity colocation. As such, edge data centers reduce latency by decreasing the amount of time it takes for data to travel from its origin point to its destination. In addition, they deliver highly reliable levels of user connectivity.

Internet of Things (IoT) technologies rely heavily on rapid response and processing capabilities. Edge data centers are able to offer these functionalities, reducing network congestion while maintaining optimal levels of system performance.

Additional benefits of edge data center technology include:

  • Superior cybersecurity profiles
  • Easy scalability
  • Low and controllable costs facilitated by pay-as-you-go billing models

Furthermore, edge data centers can mitigate the environmental impacts associated with data processing and storage. They do this by boosting energy efficiency, consuming less power, and leaving a smaller overall carbon footprint. Given these benefits, edge data centers have become a prominent feature of the ongoing trend toward modular and micro-modular data center models.

3. Sustainability and Renewable Energy Integration

Data center technology is increasingly oriented toward sustainability. In fact, many tech industry stakeholders have committed to working toward making data centers carbon-free. While this is not yet viable as we head into 2024, experts agree that carbon-free data centers are eminently attainable.

As the industry works toward complete carbon neutrality, a new generation of greener, more sustainable energy models continues to make an impact. These advancements include:

Making better use of natural resources. Data centers are integrating clean and natural sources of electricity at ever higher rates. Solar and wind power account for an escalating share of their power needs. Further improvements are on the horizon as sustainable energy technologies continue to advance.

Liquid cooling. Liquid-based data center cooling solutions harness the inherently superior thermal transfer characteristics of water and other fluid-based coolants. Liquid cooling solutions conduct more than 3,000 times as much heat as air cooling and require less energy.

Air cooling. Although liquid-based approaches to cooling are effective, they also require large quantities of water. With greater stress on water resources, ambient air-cooling advancements have emerged as a strong alternate option. Air-chilled cooling systems create closed cooling loops that dramatically reduce water input needs, while “free cooling” approaches that simply exchange high-temperature internal air with cooler external air are also undergoing efficiency and performance improvements.

Immersion cooling represents a future-proof data center technology that uses dielectric liquids to submerge server components and computing hardware. GRC Cooling is a leader in this fast-evolving field. Immersion cooling makes internal air conditioning unnecessary, which in turn drastically reduces energy requirements.

4. Security Advancements: Ensuring Data Integrity

Information security is currently in the spotlight as server-room and data-center design strategies advance. Strengthened cybersecurity protocols also help safeguard sensitive information stored in data centers. At present, the growing efficiency of environmental monitoring standards is providing an additional layer of security.

Chip-level security is an important example that illustrates how data center cybersecurity has improved. Used by Google and other major tech companies, chip-level security prevents tampering and makes it far more difficult for malicious actors to compromise a protected device or system.

Data centers can also use microgrids to address physical vulnerabilities, control security costs, and make site security more sustainable. Operators can invest in their own physical microgrids or incorporate them on site through as-a-service models.

GRC Cooling Supports Superior Levels of Data Center Performance and Sustainability

As we head into 2024, AI and edge-based models, together with sustainability and security improvements, continue to redefine the data center technology landscape. Each of these advancements can benefit from the next-generation data center cooling technologies available from GRC Cooling.

As a practical, cost-effective solution for all data center cooling needs, GRC Cooling serves an international network of major clients. Contact GRC Cooling to learn more about their advanced data center cooling technologies.


How Immersion Cooling Helps Unlock the Power of AI in Data Centers

AI applications are currently straining the already limited resources of data centers, with AI-ready server racks drawing between 50 and 75 kW. Without AI workloads, rack power requirements are closer to 8–10 kW. But it’s not all bad news. Integrating AI into your data center can also help enhance its performance, cut energy consumption, and lessen its environmental impact. Add to this the latest patented immersion cooling technology, and your data center can expect further reductions in server energy requirements and in cooling energy costs, by up to 20% and 95%, respectively.

Read on to find out how AI, coupled with immersion cooling, can optimize data center efficiency and best practices in energy management.

How AI Enhances Efficiency and Sustainability in Data Centers

To successfully integrate AI into a data center, you’ll need accurate and high-quality data. Once integrated, AI can optimize power consumption in multiple ways, such as:

  • Monitoring traffic patterns and adjusting data center usage accordingly to minimize power consumption
  • Utilizing its predictive maintenance capabilities for timely service and component changes.

Benefits of Implementing AI in Data Centers

Given the amount and types of information that data centers receive, security is paramount. AI systems protect servers from unauthorized access, both internal and external. In addition, they help guard against malware and physical breaches, thanks to smart alarm and intrusion detection systems.

With AI, you can power down unused servers and machines, which is a big step toward decarbonization and making the system more efficient. Moreover, this process helps regulate data center operations according to demand: the system can power up more data racks during periods of high demand and power them down when demand falls.

However, despite these multiple benefits, there’s also a major drawback. Integrating AI at this level requires very high computing power. In fact, AI-enabled data center operations can have density requirements up to 15 times greater than normal.

How Does the Growing Demand for AI Impact Data Center Costs?

Data centers are already under the microscope because of the excessive energy they consume. The integration of AI is further straining existing power resources, increasing operational costs. The conundrum is that, while AI is meant to help data centers reduce power consumption, it is ultimately driving that consumption up.

A single AI model can consume thousands of kilowatt-hours of electricity. Generative AI models like ChatGPT can consume 100 times more than that. This is because the graphics processing units used to operationalize AI models are built to draw more power. Additional power is required to operate the system as well as to run the cooling systems. This is where an immersion cooling data center offers an innovative solution that can significantly decrease cooling costs.

CAPEX (capital expenditure) is another financial consideration influencing data center operations. Microsoft’s CAPEX for the second quarter of 2023 was $10.7 billion, up from $7.8 billion in the previous quarter. A major chunk of that is spent on data centers. Then, we also have operational costs (OPEX), such as maintenance, electricity, and service equipment.

According to CBRE’s 2023 Investor Sentiment Survey Results, investments in data centers are slated to increase. The ROI in this industry is positive because the demand for data centers continues to rise. Further, McKinsey predicts that, in the US alone, data center demand will increase by 10% per annum.

How AI Improves Data Center Operations

The world generates tremendous volumes of data every second. As a result, data center operators have to scale their designs and power infrastructure, including the cooling systems needed to keep up with all that data processing.

Together, AI and machine learning can operate data centers more effectively. For example:

  • AI solutions administer racks and toggle their usage according to requirements in real time.
  • AI allocates computing power, network bandwidth, and storage based on processing demands.
  • The predictive capabilities of AI can help with timely maintenance. Moreover, using historical data and workload patterns, data center operators can optimize capacity planning.
  • Once ML and AI have learned normal operational behavior, they can report any sort of deviation, potentially preventing data breaches and hacks.

That said, AI-enabled server racks do create significant amounts of heat that must be removed quickly. In addition, data center operators will face compliance challenges as regulatory authorities continue to introduce new operational parameters.

How Data Centers Can Mitigate the Cost Increase Caused by AI

Cloud computing is a phenomenal technology, delivering innovative ways to interact and share information. However, these capabilities come at a price for the data center operators that need to process huge volumes of information. And it’s here that AI can be used to optimize costs.

AI automatically controls data center equipment to ensure consumption efficiency and reduce power expenditure. Further, it manages the data center power balance, combating cooling and performance degradation by positioning workloads in just the right, most cost-effective energy zones. Moreover, AI-based security setups are able to analyze potential security threats and prevent malicious attacks.

Other ways to reduce the operational overhead of data centers include:

  • Smart software design. Removing underused applications and bloatware will decrease power demand, leading to cost reduction.
  • Better airflow management. With expert assistance, you can install a robust airflow system to ensure zero air leakage. This dedicates 100% of the airflow to cooling the servers.
  • Immersion cooling. This budget-friendly technology immerses data center equipment in a dielectric liquid. Immersion cooling systems require minimal additional space and provide an effective solution for data center operators focused on bringing down the cost of cooling.

An Innovative Solution: Immersion Cooling Your Data Center

Data centers have taken a lot of heat because of their increased appetite for power consumption and negative environmental profile. It’s a cause of concern for data center operators, but one that immersion cooling can address on many levels.

GRC’s patented immersion cooling system can easily handle the demands of high-performance, high-density deployments. It also delivers vastly superior cooling capabilities with greatly reduced power consumption compared with air cooling systems.

GRC’s solution allows you to put more computing in the rack and save valuable space in your data center. Moreover, GRC’s single-phase immersion cooling system has a simplified design that helps you eliminate the operational and maintenance costs of complex components, such as chillers, air handlers, and humidity control systems.

Get in touch with GRC now to learn about powering up—and cooling down—your systems more efficiently.


Challenges and Opportunities of FITARA Implementation for Data Centers

In 2014, Congress passed the Federal Information Technology Acquisition Reform Act, known as FITARA. The bill was written to address a longstanding government concern—how to curb IT waste in government agencies by optimizing risk management and transparency. And considering how the US government spends over $100 billion on IT equipment a year, there’s a fair amount at stake.

FITARA aims to rein in runaway energy consumption in data centers, bringing it down to truly efficient levels and thus also reining in costs. There are a variety of ways data centers can achieve this; liquid immersion cooling technology like that provided by GRC makes for one particularly pertinent leap forward. Liquid immersion cooling captures excess energy (in the form of heat) rather than simply dispersing it the way traditional air cooling does. It also uses less electricity than conventional methods, meaning the act of cooling itself will not emit as much carbon.

Of course, every data center wants to strive for efficiency, especially given the potential plus of government contracting once in compliance with FITARA. We’ve talked previously about certain best practices to help align with FITARA goals; however, FITARA is a complex piece of legislation, and compliance presents its own challenges and opportunities.

Let’s unpack these challenges in greater detail and see how you can actually make them work for you instead.

Technical Challenges

Among the technical challenges data centers may struggle with are consolidating and optimizing the infrastructure. Think of the difficulty in migrating data and applications from numerous disparate data centers into one location. Agencies also have to integrate multiple systems and ensure that they work together, which can pose serious hurdles.

Case in point, data center migration often results in unwanted downtime. But federal agencies must have consistently high uptimes to accomplish their missions. Another challenge is to keep latency low after migration. The data center set for closure may have been situated near its users, unlike the consolidated facility. Other potential risks of migration include wrongly sizing the resources at the target facility.

In addition, agencies must protect the security of their facilities both during and after the consolidation process. Moving data exposes it to threats such as data loss, and it’s hard to enforce the exact same precautions before and after a migration. Then there are performance standards such as processing and storage to consider. The consolidated infrastructure may function or be configured differently than the previous setup.

Succeeding against these technical problems can take a lot of time, money, and skill. But it’s not all bad news: data centers can take advantage of this opportunity to upgrade their resources and innovate. For instance, agencies could deploy more cloud technology or reshape their production systems.

Budget Management

When it comes to budgets, FITARA imposes the challenge of finding enough funding to make the adjustments needed for compliance. There’s a trade-off between immediate costs and long-term benefits. You can’t consolidate and improve data centers without some expense, but the efficiency gains will repay the investment over time.

Another financial difficulty is coordinating budgeting with the project schedule. Agencies should make multi-year plans, which means they need money available to cover ongoing costs as they occur. It’s not just about the upfront investment; you must also balance the longer-term projects and budgets.

To overcome these budget-related challenges, agencies can focus on low-cost, high-return projects like liquid immersion cooling and work to maximize the value of available resources. For example, immersion cooling opens several opportunities through its knock-on effects. Data centers spend less on electrical infrastructure and real estate while cutting back on electricity use. You can slice expenses roughly in half.

Federal agencies may also look for new funding sources to alleviate the financial challenges of adhering to FITARA. Reports should communicate the advantages and disadvantages of proposed actions to attract financing. If they calculate right, data center administrators can slim down both spending and waste.

Transparency and Automated Monitoring

Transparency is essential to show the value of IT spending at federal agencies. Indeed, transparency ranks at the top of FITARA recommendations from the Federal IT COST Commission. According to its advice, CIOs should clearly describe the costs and performance of IT assets. This transparency provides the detailed information that government IT directors and constituents need to navigate their challenges efficiently.

CIOs should employ a standardized model of costs to break down which services and applications are delivering value. This information also supports agencies in negotiating with suppliers and measuring the total cost of ownership (TCO). Basically, seeing the precise distribution of costs reveals how to improve IT spending.

A time-tested strategy that can help federal agencies optimize data centers is to implement automated monitoring tools. The GAO has recommended that agency-owned data centers use this method to track the use of servers and other resources.

Multiple parts of the government, such as the Department of Agriculture and the Department of Homeland Security, have agreed to move forward with this recommendation. After all, automated monitoring complements virtualization and migration in the cloud. Together, these steps increase data center efficiency while enabling accurate reports.

Photographer: David Evison

Rise to the FITARA Challenge With GRC

In the near decade since FITARA became law, it’s served as a North Star for data centers aiming for government contracting. And considering the breadth of need for IT services in government administration, attaining federal contracting can prove a lucrative boon for many data centers. The challenge is in attaining compliance.

The best way: curtailing waste and dialing in environmental and financial efficiency ASAP, by all avenues possible. When plotting this pivot, it’s best to start with low-risk, high-yield solutions first—like GRC’s liquid immersion cooling systems. As mentioned earlier, liquid immersion cooling is far more efficient than traditional alternatives. And with a straightforward installation process, it won’t necessarily demand an entire transformation of your data center, though it will transform your energy consumption!

GRC solutions are well-tested and proven winners; even federal agencies such as the National Security Agency (NSA) and the United States Air Force (USAF) use these immersion tanks for in-house operations. What better way to achieve federal compliance than with solutions the federal government already implements?

Overcoming the hurdles en route to achieving FITARA compliance requires a combination of strategies. But GRC can help you make a mighty leap on your journey there. Reach out today to find out just how liquid immersion cooling can help you optimize your data center.