Liquid Cooling
Here are the key points related to the cost of liquid cooling, power conservation, and the implications of increasing density in data centres:
Implementing liquid cooling in existing data centres is a significant challenge and can be costly, as it requires modifications to the infrastructure, such as connecting to existing pipework, ensuring leak detection and containment, and integrating with the building management system (BMS).
The cost of liquid cooling depends on the specific technology used, such as immersion cooling or cold plate cooling, and the scale of implementation.
Retrofitting liquid cooling into legacy data centres may involve additional operational costs and risks, such as the need for specialised expertise and the potential for disruptions to existing customers.
Designing a data centre from the ground up to accommodate liquid cooling can be more cost-effective than retrofitting an existing facility, as it allows for proper planning and infrastructure integration.
The cost of liquid cooling also depends on the level of redundancy and availability required. Implementing features like leak detection, containment, and separate cooling loops for each customer can add significant costs to the system.
Standardisation and collaboration among industry players could help reduce the costs of liquid cooling by promoting common practices, interfaces, and equipment compatibility.
Liquid cooling can potentially reduce the power consumption associated with cooling in data centres, as it allows for more efficient heat removal compared to traditional air cooling methods.
By using liquid cooling, data centres can operate at higher temperatures, which reduces the need for energy-intensive chillers. In some cases, chillers may be eliminated altogether, replaced by dry coolers or other more efficient cooling methods.
The power savings from liquid cooling can be significant, with estimates of a 10-15% reduction in overall data centre power consumption, since cooling can account for 20-25% of total energy usage. A worked sketch of this arithmetic follows below.
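The arithmetic behind these estimates is straightforward. Here is a minimal sketch; the facility size and the assumed halving of cooling energy are illustrative inputs chosen to match the ranges quoted above, not measured values:

```python
# Rough sketch of the savings arithmetic quoted above.
# Assumption: liquid cooling cuts cooling energy roughly in half,
# consistent with the 10-15% overall figure cited in the text.

total_power_kw = 1_000    # hypothetical 1 MW facility
cooling_share = 0.25      # cooling at 20-25% of total usage
cooling_reduction = 0.50  # assumed cut in cooling energy

cooling_power_kw = total_power_kw * cooling_share
saved_kw = cooling_power_kw * cooling_reduction
print(f"Cooling load: {cooling_power_kw:.0f} kW")
print(f"Saved: {saved_kw:.0f} kW "
      f"({saved_kw / total_power_kw:.0%} of facility power)")
# -> 125 kW saved, i.e. 12.5% of facility power, within the 10-15% range
```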
However, the power savings from liquid cooling may be relatively small compared to the increasing power demands of high-density IT equipment, such as AI accelerators and high-performance computing systems.
To maximise power conservation, data centres can explore techniques like heat reuse, where the waste heat from liquid-cooled systems is captured and used for other purposes, such as heating nearby buildings or industrial processes.
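To give a sense of scale, here is a hypothetical heat-reuse sizing sketch. Essentially all IT power ends up as heat, and a liquid loop can capture a large fraction of it at a usable temperature; the rack count, rack power, and capture fraction below are all assumptions for illustration:

```python
# Hypothetical heat-reuse sizing sketch; all inputs are assumptions.

rack_power_kw = 80      # hypothetical high-density rack
racks = 20
capture_fraction = 0.7  # assumed share of heat captured by the liquid loop

recoverable_kw = rack_power_kw * racks * capture_fraction
print(f"Recoverable heat: {recoverable_kw:.0f} kW thermal")
# ~1.1 MW of low-grade heat that could feed district heating or
# industrial pre-heating, subject to temperature requirements
```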
The demand for higher-density computing, driven by AI and other advanced workloads, is rapidly increasing, with rack power densities reaching 55-120 kW or more.
Increasing density poses significant challenges for data centre cooling, as traditional air cooling methods struggle to keep up with the heat generated by high-density racks.
Liquid cooling is seen as a key solution to enable higher densities, as it can effectively remove heat from high-power components and allow for more compact rack designs.
However, increasing density also has implications for power distribution and overall data centre infrastructure. Higher-density racks require more power, which may strain existing electrical systems and require upgrades to transformers, switchgear, and other components.
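A back-of-envelope power-budget check shows why. The figures in this sketch are illustrative assumptions, not vendor data, but the direction of the effect is general:

```python
# Back-of-envelope power-budget check for a density upgrade.
# All figures are illustrative assumptions.

racks = 50
old_density_kw = 10    # legacy air-cooled racks
new_density_kw = 100   # liquid-cooled AI racks (within the 55-120 kW range)

old_load_mw = racks * old_density_kw / 1000
new_load_mw = racks * new_density_kw / 1000
print(f"Electrical load: {old_load_mw:.1f} MW -> {new_load_mw:.1f} MW "
      f"({new_load_mw / old_load_mw:.0f}x)")
# A 10x jump in load is why transformers, switchgear, and distribution
# usually need upgrading before density can increase.
```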
Increasing density can also impact data centre layout and space utilisation, as high-density racks may require specialised containment systems, dedicated cooling loops, and other infrastructure modifications.
To accommodate increasing density, data centres may need to adopt modular or containerised designs, which allow for more flexible and scalable deployment of high-density systems.
Ultimately, the ability to support increasing density will depend on a data centre's ability to provide adequate power, cooling, and infrastructure to meet the demands of high-performance computing systems.
In summary, the adoption of liquid cooling in data centres is driven by the need to support increasing rack densities and improve power efficiency.
However, implementing liquid cooling can be costly and challenging, particularly in legacy data centres.
The power-conservation benefits of liquid cooling, while significant, may be outpaced by the growing power demands of high-density systems.
As density continues to increase, data centres will need to carefully plan and invest in infrastructure upgrades, modular designs, and advanced cooling technologies to keep pace with the evolving requirements of AI and other demanding workloads.
QCT's L04V air-assist liquid cooling solution is designed to address the growing need for efficient cooling in data centres as power per socket increases, especially beyond 300 W.
The system is built on a standard 42U rack and is compatible with various rack sizes.
Here's a detailed explanation of how it works and its key features:
Liquid-to-Air Cooling
The system uses a closed-loop liquid cooling system with cold plates covering the CPUs.
Cold coolant enters the server through blue hoses, absorbs heat from the CPUs, and exits as hot coolant through red hoses.
The hot coolant is then passed through a rear door heat exchanger, where fans dissipate the heat, cooling the liquid before it returns to the servers.
Liquid-to-Liquid Cooling
For data centres with existing liquid cooling infrastructure, QCT offers a liquid-to-liquid version of the L04V.
In this configuration, the rear door radiator and fans are replaced with a plate heat exchanger and a flow control valve.
This allows the system to integrate with the data centre's existing liquid cooling loop.
The heart of the L04V system is the 4U coolant distribution unit (CDU), positioned at the bottom of the rack and occupying four units of rack space.
It features hot-swappable pumps to ensure continuous operation even if one pump fails.
The CDU also includes a filter to maintain liquid quality and an LCD monitor for status monitoring.
QCT has integrated a server-grade Baseboard Management Controller (BMC) chip inside the CDU, running OpenBMC.
This allows for advanced management capabilities, such as monitoring server health, cooling requirements, and leak detection.
The CDU can intelligently control the cooling pumps and rear door heat exchanger fans to optimise power efficiency based on server load and cooling needs, along the lines of the sketch below.
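As a minimal sketch of what such load-aware control can look like (the setpoint, gain, and proportional logic here are illustrative assumptions, not QCT's firmware):

```python
# Illustrative proportional control loop for a CDU: scale pump and fan
# duty with how far the coolant return temperature sits above a setpoint.
# Setpoint and gain are assumptions for illustration only.

SETPOINT_C = 45.0   # target coolant return temperature
GAIN = 0.08         # duty increase per degree above setpoint
MIN_DUTY, MAX_DUTY = 0.2, 1.0

def cooling_duty(return_temp_c: float) -> float:
    """Map coolant return temperature to a pump/fan duty cycle."""
    error = return_temp_c - SETPOINT_C
    duty = MIN_DUTY + GAIN * max(error, 0.0)
    return min(max(duty, MIN_DUTY), MAX_DUTY)

for temp in (40.0, 47.0, 55.0):
    print(f"{temp:.0f} degC -> duty {cooling_duty(temp):.0%}")
# Cool loop -> pumps and fans idle near minimum; hot loop -> ramp up.
# Running near minimum duty at partial server load is where power is saved.
```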
Liquid cooling enables better thermal management, allowing CPUs to run at higher and more consistent clock speeds (turbo boost) compared to air cooling.
The L04V system can achieve up to 70% power savings compared to traditional air cooling solutions by efficiently cooling servers and operating at higher ambient temperatures (up to 40°C).
The higher ambient temperature reduces the need for power-hungry HVAC systems, significantly lowering the data centre's Power Usage Effectiveness (PUE).
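To make the PUE claim concrete, a quick sketch. PUE is total facility power divided by IT power; the input figures below are hypothetical, chosen only to show the direction of the effect:

```python
# PUE = total facility power / IT power. Hypothetical figures showing
# how shrinking cooling overhead moves PUE toward 1.0.

it_power_kw = 800

air_cooled_overhead_kw = 480     # chillers, CRAHs, fans (assumed)
liquid_cooled_overhead_kw = 160  # dry coolers, pumps, CDU fans (assumed)

pue_air = (it_power_kw + air_cooled_overhead_kw) / it_power_kw
pue_liquid = (it_power_kw + liquid_cooled_overhead_kw) / it_power_kw
print(f"Air-cooled PUE:    {pue_air:.2f}")     # 1.60
print(f"Liquid-cooled PUE: {pue_liquid:.2f}")  # 1.20
```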
Although liquid cooling has a higher initial capital expenditure (CAPEX) compared to air cooling, its operating expenses (OPEX) are significantly lower.
The total cost of ownership (TCO) of liquid cooling grows at a slower rate over time, making it more cost-effective in the long run.
Depending on the system configuration and electricity costs, data centres can see a positive return on investment (ROI) within a few years of deploying liquid cooling.
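A simple break-even sketch makes the CAPEX/OPEX trade-off concrete. All costs below are hypothetical placeholders, not QCT pricing:

```python
# Simple break-even sketch for the CAPEX/OPEX trade-off described above.
# All costs are hypothetical placeholders.

capex = {"air": 100_000, "liquid": 160_000}        # upfront cost per rack row
opex_per_year = {"air": 40_000, "liquid": 22_000}  # mostly electricity

def tco(system: str, years: int) -> int:
    """Cumulative total cost of ownership after a number of years."""
    return capex[system] + opex_per_year[system] * years

for year in range(1, 8):
    if tco("liquid", year) <= tco("air", year):
        print(f"Liquid cooling breaks even in year {year}")
        break
# With these numbers the extra 60k of CAPEX is repaid by an 18k/year
# OPEX saving, so break-even lands in year 4.
```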
The L04V system offers a web UI for live monitoring of the liquid cooling solution, individual server health, and location.
It also provides Redfish API support for integration with existing data centre management tools.
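As a sketch of what Redfish-based monitoring can look like: the host, credentials, and chassis ID below are placeholders, and while the /Thermal resource is part of the standard DMTF Redfish schema, the exact sensors exposed depend on the implementation:

```python
# Poll temperature sensors from a Redfish endpoint. Host, credentials,
# and chassis ID are placeholders; adapt to the actual deployment.
import requests

HOST = "https://cdu.example.local"  # placeholder management address
CHASSIS = "1"                       # placeholder chassis ID

resp = requests.get(
    f"{HOST}/redfish/v1/Chassis/{CHASSIS}/Thermal",
    auth=("admin", "password"),     # placeholder credentials
    verify=False,                   # lab setup; use real certificates in production
    timeout=10,
)
resp.raise_for_status()

for sensor in resp.json().get("Temperatures", []):
    name = sensor.get("Name", "unknown")
    reading = sensor.get("ReadingCelsius")
    print(f"{name}: {reading} degC")
```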
An LCD monitor on the rack provides real-time information for on-site servicing.
Getting the most from the system calls for several operational practices:
Thorough planning and design to ensure compatibility with existing infrastructure and future scalability.
Regular maintenance of the liquid cooling system, including monitoring coolant quality and replacing filters as needed.
Utilising the management and monitoring capabilities to optimise cooling performance and efficiency.
Training personnel on liquid cooling technology and service procedures.
Gradually transitioning from air cooling to liquid cooling, allowing for a smooth migration and minimising disruption to data centre operations.
In summary, QCT's L04V air-assist liquid cooling solution addresses the growing need for efficient cooling in high-density data centres. Its innovative design, advanced management features, and cost benefits make it an attractive option for data centres looking to optimise performance and reduce operating expenses in the face of increasing power demands.