Danfoss Power Solutions
Hoses, connectors and other fluid conveyance components are used in data center cooling systems

Hydraulics and Pneumatics Play Important Role in Data Center Cooling

May 21, 2025
Hoses, fittings and other fluid power components are being employed in the liquid cooling systems now utilized in many data centers.

Construction of data centers is increasing to meet rising demand for complex computation and data processing, fueled in part by rising adoption of artificial intelligence (AI). These facilities require a large amount of cooling to keep them running efficiently.

Fluid power components are playing an increasingly important role in many of the thermal management systems now being utilized in data centers.

Air-cooling methods have traditionally been used, but as the need for higher processing capabilities has grown due to AI and other data-intensive applications, more energy is being consumed, resulting in more heat generation.

Because of this, many data centers are now using liquid cooling which provides a more effective and efficient way of transferring the heat being generated, said Mike Haen, Vice President of Global Marketing, Product Line Management, and Pricing at Gates.

Pumps, hoses, and other fluid power technologies are being utilized to help move the liquid through these cooling systems, providing an opportunity area for many companies in the hydraulics and pneumatics industry.

Fluid Conveyance Keeps Data Centers Running Cool

The liquid cooling systems used in data centers are essentially fluid conveyance systems that pump a water-glycol solution to cool the chips and other components in these facilities.

As such, fluid power components are commonly used as part of these systems to move the cooling liquid; these components include:

  • hoses
  • fittings
  • thermoplastic tubing
  • pumps
  • manifolds
  • quick couplings
  • connectors
  • filters.

Joe Chopek, Business Development Manager, Fluid Connectors Group, Parker Hannifin, said the chosen cooling method for a data center is based on the capacity of the chips used in the facility. Above 150 kW, he said, you can’t move enough air to provide adequate cooling. With chips now operating in the 200-kW range and above, “you have to go to a direct chip cooling solution. Water glycol can remove the heat adequately,” he said.
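As a rough illustration of why air runs out of headroom, the coolant flow needed to carry away a heat load follows from Q = ṁ·cp·ΔT. The sketch below uses assumed fluid properties for a roughly 50/50 water-glycol mix and an assumed 10°C temperature rise; none of these figures come from the companies quoted in this article.

```python
# Illustrative estimate only: coolant flow needed to remove a given heat
# load, from Q = m_dot * cp * dT. Fluid properties are assumed values for
# a ~50/50 water-glycol mix, not vendor data.

def required_flow_lpm(heat_load_w, cp_j_per_kg_k=3400.0,
                      delta_t_k=10.0, density_kg_per_m3=1070.0):
    """Volumetric flow (L/min) needed to carry away heat_load_w watts
    at an assumed delta_t_k temperature rise across the cold plate."""
    m_dot = heat_load_w / (cp_j_per_kg_k * delta_t_k)   # mass flow, kg/s
    m3_per_s = m_dot / density_kg_per_m3                # volumetric, m^3/s
    return m3_per_s * 60_000.0                          # convert to L/min

for load_kw in (150, 200, 300):
    print(f"{load_kw} kW -> {required_flow_lpm(load_kw * 1000):.0f} L/min")
```

At these assumed properties, a 200-kW load works out to a few hundred liters per minute of coolant, which liquid loops can deliver but air practically cannot.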

Liquid cooling’s ability to draw heat away from the chips keeps them performing as desired while also aiding reductions in energy consumption.  

Peter Bleday, Vice President, Specialty business unit, Fluid Conveyance division, Danfoss Power Solutions, said electricity accounts for about 80% of the costs associated with operating a data center. This electricity is used to power the computing technology as well as cooling systems in the facility.

“Liquid cooling is significantly more efficient,” he said. “You’re getting 20 or 30% more efficiency with liquid cooling…that obviously cuts down on the total cost of ownership for a data center.”

Utilizing a more efficient cooling technology reduces electricity use and enables significantly higher power loads to be operated through the data center, said Bleday.
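The arithmetic behind that total-cost-of-ownership claim can be sketched briefly. The article supplies only the roughly 80% electricity share of operating cost and a 20-30% cooling efficiency gain; the share of a facility's electricity that goes to cooling is an assumed figure here, purely for illustration.

```python
# Back-of-envelope sketch. electricity_share comes from the article (~80%);
# cooling_share_of_power and cooling_energy_reduction are ASSUMED values
# chosen only to illustrate the arithmetic.

def opex_savings(electricity_share=0.80, cooling_share_of_power=0.35,
                 cooling_energy_reduction=0.25):
    """Fraction of total operating cost saved when cooling energy drops."""
    return electricity_share * cooling_share_of_power * cooling_energy_reduction

print(f"~{opex_savings():.1%} of total operating cost")
```

Under those assumptions, a 25% cut in cooling energy trims total operating cost by several percent, which compounds meaningfully at data center scale.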

Design Considerations for Fluid Power Components in Data Center Applications

According to Chopek, the closed loop, low pressure design of liquid cooling systems helps to make them a relatively simple solution. In addition, their use in a climate-controlled facility removes some of the environmental factors that might otherwise have to be considered if the system were in a piece of construction equipment exposed to harsh summer or winter weather conditions.

“But the other side of it is the cost of failure is exceptionally high,” he said. “Any kind of fluid leak could be extremely costly, so there’s a very high standard on leak-free [solutions].” Products that will be used in a data center application undergo a lot of testing to ensure they will provide leak-free operation, he said.

Bleday also noted the importance of creating leak-free components. “Across all of our components we try and minimize leakage no matter what. But it’s even more critical with some of the data center applications because even a small amount of leakage can cause a significant amount of damage.”

Learn more about preventing leaks in hydraulic systems in the article "How to Reduce Expensive Fluid System Leaks and Emissions."

Additionally, fluid purity needs to be assured to prevent overheating of chips. Chopek explained that in a more traditional hydraulics application, once hydraulic fluids are moving through the system at high pressures and temperatures there is less concern about microbial buildup. But because data center liquid cooling systems use a water-based solution at lower pressures, there is more concern about the potential for microbial growth, or even hoses breaking down over time and particulates getting into the liquid.

“Because the flow rates are so small at the chip level, any kind of blockage [can cause] the chip to overheat very fast,” he said. “So, they’re very sensitive to any kind of contaminant because if you can't cool the chips, they overheat quickly and then are a total loss.”

Ensuring efficient liquid flow is necessary as well. To do so, it is important to achieve high flow and low pressure drop across the quick disconnects and couplings, he said. This ensures the cooling system’s pump operates at peak efficiency. “It’s all about minimizing leakage near the electronics and maximizing the efficiency of the whole process,” he said.
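The link between pressure drop and pump efficiency can be made concrete with the hydraulic power relation P = Q·Δp. The sketch below uses assumed flow, pressure-drop and pump-efficiency figures; they are illustrative numbers, not values from any of the suppliers quoted here.

```python
# Illustrative sketch with ASSUMED values: the hydraulic power a pump must
# supply is P = Q * dp, so lowering the pressure drop across couplings and
# quick disconnects directly cuts the electrical power the pump draws.

def pump_power_w(flow_lpm, pressure_drop_kpa, pump_efficiency=0.6):
    """Electrical power (W) drawn by the pump for a given loop flow and
    total pressure drop, at an assumed overall pump efficiency."""
    q = flow_lpm / 60_000.0              # L/min -> m^3/s
    dp = pressure_drop_kpa * 1_000.0     # kPa -> Pa
    return q * dp / pump_efficiency

for dp_kpa in (50, 30):
    print(f"dp = {dp_kpa} kPa -> {pump_power_w(330, dp_kpa):.0f} W")
```

Cutting the loop's pressure drop from 50 kPa to 30 kPa at the same flow reduces pump power proportionally, which is why low-restriction couplings matter to overall system efficiency.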

Space constraints also need to be considered when developing products for data centers. There is often less installation space available in data centers, necessitating use of compact and easy-to-install components. Hydraulic hoses with tight bend radii are especially beneficial to ease their routing in these applications.

This was one of the many aspects that factored into the design of Gates’ Data Master data center cooling hose released in late 2024. Per Haen, it offers flexibility for easy routing and compactness to fit within tight server rack layouts. In early 2025, the company introduced another data center cooling hose to its line, the Data Master MegaFlex, which also offers flexibility for tight installation spaces while also delivering higher flow capacity.

Other important design features for the Gates data center hoses include the use of specialty-compounded materials to maintain system cleanliness as well as flame resistance.

“Designing hose solutions for data centers presents a distinct engineering challenge,” said Haen. “Ultimately, the design needs to consider the demands on the product in that application. 

“In mobile or industrial settings, challenges that can stress the product might include external abrasion, dynamic movement, or dramatic pressure changes. Data centers demand solutions that are very focused on mitigating contamination and leaks while valuing flexibility and compactness,” he said.

Haen said Gates drew from its experience developing fluid power technologies for multiple industries to aid the design of these hoses. “While Data Master MegaFlex was developed specifically for liquid cooling applications, the foundational technologies behind it reflect insights gained from hydraulic, industrial, and automotive systems,” he said.

“For example, the hose’s high flexibility construction is rooted in our work with industrial applications, while its chemical compatibility with glycol water and coolant fluids is informed by our history in automotive thermal management applications. This convergence of expertise allowed us to design a product that is well suited to an emerging set of technical demands.”

Bleday said Danfoss offers a mix of existing products and specially designed components for use in data centers. Where in a data center the components will be used is typically the determining factor. Specific hoses, couplings and other components are developed for use inside server racks, but moving further out from the servers to where the cooling and other equipment is located, he said, many of Danfoss’ existing industrial hydraulic hoses, fittings and refrigeration components can be utilized.

A Market of Open Specifications

Chopek said the data center market is a unique one because of its open specifications. Instead of competing specifications that are slightly different from one manufacturer to the next as is the case in industries such as automotive, those in the data center market “decided that instead of competing, it would be an open specification that everyone would agree on,” he explained.

An independent body known as the Open Compute Project (OCP) was formed. Representatives from each of the big companies working in the data center market sit on that body and develop the specifications that fluid power and other technologies used in data centers will need to meet.

“So, for a universal quick disconnect (UQD) coupling they’ll come up with the specs and say a UQD has to have this flow rate, this inner diameter, this outer diameter and this size footprint,” said Chopek.
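A supplier screening a design against such an agreed specification is essentially a bounds check on each published parameter. The sketch below is purely hypothetical: the field names and limits are invented for illustration and are not the real OCP UQD specification values.

```python
# Hypothetical illustration: the parameter names and limits below are
# INVENTED, not taken from the actual OCP UQD specification. It only shows
# the shape of checking a coupling design against an open spec.

EXAMPLE_SPEC = {
    "min_flow_lpm": 13.0,              # assumed minimum flow rate
    "inner_diameter_mm": (4.8, 5.2),   # assumed tolerance band
    "outer_diameter_mm": (9.8, 10.2),  # assumed tolerance band
}

def conforms(design):
    """Return True if a candidate design falls within every spec limit."""
    ok = design["flow_lpm"] >= EXAMPLE_SPEC["min_flow_lpm"]
    for key in ("inner_diameter_mm", "outer_diameter_mm"):
        lo, hi = EXAMPLE_SPEC[key]
        ok = ok and lo <= design[key] <= hi
    return ok

print(conforms({"flow_lpm": 15.0, "inner_diameter_mm": 5.0,
                "outer_diameter_mm": 10.0}))   # True
```

Because every supplier checks against the same published limits, any vendor's plug should mate with any other vendor's socket, which is the interoperability goal Bleday describes below.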

Bleday equates OCP to ISO or SAE in terms of it acting as a joint standards group for the data center industry. “We're standardizing connections between components so that competitors’ plug and socket on a quick disconnect can still go together and still perform to the needed function.”

Component suppliers can submit drawings for their products and if they meet the specifications set out by the OCP body, they can become a member and thus be seen as a qualifying supplier of technologies that will meet the needs of data center applications.

Overall, quality is the most important specification to meet, said Chopek. Fluid power companies looking to serve the data center market need to demonstrate their products will be leak free and can meet all temperature, pressure and longevity requirements.

Danfoss, Gates and Parker Hannifin are all OCP members, enabling them to also bring their fluid power expertise to the discussion to ensure the standards put forth can feasibly be achieved without compromising desired performance.

Future Growth Potential in Data Centers

Haen said that as the adoption of AI and machine learning accelerates, data centers are facing unprecedented thermal management challenges. “Traditional air-cooling methods are approaching their performance limits and liquid cooling is rapidly becoming the preferred solution for managing intense heat loads,” he said.

As such, fluid power components are expected to continue playing an important role in data center applications.

Read "How Fluid Power can Benefit from AI and Machine Learning" to learn about the use of AI in hydraulics and pneumatics.

Bleday agreed that as the data center industry moves toward higher computing power, liquid cooling will continue to become more prevalent. He said Danfoss sees long-term growth for the liquid cooled side of the market. “It has been growing for the last couple of years, and it doesn’t look to stop growing any time soon.”

He said the ability to move water glycol across chips and the efficiency of that cooling technique will drive a lot of decision making in the data center market. “There are new technologies out there such as immersion cooling…but even then you still need refrigerant, you still need those capabilities to cool,” said Bleday. “I think there will always be fluid conveyance and fluid power in these applications.”

Many of the chips in production now for data centers are 200 kW, but Chopek said the industry is already moving to 300 kW, and there are already programs in place using 400-kW options. Current liquid cooling systems will be able to meet the thermal management requirements of these higher-capacity chips, though they may bring about the need for higher flow rates, he said.

The industry is moving quickly, though, and “at some point, the chip density will be such that you have to go to refrigeration,” he said. Parker does offer some refrigeration products, but it’s a different market than fluid power, so some investment may be needed to ensure refrigeration products are capable of meeting the needs of data centers.

He sees liquid-cooled data centers remaining, though, just as there are still air-cooled facilities. The computing power provided by existing facilities is already being used, and so as each new cooling technology enters the market it is added on to what already exists.

“You can't make an air-cooled data center liquid cooled, and you can't make a liquid-cooled data center refrigeration cooled, so they kind of build on top of each other,” said Chopek.

“I would imagine that if 3 years from now, there's enough capacity on the liquid side, they may…close some of those air-cooled [facilities], but I think it [liquid cooling systems] will stick around as they make that transition to refrigeration for some period of time.”

It remains to be seen how cooling technology for data centers will progress in the coming years. Research firm IDTechEx is currently projecting continued growth for liquid cooling over the next 10 years. And a February 2025 report from Endeavor Business Media partner site Data Center Frontier indicates liquid cooling is a long-term trend for the data center market.

For the immediate term at least, liquid cooling appears to be the technology of choice for the data center market which will offer continued opportunities for hydraulic and pneumatic fluid conveyance technologies in the years to come.

About the Author

Sara Jensen | Executive Editor, Power & Motion

Sara Jensen is executive editor of Power & Motion, directing expanded coverage into the modern fluid power space, as well as mechatronic and smart technologies. She has over 15 years of publishing experience. Prior to Power & Motion she spent 11 years with a trade publication for engineers of heavy-duty equipment, the last 3 of which were as the editor and brand lead. Over the course of her time in the B2B industry, Sara has gained an extensive knowledge of various heavy-duty equipment industries — including construction, agriculture, mining and on-road trucks — along with the systems and market trends which impact them such as fluid power and electronic motion control technologies.

You can follow Sara and Power & Motion via the following social media handles:

X (formerly Twitter): @TechnlgyEditor and @PowerMotionTech

LinkedIn: @SaraJensen and @Power&Motion

Facebook: @PowerMotionTech