Schneider Electric Strengthens Its Commitment to Liquid Cooling with New CDUs Designed for the “AI Era”

The rising density in data centers, driven by HPC (high-performance computing) workloads and, most notably, by the expansion of artificial intelligence, is forcing a rethink of something as basic as where and how heat is dissipated. In this context, Motivair by Schneider Electric has announced two new coolant distribution units (CDUs) designed specifically to address the thermal pressure created by modern accelerated computing deployments.

The new models, the MCDU-45 and MCDU-55, are presented as the industry’s first “purpose-built” CDUs optimized for installation in utility corridors, a location increasingly favored by operators who want to avoid encroaching on the white space with auxiliary equipment or sacrificing valuable IT space. The company says both units are available worldwide today, with production ramping up in early 2026. The timing is a direct nod to market urgency: every month, more AI projects surpass traditional density thresholds.

From the white space to service corridors: the silent shift

Liquid cooling has been growing for some time, but this announcement highlights a very specific trend: CDUs are moving out of the white space. In many AI factory or training-cluster designs, space constraints and maintenance accessibility have become critical variables. Placing a CDU in a technical corridor not only frees up usable floor space; it also allows service work to be carried out with less risk of disrupting IT operations or disturbing hot and cold aisle separation.

Motivair argues that this “deployment flexibility” responds to a new set of needs. AI infrastructure not only demands higher capacity; it also requires tighter integration of thermal support within the building, from chilled water plants to the secondary circuits that feed server cooling systems.

Expanded operational range for chilled water and enhanced efficiency

A central message of the announcement is that the new MCDU-45 and MCDU-55 expand capacity, functionality, and design conditions to support a broader chilled-water temperature range. Operationally, this opens the door to optimizing heat rejection and, in certain scenarios, reducing overall energy consumption (with the usual goal of improving PUE, or at least containing it as per-rack power densities climb).
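To make the PUE link concrete: PUE is total facility power divided by IT power, so anything that trims cooling overhead pulls the ratio toward 1.0. A minimal sketch of that arithmetic, using assumed figures rather than vendor data, might look like this:

```python
# Illustrative arithmetic only; all figures are assumptions, not vendor data.
# PUE = total facility power / IT power (standard definition).

IT_POWER_KW = 1000.0        # IT load (assumed)
OTHER_OVERHEAD_KW = 80.0    # power distribution, lighting, etc. (assumed)

def pue(cooling_kw: float) -> float:
    """Total facility power divided by IT power."""
    return (IT_POWER_KW + cooling_kw + OTHER_OVERHEAD_KW) / IT_POWER_KW

# Hypothetical effect of a warmer chilled-water setpoint: higher chiller
# efficiency and more free-cooling hours reduce average cooling power.
print(f"PUE with 300 kW of cooling: {pue(300.0):.2f}")  # 1.38
print(f"PUE with 180 kW of cooling: {pue(180.0):.2f}")  # 1.26
```

Warmer chilled water typically raises chiller efficiency and extends free-cooling hours, which is where the assumed drop in cooling power would come from.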

Additionally, the company emphasizes practical benefits: simpler maintenance with improved accessibility. In high-density data centers, servicing a cooling element can become delicate if it requires entering the compute environment. Moving equipment to service zones not only frees space but may also reduce operational friction.

A broader range for hyperscale, colocation, edge, and retrofit projects

Schneider Electric frames this launch within an end-to-end liquid cooling portfolio that combines floor-mounted CDUs and rack-level units. The aim, according to the company, is to serve everything from hyperscale deployments to colocation, edge, and especially retrofit projects—adapting existing rooms to AI workloads without rebuilding the data center from scratch.

The company also highlights compatibility with advanced thermal management strategies: precise flow control, real-time monitoring, and adaptive load balancing. Furthermore, the new units are positioned within a broader product family, from MCDU-25 to MCDU-60, supporting the idea that a “complete” range enables tailored thermal designs according to deployment type and demand profile.
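The announcement does not detail the control logic behind “precise flow control” and “adaptive load balancing,” but the idea can be illustrated with a deliberately simplified feedback loop; everything below (setpoint, gain, limits) is a hypothetical sketch, not Motivair firmware:

```python
# A deliberately simplified CDU-style control loop: hold the secondary
# supply temperature by modulating pump speed. All values are hypothetical;
# real units use tuned PID control, redundant sensors, and safety interlocks.

SETPOINT_C = 32.0                 # target secondary supply temperature (assumed)
KP = 4.0                          # proportional gain (assumed)
MIN_PCT, MAX_PCT = 20.0, 100.0    # pump speed limits in percent (assumed)

def next_pump_speed(measured_supply_c: float, current_pct: float) -> float:
    """Proportional control: raise pump speed when the supply runs warm."""
    error = measured_supply_c - SETPOINT_C
    return max(MIN_PCT, min(MAX_PCT, current_pct + KP * error))

speed = 50.0
for reading in (33.5, 32.8, 32.1, 31.9):  # simulated sensor readings
    speed = next_pump_speed(reading, speed)
    print(f"supply = {reading:.1f} °C -> pump speed = {speed:.1f} %")
```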

“Flexibility is key”: Schneider’s clear message

Andrew Bradner, senior vice president of cooling at Schneider Electric, summarizes the release’s core message in one phrase: “Flexibility is key” when it comes to liquid cooling in data centers. Customers, in his view, are demanding more diverse portfolios that better fit varied deployment strategies. The message is deliberate: in AI, infrastructure is often “custom-designed,” and operators value options that don’t constrain the building’s architecture.

Rich Whitmore, CEO of Motivair, echoes this point from another angle: the goal is to provide “next-generation” solutions capable of adapting to any HPC or AI deployment with scalability and reliability when it matters most. In a sector where availability is paramount and downtime costs can skyrocket, the subtext is clear: cooling is no longer just a subsystem; it is at the core of operations.

A strategic announcement: Motivair’s acquisition and the 2026 timetable

The announcement also signals continuity following Schneider Electric’s acquisition of a controlling stake in Motivair, with these CDUs presented as the first new products announced since that move. The timing is notable: as operators accelerate their AI plans, they also demand suppliers with the industrial muscle to scale manufacturing, support, and supply chains.

Moreover, the indication that production will ramp up in early 2026 sends a clear message: demand exists, but industrial lead times matter. In liquid cooling, as with chips, the bottleneck may lie in manufacturing and deployment capacity keeping pace with the market.

FAQs

What is a CDU, and why is it key in liquid cooling for AI and HPC?
A coolant distribution unit manages the secondary liquid cooling loop that feeds IT equipment (for example, direct-to-chip systems), controlling flow rate and temperature and transferring heat to the building’s primary circuit, typically via a heat exchanger that keeps the two loops isolated.
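As a back-of-the-envelope illustration of the quantities a CDU manages, the basic heat balance Q = ṁ · c_p · ΔT relates heat load, coolant flow, and temperature rise; the figures below are assumptions for illustration, not product specifications:

```python
# Back-of-the-envelope heat balance: Q = m_dot * c_p * delta_T.
# All figures are assumptions for illustration, not product specifications.

RACK_LOAD_KW = 100.0   # heat load of one high-density AI rack (assumed)
C_P = 4.18             # specific heat of water, kJ/(kg·K)
DELTA_T = 10.0         # coolant temperature rise across the loop, K (assumed)

m_dot = RACK_LOAD_KW / (C_P * DELTA_T)  # required mass flow, kg/s
liters_per_min = m_dot * 60             # ~1 kg of water per liter
print(f"Required flow: {m_dot:.2f} kg/s (~{liters_per_min:.0f} L/min)")
```

Halving the allowed temperature rise would double the required flow, which is why flow control and temperature setpoints are managed together.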

What are the benefits of installing a CDU in a “utility corridor” rather than in the white space?
It frees up IT space, enhances service accessibility, and reduces interference with computing operations—especially relevant in high-density deployments.

When will the new MCDU-45 and MCDU-55 be fully available?
Motivair by Schneider Electric indicates global availability with increased production scheduled for early 2026.

What types of data centers are these new CDUs designed for?
They target hyperscale, AI, colocation, edge, and retrofit projects that require more flexible thermal strategies.

via: Schneider Electric
