The data center switch market for AI infrastructure will exceed $100 billion.

The exponential growth of artificial intelligence (AI) has driven demand for specialized data center infrastructure. According to a recent report from market research firm Dell’Oro Group, the market for AI back-end network switches is expected to exceed $100 billion over the next five years. The forecast points to a shift in networking technology, with Ethernet adoption rising and InfiniBand losing prominence.

Ethernet Gaining Ground in AI Data Centers

Currently, InfiniBand switches dominate AI back-end networks, but analysts expect Ethernet deployments to grow in the coming years. Dell’Oro Group’s latest projection update suggests that Ethernet will keep consolidating its position as the primary interconnect technology in large AI clusters.

Sameh Boujelbene, vice president at Dell’Oro Group, explained that despite some uncertainty about the future of accelerated infrastructure, demand for high-performance hardware remains strong. “As more AI clusters adopt Ethernet as their primary infrastructure, the transition from InfiniBand to Ethernet will occur a year earlier than our previous estimates,” Boujelbene said.

Even large-scale clusters built on Nvidia GPUs, such as xAI’s Colossus, are opting for Ethernet interconnects, reinforcing the technology’s growing acceptance in the industry.

Providers and Changes in Market Dynamics

In 2024, the AI switch sector was dominated by Celestica, Huawei, and Nvidia, but competition is expected to intensify in 2025. Companies such as Accton, Arista, Cisco, Juniper, and Nokia could gain market share and reshape market dynamics.

In addition, Hewlett Packard Enterprise (HPE) has announced a multibillion-dollar acquisition of Juniper Networks, although the deal faces opposition from the U.S. Department of Justice, which could affect the structure of the market in the short term.

Increase in the Capacity and Speed of Switches

AI workloads demand ever-faster connectivity. According to Dell’Oro Group, most switch ports used in AI back-end networks will reach 800 Gbit/s in 2025, 1,600 Gbit/s in 2027, and up to 3,200 Gbit/s by 2030. This increase in capacity will allow data centers running large-scale AI models to handle more intensive workloads efficiently.
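To put those per-port speeds in perspective, the short Python sketch below computes the aggregate capacity of a single switch at each projected speed. The 64-port radix is an illustrative assumption, not a figure from the Dell’Oro report.

```python
# Illustrative arithmetic only: aggregate capacity of a hypothetical
# 64-port fixed switch at the per-port speeds Dell'Oro projects for
# AI back-end networks. The port count is an assumption for the example.

PORTS = 64  # assumed radix of a single fixed-configuration switch

port_speed_gbps = {
    2025: 800,
    2027: 1_600,
    2030: 3_200,
}

for year, gbps in port_speed_gbps.items():
    aggregate_tbps = PORTS * gbps / 1_000  # Gbit/s -> Tbit/s
    print(f"{year}: {PORTS} x {gbps} Gbit/s -> {aggregate_tbps:.1f} Tbit/s per switch")
```

For reference, 64 ports at 800 Gbit/s works out to 51.2 Tbit/s, the capacity class of current-generation switch silicon; the 2030 projection would quadruple that per device.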

Cloud Giants Leading Demand

The primary customers for these switches will continue to be Tier-1 cloud service providers, which account for the bulk of demand. However, Dell’Oro Group has revised upward its forecast for AI infrastructure adoption among Tier-2 and Tier-3 companies as well as large enterprises, driven by the rapid expansion of AI models into sectors outside the technology industry.

With the explosion of generative AI and the growth of specialized data centers, investment in network infrastructure will continue to rise. Ethernet is emerging as the dominant standard in AI back-end networks, and companies in the sector are preparing for a rapidly evolving market in the coming years.
