2025, the Year the Industry Ran Out of Brakes: Factories, Mega-Clusters, and the Shadow of a Bottleneck

The computing, storage, and networking sector wrapped up 2025 with an unusual sensation, even for an industry accustomed to rapid change: that of living through several “historic moments” at once. The year generated headlines ranging from the return of semiconductor manufacturing to the U.S. to a cascade of multibillion-dollar deals centered on Artificial Intelligence, while quantum computing advanced amid promises, skepticism, and new public investments.

But the most uncomfortable plot twist came at year’s end: looking ahead to 2026, the debate shifted beyond simply producing more processors. The industry faces a potential capacity bottleneck in memory and storage chips, with supply of those components already considered virtually committed for the next two years. In other words: no matter how many factories and data centers are built, the supply chain has weak points, and not all of them involve CPUs or GPUs.

The “fab more” push: bringing chip manufacturing back to the U.S.

One of the year’s key themes was the push for domestic manufacturing in the United States, initially sparked by the CHIPS and Science Act and continued throughout 2025 under the new political landscape in Washington. The prevailing narrative became that of an economy aiming to produce more within its borders—though this resilience comes at a cost.

Among the most notable announcements in sheer scale was Apple’s $600 billion U.S. manufacturing program. A revealing detail emerged during the review: Apple reportedly did not receive funding from the CHIPS Act, which suggests the decision was driven more by pragmatism than by industrial grandeur. The looming threat of tariffs on products manufactured in China, where Apple assembles most of its iPhones, acts as a pressure point pushing the company to relocate parts of its supply chain.

The program hinges on a network of industrial partners including Applied Materials, Amkor, Broadcom, Corning, Coherent, GlobalFoundries, GlobalWafers America, Samsung, Texas Instruments, and TSMC, with a promise of 20,000 jobs in the U.S. focused mainly on R&D, silicon engineering, software development, AI, and machine learning.

TSMC also set a clear marker: in March, it announced plans to invest an additional $100 billion in its U.S. manufacturing efforts, on top of the $65 billion it had already committed to its Phoenix (Arizona) site. Micron raised its planned investment to around $200 billion, an increase of $30 billion over previous plans. The symbolic “made in Arizona” chips arrived when AMD and Nvidia confirmed that their designs were already being produced at that plant.

However, optimism is tempered by costs: in July, AMD CEO Lisa Su stated that manufacturing in Arizona would be between 5% and 20% more expensive than in Taiwan. The defining phrase of the year, almost an industry confession, was that paying this premium is seen as “a good investment” to ensure resilience.
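To give a rough sense of what that range means in practice, the short sketch below (Python) applies a 5%, 10%, and 20% surcharge to a hypothetical baseline wafer cost. The $20,000 baseline is an assumption chosen purely for illustration; it is not a figure from AMD, TSMC, or the report.

```python
# Hypothetical illustration of the 5-20% Arizona cost premium cited by Lisa Su.
# BASELINE_WAFER_COST_USD is an assumed figure for this example, not a
# published TSMC or AMD number.
BASELINE_WAFER_COST_USD = 20_000  # assumed cost of an advanced-node wafer made in Taiwan

def premium_cost(baseline: float, premium_pct: float) -> float:
    """Return the absolute surcharge implied by a percentage premium."""
    return baseline * premium_pct / 100

for pct in (5, 10, 20):
    extra = premium_cost(BASELINE_WAFER_COST_USD, pct)
    print(f"{pct:>2}% premium -> +${extra:,.0f} per wafer "
          f"(total ${BASELINE_WAFER_COST_USD + extra:,.0f})")
```

Even at the top of that range, the per-wafer surcharge is the price the industry says it is willing to pay for resilience, which is essentially the “good investment” argument.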

The state as investor: stakes and conditions

Another prominent sign of the changing era was increased state intervention: not only as regulator or financier, but also as shareholder. Notably, in August 2025, U.S. Secretary of Commerce Howard Lutnick indicated that the government would seek to take stakes in companies benefiting from CHIPS Act funding. Soon after, a 9.9% stake in Intel was announced, backed by a public investment of $8.9 billion: $5.7 billion from CHIPS Act-related funds and $3.2 billion from the Secure Enclave program.

Intel, which might have been expected to capitalize on the “chip boom,” instead had a turbulent year: a leadership change that installed Lip-Bu Tan as CEO, a restructured executive team, thousands of layoffs, the cancellation of fab plans in Europe, and even an investment from longtime rival Nvidia.

Meanwhile, similar moves occurred along the supply chain: the Department of Defense acquired a 15% stake in MP Materials (rare earth magnets), and the Department of Commerce reached an agreement with startup xLight, investing $150 million for an undisclosed equity position.

The AI economy: chain deals and dizzying figures

If 2024 was the year of AI’s popularization, 2025 was marked by industrial-scale funding. The landscape featured OpenAI and Nvidia at the center of a carousel of agreements. The year kicked off with Stargate: a $500 billion project announced alongside Oracle, SoftBank, and MGX (Abu Dhabi) to build large data center campuses in the U.S. over four years, with subsequent extensions to other regions.

From there, the market accelerated rapidly: cloud capacity acquisitions, energy commitments, cross-investments, and gigawatt deployment promises. The cycle is summed up almost ironically: “Microsoft invests in OpenAI; OpenAI buys chips from Nvidia; Nvidia invests in OpenAI; Nvidia sells chips to Oracle; OpenAI buys capacity from Oracle; Oracle invests in OpenAI.” Along the way, AMD, Broadcom, and CoreWeave entered into deals involving equity stakes, custom hardware development, and deployments planned through 2029.

The contrast is clear: OpenAI was valued at $500 billion after an October share sale in which current and former employees sold $6.6 billion in stock, yet it remains unprofitable. That raises questions about how the company will fund its commitments, even with projections targeting $200 billion in revenue by 2030. In November, HSBC analysts estimated it would need an additional $207 billion in funding before the end of the decade to sustain its expansion plans.

Debates about a “bubble” persisted. IBM CEO Arvind Krishna questioned the profitability of gigawatt-scale data centers, while Nvidia’s Jensen Huang responded with a line that captured the year’s tone: poor quarterly results would be “proof” of a bubble, while good ones would “fuel” it. Nvidia, which according to the report neared $5 trillion in market capitalization, announced record quarterly revenue of $57 billion, up 62% year-on-year.

Quantum promises advance amid plans and skepticism

In quantum computing, 2025 did not resolve the fundamental questions (when practical utility will arrive, whether quantum advantage has truly been demonstrated, which approach will dominate), but it did deepen institutional commitment. The review notes that IBM remains skeptical of some claims of “supremacy,” though it expects to reach a key milestone by late 2026 and aims for a fault-tolerant system by 2029.

Governments accelerated their efforts: DARPA selected 11 companies for Phase B of its Quantum Benchmarking Initiative, which aims to evaluate whether building a fault-tolerant quantum computer within a decade is feasible and whether any approach can reach practical operation by 2033. Canada announced over CAD $74 million (about USD $52 million) for quantum initiatives, while the UK committed £121 million (around USD $160 million) and strengthened international collaborations. In Europe, EuroHPC launched its first two quantum systems: PIAST-Q in Poland and VLQ (built by IQM) in the Czech Republic.

2026, with storage and memory as points of tension

The year closes with an uncomfortable realization: the industry can build factories and promote AI as the next layer of everything, but if storage and memory become bottlenecks, growth will depend less on who has the “best model” and more on who can secure supply, capacity, and delivery dates. After a record-breaking 2025, 2026 looks set to be the year in which the supply chain determines how much of that enthusiasm can actually translate into operational infrastructure.


Frequently Asked Questions

Why is there talk of a “memory and storage bottleneck” in 2026?
Because sector-wide assessments point to increasing pressure on memory chips and storage technologies, with part of the supply already committed years in advance due to AI demand and large data centers.

What are the implications of manufacturing chips in Arizona if it’s up to 20% more expensive?
It means additional costs that companies accept as a resilience premium: domestic production, reduced geopolitical dependence, and a closer supply chain—though with direct impacts on margins and prices.

What is Stargate, and why is it cited as a symbol of 2025?
It’s described as a $500 billion project to deploy large-scale data center campuses, representing AI’s shift from software toward massive physical investments in energy and infrastructure.

What does it mean that governments invest in quantum and select companies for programs like DARPA’s?
It indicates that quantum computing is considered strategic, and that states aim to assess—using comparable criteria—whether any technology can scale toward fault-tolerant systems with practical utility within the next decade.

References: Data Center Dynamics
