The DRAM shortage will stretch into 2027 amid the AI boom

Memory is once again at the center of the tech storm. The pressure exerted by new Artificial Intelligence data centers on the supply chain has created an increasingly noticeable gap between DRAM supply and demand—the type of memory used in everything from servers to laptops, mobile devices, and much of consumer electronics. The latest warning comes from Nikkei Asia: at the current pace, major manufacturers are expected to meet only about 60% of global DRAM demand by the end of 2027.

This figure matters both for what it signals to the industry and for what the market is already starting to feel. This is not a temporary squeeze or a bottleneck lasting a few months. According to the same report and estimates from Counterpoint Research, the problem is structural: balancing the market would require production to grow by 12% annually in 2026 and 2027, while current plans point to an increase of around 7.5%.
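The gap between those two growth rates can be sketched with a quick back-of-the-envelope calculation (an illustration only, not Counterpoint's model; it compounds both rates from the same normalized base and ignores how demand itself grows):

```python
# Compare DRAM bit output after two years at 12% annual growth (the pace
# Counterpoint says would balance the market) versus 7.5% (current plans).
base = 100.0  # normalized bit output at the end of 2025 (arbitrary unit)

needed = base * 1.12 ** 2    # output required after 2026 and 2027
planned = base * 1.075 ** 2  # output implied by current expansion plans

shortfall = (needed - planned) / needed  # fraction of required bits missing
print(f"needed: {needed:.1f}, planned: {planned:.1f}, gap: {shortfall:.1%}")
# → needed: 125.4, planned: 115.6, gap: 7.9%
```

Even a modest-looking difference in annual growth compounds into a meaningful shortfall over two years, which is why the report treats the imbalance as structural rather than cyclical.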

It’s not a lack of investment, but a matter of time

The industry is not standing still. Samsung has already announced mass production of HBM4 and the delivery of commercial products to clients, while Micron insists that its entire HBM output for 2026 is already committed in terms of price and volume. The American manufacturer has also purchased a new facility in Tongluo, Taiwan, to expand advanced DRAM capacity, including HBM. The issue is that this additional capacity won't arrive immediately: Micron expects the new site to contribute significant shipments starting in its fiscal year 2028.

SK hynix is also accelerating its schedule. According to reports from South Korea, the company has already begun large-scale DRAM production at its M15X plant in Cheongju this month. Still, this expansion alone appears insufficient to fully compensate for the global mismatch. Nikkei Asia, cited by several outlets with access to the report, states that among Samsung, SK hynix, and Micron, only about 60% of demand will be covered by the end of 2027.

This explains why shortages persist despite new factory announcements. Building a semiconductor plant is nothing like expanding a logistics warehouse. These are complex, heavily capital-intensive projects requiring specialized equipment and personnel, with lead times often measured in years rather than quarters. Micron itself acknowledges in its regulatory filings that these expansions are subject to delays due to equipment lead times, cleanroom space constraints, and difficulties aligning supply with actual demand growth.

The priority is no longer DRAM across the board, but memory for AI

The other key factor is the type of memory that is currently most profitable. Much of the new capacity being prepared is aimed at HBM, the high-bandwidth memory needed for modern GPUs and AI accelerators. This means that not every capacity expansion automatically translates into more conventional modules for PCs, mobiles, or consumer devices.

In fact, Micron explains in its latest 10-Q report that manufacturing HBM requires more wafers and more cleanroom space to produce the same number of bits as conventional DRAM at the same process node. This technical detail, seemingly minor outside the industry, helps explain why the AI boom is also stressing the rest of the memory market. When an increasing share of industrial resources is devoted to more complex, higher-margin data center products, the supply of general-purpose memory becomes more vulnerable.

The immediate result is a reallocation of priorities. Manufacturers are not stopping DRAM production but are shifting more capacity toward memory demanded by major AI projects. These clients also purchase far in advance. Micron has already sold out its 2026 HBM supply, and Samsung has rushed to market its new generation HBM4 for next-generation data centers. This strategy secures revenue visibility but also limits the relief that could reach the most price-sensitive segments.

Mobile, laptops, and consumer electronics will remain under pressure

For mobile and PC manufacturers, the outlook is less optimistic. In December, Counterpoint cut its global smartphone shipment forecast for 2026 by 2.1%, attributing the reduction to rising memory costs in the bill of materials. In January, the same firm forecast memory price increases of 40% to 50% across Q4 2025 and Q1 2026, with an additional hike of around 20% in Q2 2026.
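Those forecast hikes compound rather than simply add up. A quick illustration (assuming the increases apply sequentially to the same baseline price; actual contract prices vary by product and customer):

```python
# Compound Counterpoint's forecast memory price increases.
low, high = 1.40, 1.50   # +40% to +50% across Q4 2025 / Q1 2026
q2_2026 = 1.20           # additional ~20% hike in Q2 2026

cum_low = low * q2_2026    # 1.68 → +68% cumulative
cum_high = high * q2_2026  # 1.80 → +80% cumulative
print(f"cumulative increase: +{cum_low - 1:.0%} to +{cum_high - 1:.0%}")
# → cumulative increase: +68% to +80%
```

In other words, by mid-2026 memory could cost roughly 70% to 80% more than before the hikes began, which is the kind of pressure that shows up directly in a device's bill of materials.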

This does not mean retail memory will disappear entirely or that all products will see immediate, uniform price hikes. However, it suggests a scenario of high prices and irregular availability, especially in entry-level ranges and products with narrower profit margins. In other words, the impact of AI on memory will not only be felt in large data centers or in the financial statements of chipmakers but could also trickle down to end consumers through more expensive devices, less generous configurations, or slower upgrade cycles.

Beyond market noise, a fundamental shift is emerging in the memory economy. For years, DRAM was characterized by a cyclical business, with phases of abundance followed by sharp price drops. The AI explosion is disrupting that pattern because part of the demand is becoming a strategic, sustained necessity. Micron estimates that the HBM market could reach $100 billion by 2028, surpassing the total size of the DRAM market in 2024. If this forecast holds, memory will cease to be just another component and will become a major battleground in the AI era.

Therefore, the conclusion is not that investment is lacking—there is plenty of it. What is missing is time for that investment to translate into actual capacity and a proper balance between the memory demanded by large AI projects and that needed by other parts of the industry. Until that equilibrium is achieved, everything indicates the market will continue to oscillate between scarcity, high prices, and fierce competition for supply. For now, 2027 does not seem like a definitive solution date but rather the earliest horizon to start seeing some relief.

Frequently Asked Questions

What is DRAM and why is it so critical in the current crisis?
DRAM is the main memory used in servers, computers, mobiles, and many other devices. The current crisis matters because the expansion of AI is consuming industrial capacity that also requires this general-purpose memory, especially as manufacturers prioritize more advanced products like HBM.

Why is Artificial Intelligence increasing the cost of PC and mobile memory too?
Because major manufacturers are dedicating an increasing share of resources to HBM, the high-performance memory used in AI accelerators. Since HBM consumes more wafers and manufacturing space per bit, there is less margin to ease the supply of conventional DRAM.

When might the memory market start to normalize?
Current forecasts suggest that tensions could persist until at least 2027. Even new facilities announced by manufacturers will take time to produce significant volume, and for Micron, some of that capacity may not come online with full force until 2028.

Will there be a total shortage of RAM in stores?
Not necessarily. The most probable scenario is irregular availability and high prices rather than a complete disappearance. The impact will depend on segments, memory types, and how demand from AI-focused data centers evolves relative to traditional consumer use.

via: Nikkei
