Samsung Changes Course: Scales Back HBM and Recommits to DDR5, LPDDR5, and GDDR7

Samsung has decided to hit the brakes in the race for HBM3E memory for Artificial Intelligence and to refocus on territory it knows well: general-purpose DRAM for PCs, laptops, and servers. After months of shifting production lines to HBM to compete head-to-head with SK Hynix, the South Korean company has begun a strategic pivot, allocating more capacity to DDR5, LPDDR5, and GDDR7 right in the middle of a memory price upcycle.

The move has surprised the industry. On paper, HBM3E is the star product of the moment: it is the memory that feeds AI GPUs from NVIDIA, AMD, and other players, with demand still soaring. However, the economics behind each chip have forced Samsung to recalibrate its priorities.

From Betting Everything on HBM… to Returning to “Old” DRAM

Just over a month ago, Samsung was shifting many of its DDR5, LPDDR5, and GDDR7 lines toward HBM manufacturing, trusting that the AI boom and contracts with major clients like NVIDIA would make the gamble pay off. The goal was clear: regain ground on SK Hynix, which currently dominates the HBM market with over half of the global share, approaching 60% in some quarters.

But the plan hasn't panned out as expected. HBM3E is complex to produce: it requires more stacked layers, advanced packaging, and strict quality control. On top of that, the margins Samsung earns on it are much narrower than anticipated, partly because it has had to offer aggressive pricing to win large contracts where SK Hynix held the advantage.

Analysis of industry leaks suggests that Samsung has been selling HBM3E at operating margins of around 30%, whereas general-purpose DRAM (DDR5, LPDDR5, GDDR7) can exceed 60% margins in the current environment of scarcity and rising prices. In other words, "normal" memory for PCs and servers is, today, roughly twice as profitable per unit of capacity as the sophisticated HBM built for AI.
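
As a rough back-of-envelope illustration of why those margin figures flip the calculus, here is a minimal sketch in Python. The revenue-per-wafer figure is a hypothetical placeholder (not a Samsung number), and the simplification assumes comparable revenue per wafer for both product families, which is the assumption behind the "twice as profitable per unit of capacity" framing.

```python
# Back-of-envelope comparison of operating profit per wafer.
# The revenue figure is a hypothetical placeholder, not Samsung data;
# only the margin percentages come from the reporting above.

def profit_per_wafer(revenue_per_wafer: float, operating_margin: float) -> float:
    """Operating profit earned on one wafer of output."""
    return revenue_per_wafer * operating_margin

REVENUE = 10_000  # hypothetical revenue per wafer, assumed equal for both cases

hbm3e_profit = profit_per_wafer(REVENUE, operating_margin=0.30)  # ~30% on HBM3E
dram_profit  = profit_per_wafer(REVENUE, operating_margin=0.60)  # ~60% on DDR5/LPDDR5/GDDR7

print(f"HBM3E:        {hbm3e_profit:,.0f} per wafer")
print(f"General DRAM: {dram_profit:,.0f} per wafer")
print(f"Ratio:        {dram_profit / hbm3e_profit:.1f}x")  # 2.0x under these assumptions
```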

A 180-Degree Turn Amidst the Historic RAM Boom

Samsung’s new plan involves converting between 30% and 40% of its production capacity from node 1a to node 1b, focusing on DDR5, LPDDR5X, and GDDR7. This shift would free up approximately 80,000 wafers per month for general-purpose DRAM, according to internal estimates cited by industry media.

The market context supports this: during 2025, memory prices have seen a historic surge. Some analyses indicate that mainstream DDR5 modules already cost at least twice as much as they did at mid-year, and contract prices for DRAM have risen more than 170% from their cycle lows.
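
To translate those percentages into plain multipliers (using only the figures cited above, no new data), a quick sketch of the arithmetic:

```python
# Turn the cited price increases into "times the earlier price" multipliers.
ddr5_spot_increase     = 1.00  # "at least twice as much as mid-year" => +100%
dram_contract_increase = 1.70  # "risen over 170%" from cycle lows

print(f"DDR5 modules vs mid-year:    {1 + ddr5_spot_increase:.1f}x")      # 2.0x
print(f"DRAM contract vs cycle lows: {1 + dram_contract_increase:.1f}x")  # 2.7x
```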

The timing also plays into Samsung's strategy: with several forecasting firms projecting that HBM3E prices will fall by more than 30% after 2026, competing against SK Hynix through price cuts becomes an uphill battle. Strengthening its position in DDR5, LPDDR5, and GDDR7 instead allows Samsung to better capitalize on the DRAM market's upward cycle.

What Does This Mean for PCs, Laptops, and Graphics Cards?

The core message is clear: Samsung will devote more resources to the memory products that reach "ordinary" users, even if they do so indirectly, through device manufacturers and GPU integrators.

  • DDR5 for desktops and servers: increased production capacity should alleviate, at least temporarily, shortages and curb the most aggressive price hikes. However, 80,000 wafers per month for the global DRAM market is still a drop in the ocean, so a price collapse isn’t expected—only perhaps a short-term breather.
  • LPDDR5/LPDDR5X for laptops and mobile devices: Samsung’s decision also benefits low-power memory used in notebooks, tablets, and other portable devices. As manufacturers raise base memory configurations to meet AI and OS demands, ensuring supply with good margins is crucial.
  • GDDR7 for next-gen GPUs: Samsung was among the first to announce GDDR7 for upcoming high-end graphics cards, promising performance increases of up to 40% over GDDR6. Shifting more capacity to GDDR7 positions Samsung better for the arrival of new GPU generations from NVIDIA, AMD, and Intel.

In summary: in the short term, we may see a bit more DDR5 module stock and some price stabilization. But with the memory market still in a clear uptrend, any decline is likely to be a short-lived mirage lasting just a few months.

The HBM Paradox: Selling Everything… and Still Suffering

The most striking part of this story is that Samsung is selling almost all the HBM3E it can produce, especially after clearing the stringent technical qualifications of NVIDIA and other major clients. Yet the business still doesn't seem to add up.

While SK Hynix enjoys a dominant position in HBM, with higher prices and a better cost structure, Samsung has had to enter the market offering discounts that eat into its margins. Now, with the average price of HBM3E projected to fall by more than 30% in the coming years and only two major players left standing, the question is simple: does it make sense to keep committing capacity to a product that yields less profit than traditional DRAM?

Several industry executives in South Korea have been frank: in the current environment, manufacturing HBM3E can amount to a "paper loss" compared with dedicating the same lines to DDR5, LPDDR5, or GDDR7. High-bandwidth memory remains strategically critical, since it keeps Samsung in the high-performance AI conversation, but it is no longer the main driver of profitability.

A Course Correction That Will Shape the Market Through 2026

The strategic reversal runs so deep that Samsung is not just converting HBM lines back to DRAM; it is also reallocating NAND Flash capacity at its major Pyeongtaek and Hwaseong campuses toward DDR5, LPDDR5, and GDDR7. The message to the market is clear: in 2026, the priority will be to make the most of the DRAM uptrend.

For end users and businesses, this will mean:

  • More memory module availability, reducing stockouts for certain models.
  • Less extreme short-term price volatility, with more gradual increases.
  • But, overall, memory prices will remain high in 2026, supported by solid demand from PCs, servers, AI data centers, and portable devices.

The only upside is that scarcity shouldn’t be as severe as during recent cryptocurrency mining booms, which saw resale prices soar to absurd levels. Now, the challenge is more structural: memory has become a critical resource in the AI economy, prompting manufacturers to act accordingly.

Frequently Asked Questions About Samsung’s Memory Strategy Shift

Why is Samsung reducing its bet on HBM3E when AI demand is skyrocketing?
Because despite high demand, Samsung’s profit margins on HBM3E are lower than with DDR5, LPDDR5, and GDDR7. Aggressive pricing to compete with SK Hynix and forecasts of falling HBM prices make general-purpose DRAM more profitable per unit of capacity today.

Will DDR5 RAM prices for PCs and servers drop?
A slight inventory buildup and some price stabilization may occur in the short term thanks to increased manufacturing capacity. However, the DRAM market remains in a bull phase, with continued growth in AI, data centers, and next-gen PC memory—so prices are likely to stay high through 2026.

What will be the impact on GPUs and GDDR7 memory?
By allocating more capacity to GDDR7, Samsung aims to better supply next-generation GPU makers. This could help prevent a severe shortage of graphics memory in upcoming cards, although it does not guarantee lower prices, as final costs will also depend on demand from gaming and AI markets.

Does this mean Samsung is abandoning the HBM race for AI?
Not entirely. Samsung continues to produce HBM3E and maintains agreements with major AI clients, but it is no longer willing to devote as much capacity to a product with narrower margins. The company seeks a balance: remaining relevant in high-bandwidth memory for AI while maximizing profitability with DDR5, LPDDR5, and GDDR7.

Sources: elchapuzasinformatico
  • Market analysis of the HBM market and SK Hynix's share.
  • Recent reports on DDR5 cost escalation and DRAM price increases in 2025.
  • Samsung's GDDR7 product information and roadmap.
