The race for dominance in the high-bandwidth memory (HBM) market is entering a new chapter. Samsung Electronics, which has lost ground to SK hynix and Micron in recent years, appears to be preparing an unprecedented offensive to regain prominence. According to South Korean industry sources, the company has decided to play aggressively in the HBM4 arena, even if that means entering a full-blown price war that squeezes its margins to the limit.
The goal is clear: to secure a place in NVIDIA’s supply chain, the largest consumer of HBM worldwide due to the rise of its artificial intelligence chips. With the next-generation “Rubin” GPUs on the horizon, the supplier that manages to establish a solid partnership with NVIDIA will secure multimillion-dollar contracts and global visibility. Samsung does not want to be left out.
A high-risk strategy: more volume, even if profit margins decrease
South Korean media, particularly The Bell, has revealed that Samsung is producing an unusually high volume of HBM4 samples: around 10,000 test wafers, a figure that far exceeds industry standards. The reason is simple: manufacturing yields (the share of chips per wafer that come out usable rather than defective) are still not optimal.
Samsung has decided to absorb this extra cost and manufacture more than necessary to ensure they have enough “good” units when it’s time to compete for contracts. According to local analysts, the company “is pushing the machine to compensate for an initially lower performance compared to rivals.”
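A rough back-of-the-envelope model shows why low yields force this kind of over-production. The figures below are purely illustrative assumptions, not numbers reported for Samsung or its rivals:

```python
# Illustrative sketch: wafers required to reach a target number of good HBM4
# stacks at different yields. All inputs are hypothetical placeholders.
import math

def wafers_needed(target_good_stacks: int, stacks_per_wafer: int, yield_rate: float) -> int:
    """Wafers required so the expected count of good stacks meets the target."""
    good_per_wafer = stacks_per_wafer * yield_rate
    return math.ceil(target_good_stacks / good_per_wafer)

target = 50_000          # hypothetical number of qualified stacks needed for sampling
stacks_per_wafer = 30    # hypothetical gross stacks per wafer

for y in (0.4, 0.6, 0.8):  # hypothetical yield scenarios
    print(f"yield {y:.0%}: {wafers_needed(target, stacks_per_wafer, y):,} wafers")
```

Under these placeholder assumptions, a 40% yield requires roughly twice as many wafers as an 80% yield to end up with the same number of qualified stacks, which is the logic behind Samsung's oversized sample run.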
The technical challenge also works against it: while SK hynix and Micron already use fifth-generation (1b) DRAM in their HBM3E and HBM4 production, Samsung has opted for a bigger leap by moving straight to sixth-generation (1c) DRAM. That process is not yet mature, which explains the lower initial yields.
Nevertheless, Samsung is confident that its advantage in EUV lithography processes and its large-scale production capacity will eventually tip the balance in its favor.
A market where Samsung is lagging behind
In the HBM segment, the South Korean firm has lost momentum. SK hynix clearly leads, having been the first to deliver HBM4 samples to NVIDIA in March 2025, followed by Micron in June. Samsung, according to estimates, began shipments around July and remains behind its competitors.
Even more telling, Samsung has not yet completed validation of its HBM3E, while its rivals did so months ago. However, the industry consensus is that HBM3E is just an intermediate phase; the real battle will be fought over HBM4, and that is where Samsung aims to re-engage.
The price card: prepared for an open war
In addition to increasing sample volumes, Samsung is preparing what some experts call a “desperate move”: lowering prices to the limit, even with minimal margins, to secure a place among NVIDIA’s suppliers.
The landscape is complex. Producing HBM4 is more expensive than HBM3E, due to factors such as the outsourced base die, fewer dies per wafer, and a higher I/O count. Sector estimates suggest that a 12-layer HBM4 stack could cost between 60% and 70% more than an equivalent HBM3E.
Current leader SK hynix aims to maintain a 30% to 40% premium on selling prices. Meanwhile, NVIDIA is pushing to cut costs, aware that its suppliers are competing fiercely in a market where the company has the strongest voice. This divergence has delayed negotiations with SK hynix and opens the door for Micron and Samsung to position themselves as alternatives.
In this context, Samsung plans to offer much lower prices, with premiums below 20%, which in practice means accepting margins close to zero.
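A simple sketch of the unit economics, using the percentage figures cited above and a purely hypothetical HBM3E baseline price and cost, shows how quickly the margin evaporates:

```python
# Illustrative margin sketch based on the percentages cited in the article.
# The HBM3E baseline price and cost are hypothetical placeholders.
hbm3e_price = 100.0              # normalized HBM3E selling price (assumption)
hbm3e_cost = 70.0                # assumed HBM3E unit cost (placeholder)
hbm4_cost = hbm3e_cost * 1.65    # HBM4 costs roughly 60-70% more to make (midpoint)

for label, premium in [("~35% premium (SK hynix's reported target)", 0.35),
                       ("<20% premium (Samsung's reported plan)", 0.18)]:
    price = hbm3e_price * (1 + premium)          # HBM4 price = HBM3E price plus premium
    margin = (price - hbm4_cost) / price
    print(f"{label}: price {price:.0f}, gross margin {margin:.0%}")
```

Under these placeholder assumptions, a roughly 35% premium still leaves a double-digit gross margin, while a premium under 20% leaves only a couple of percentage points, consistent with the "margins close to zero" scenario described above.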
A play reminiscent of the DRAM “price war”
This isn't the first time Samsung has adopted such a strategy. In past downturns in the DRAM market, the company competed aggressively on price, driving Japanese and Taiwanese manufacturers out of the game.
With deeper financial resources and greater installed capacity, Samsung outlasted its rivals in that "whoever endures wins" dynamic, emerging stronger while competitors could not absorb the losses.
Now, industry analysts see clear parallels. "Samsung can afford to sacrifice margins in the short term, something SK hynix or Micron cannot do as comfortably," industry insiders explain. The question is whether this approach will work again in a market as concentrated as HBM.
Political factors and high-level contacts
Internal support is also decisive. According to South Korean sources, Samsung's offensive has the direct backing of its executive chairman, Lee Jae-yong, and is led by Jun Young-hyun, vice chairman and head of the company's semiconductor division.
The relationship with NVIDIA is being carefully managed. On a recent trip to the United States, Lee met with NVIDIA CEO Jensen Huang in an encounter that included a symbolic gesture: the two executives were photographed embracing, a show of rapport that did not go unnoticed in global industry circles.
The message is clear: Samsung is ready to put everything on the line to become a strategic supplier for NVIDIA in the era of artificial intelligence.
Risks and consequences of the strategy
However, this bold move is far from risk-free. The main concerns include:
- Compromised profitability: Selling HBM4 with minimal margins could hurt the bottom line if quick results aren’t achieved.
- Fierce competition: SK hynix holds an advantage, and Micron is also aiming to secure a foothold, which could turn the market into an unsustainable price war.
- Dependence on NVIDIA: Focusing efforts on a single client may leave Samsung vulnerable if circumstances change.
- Technical challenges: The adoption of immature sixth-generation DRAM might slow product validation.
If the gamble pays off, Samsung could rejoin the HBM elite and secure contracts with the world's most influential GPU maker.
Conclusion: gamble to win?
Samsung finds itself at a crossroads. It can either regain ground in the HBM4 segment through an aggressive production and pricing strategy or risk falling behind SK hynix, which currently leads comfortably, and Micron, which is steadily advancing.
The South Korean company seems to have chosen the former: to risk everything, including profitability, to re-establish itself as a key player. History shows Samsung can endure these attritional battles, but the global AI landscape is far more complex than the DRAM wars of a decade ago.
The outcome will depend on whether its HBM4 offensive convinces NVIDIA and finally allows it to enter a supply chain that currently dictates who wins and who loses in the era of artificial intelligence.
Frequently Asked Questions (FAQ)
What is HBM memory, and why is it so important for AI?
HBM (High Bandwidth Memory) is a high-performance, 3D-stacked memory that offers significantly higher bandwidth than conventional DRAM. It is critical for AI chips and cutting-edge GPUs, where data access speed is vital.
Why is NVIDIA the key customer in this market?
NVIDIA accounts for most HBM demand, as its data-center AI GPUs, such as the H100 and the upcoming Rubin series, rely on this memory to maximize performance. Securing a contract with NVIDIA means access to large volumes and revenue stability.
What’s the difference between HBM3E and HBM4?
HBM3E is an enhanced version of HBM3, while HBM4 is the next generation of the HBM standard and represents a significant leap in density, efficiency, and performance. Its adoption will define the next wave of AI chips.
What risks does Samsung face with this strategy?
The main risk is financial: lowering margins to compete on price may hurt profitability. On top of that, the still-immature 1c DRAM process and the need to convince NVIDIA that its HBM4 is competitive against SK hynix and Micron pose additional hurdles.
via: thebell.co.kr