Samsung Electronics is heading toward one of the best quarters in its recent history, driven by the memory business amid a feverish boom in data centers and artificial intelligence. In its preliminary estimate for the fourth quarter of 2025, the company reported consolidated sales of around 93 trillion won (a range of 92–94 trillion) and operating profit of roughly 20 trillion won (19.9–20.1 trillion). The figure is a significant leap, and it once again highlights an uncomfortable reality for the rest of the industry: DRAM has become a “strategic” component, and its pricing is stressing the entire supply chain.
The market’s interpretation is straightforward: when memory prices go up, everything else either rises or gets squeezed. In PCs and mobile, that means pressure on margins; in data centers, it means budgets rewritten on the fly; and in AI, where the race for performance never stops, the sector seems more willing than the end consumer to pay this “toll.”
DRAM: From Invisible Component to Critical Asset
For years, RAM was treated as just another line item in the bill of materials (BOM) for many manufacturers. That has changed for two main reasons: demand and production priorities.
On the demand side, AI infrastructure — training, inference, real-time analytics — has sent the appetite for memory skyrocketing. Not just in volume but in type: alongside conventional DRAM (DDR for servers and PCs), HBM (High Bandwidth Memory) has become the fuel of AI accelerators. And here the market is even more sensitive: limited capacity, complex supply chains, and long-term contracts negotiated over years, not quarters.
On the supply side, major memory companies are adjusting their mix to maximize profitability and serve the hottest segment: AI and data centers. This doesn’t mean abandoning the PC or mobile markets, but rather reordering priorities at a time when increasing industrial capacity isn’t immediate.
What’s Going On With Prices?
Recurring market data shows that DRAM prices have risen sharply. Various price-tracking services and sector analyses have described increases that, in some categories and contract types, run to several times year-ago levels. Analysts also expect the first quarter of 2026 to continue this trajectory: TrendForce, for example, anticipates that contract prices for conventional DRAM will rise 55% to 60% quarter-over-quarter.
The key isn’t just the “percentage”: it’s the domino effect. If a brand pays more for memory for a lightweight laptop or a general-purpose server, that increase ends up competing with other costs (panel, battery, CPU/GPU, storage, logistics) and pressures the final pricing strategy. In consumer markets, the user sets the limit. In AI, it often depends on the urgency to deploy capacity.
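That domino effect is easy to quantify. The sketch below uses a purely hypothetical BOM for a mid-range laptop (the dollar figures and line items are illustrative assumptions, not data from this article) to show how a 55% memory price increase moves the total build cost even though memory is only one line item:

```python
# Illustrative only: hypothetical BOM for a mid-range laptop, in US dollars.
bom = {
    "panel": 90.0,
    "cpu_gpu": 180.0,
    "storage": 45.0,
    "battery": 30.0,
    "memory": 60.0,   # assumed pre-increase DRAM cost
    "other": 95.0,    # logistics, chassis, misc.
}

def bom_after_memory_increase(bom, pct):
    """Return (new_total, overall_increase_pct) after raising memory by pct%."""
    new = dict(bom)
    new["memory"] = bom["memory"] * (1 + pct / 100)
    old_total = sum(bom.values())
    new_total = sum(new.values())
    return new_total, 100 * (new_total - old_total) / old_total

total, delta = bom_after_memory_increase(bom, 55)
print(f"New BOM: ${total:.2f}, overall cost up {delta:.1f}%")
# → New BOM: $533.00, overall cost up 6.6%
```

A ~7% jump in total build cost is exactly the kind of increase that either compresses the manufacturer's margin or shows up in the sticker price.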
Samsung Wins (and the Rest Recalculates)
In this context, it’s not surprising that Samsung forecasts a quarter with operating profit close to 20 trillion won. The market has interpreted these figures as a signal that memory is once again the profit engine, especially when the cycle aligns.
However, the message for the sector is twofold:
- For PC and mobile manufacturers, the risk is that memory raises costs across mid-tier and “AI-ready” product lines just as the industry tries to sell added value (NPUs, new platforms, local AI experiences) without inflating prices.
- For traditional data centers, the price increase adds pressure to investments already stressed by energy, cooling, GPUs, and networking costs.
- For AI operators, memory is increasingly viewed as part of the business’s structural cost: if the return (productivity, automation, services) justifies it, it will be paid; if not, investments are paused.
2026: The “Cost of Intelligence” Also Depends on Memory
The usual AI narrative has focused on GPUs and power consumption. But memory — its capacity, bandwidth, and availability — is central to both performance and cost: without enough of it, systems won’t scale the way the brochures promise. When the market tightens, memory stops being a commodity and becomes a competitive advantage.
If the anticipated further increases in contractual prices materialize, 2026 could cement an uncomfortable idea: the era of AI isn’t just paid for with compute chips, but also in gigabytes.
Frequently Asked Questions
Why are DRAM prices rising so much, and how does this impact AI?
Because demand from AI servers and accelerators is growing faster than available capacity, and part of production is being allocated to more profitable segments. In AI, moreover, memory (capacity and bandwidth) directly affects performance and operating costs.
What’s the difference between “conventional” DRAM and HBM in data centers?
Conventional DRAM (such as DDR5) serves as main memory for CPUs and servers; HBM is designed for GPUs/accelerators and offers very high bandwidth, essential for efficient training and inference of large models.
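The bandwidth gap can be illustrated with a back-of-the-envelope calculation. The figures below are commonly cited peak rates (one DDR5-4800 channel at 64 bits, one HBM3 stack at 6.4 Gb/s per pin over a 1024-bit interface), not numbers from this article:

```python
def peak_bandwidth_gbps(transfer_rate_mt_s, bus_width_bits):
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

ddr5 = peak_bandwidth_gbps(4800, 64)      # one DDR5-4800 channel
hbm3 = peak_bandwidth_gbps(6400, 1024)    # one HBM3 stack, 6.4 Gb/s per pin
print(f"DDR5-4800 channel: {ddr5:.1f} GB/s")   # → 38.4 GB/s
print(f"HBM3 stack:        {hbm3:.1f} GB/s")   # → 819.2 GB/s
```

Roughly a 21x difference per channel/stack, which is why accelerators pay the premium for HBM while CPUs stay on DDR.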
Will DRAM prices increase for laptops and PCs because of this?
They might pressure prices or margins, especially in mid-range segments where every euro counts. The final impact depends on contracts, inventory, competition, and whether manufacturers choose to absorb or pass on the cost.
Which indicators should be watched to see if prices will keep rising?
Quarterly guidance from memory manufacturers, contract-price reports (DRAM and NAND), and demand signals for servers and AI (hyperscaler capex, data center deployments, and PC refresh cycles).