The semiconductor economy has moved beyond being a "tech thermometer" and has become the primary engine of the new digital infrastructure era. According to preliminary estimates published by Gartner in January 2026, global industry revenue reached $793.449 billion in 2025, a 21% increase over the previous year. The figure not only confirms that chips are enjoying their best run in decades; it also makes clear who is setting the pace. The explanation, according to Gartner, comes down to two letters that now shape everything: AI.
The consulting firm attributes the growth to the accelerated build-out of Artificial Intelligence data centers and, especially, to demand for three types of components: AI processors, high-bandwidth memory (HBM), and network chips. Collectively, these "AI semiconductors" accounted for nearly one-third of total sector sales in 2025. And the trend, far from slowing, appears to be intensifying: Gartner projects that AI infrastructure spending will exceed $1.3 trillion in 2026, increasing pressure on the supply chain, manufacturing capacity, and, of course, the ranking of leading suppliers.
NVIDIA Breaks the Symbolic $100 Billion Barrier
In this landscape, NVIDIA emerges as the clearest winner. Gartner estimates that the company's semiconductor revenue reached $125.703 billion in 2025, with a market share of 15.8%. The figure is significant for two reasons: first, it would make NVIDIA the first supplier to surpass $100 billion in chip sales; second, its growth has been so substantial that its lead over second-placed Samsung widened to roughly $53 billion.
The market insight is straightforward: accelerators for training and inference—and the ecosystem built around them—have become the core of investment in data centers. It’s not just about selling GPUs; it’s about selling a complete platform that influences purchases of memory, networking, interconnects, and infrastructure equipment.
Memory Regains Control, but Not Just Any DRAM Will Do
In second place is Samsung Electronics, with $72.544 billion and a 9.1% market share. Gartner notes that its performance was supported by its memory business, which grew by 13%, even as its non-memory segments declined. The core message is that this "boom" is not a traditional PC or mobile cycle: it is a cycle of specialized memory for AI.
The rise of SK hynix reinforces this trend. The South Korean firm climbs to third place with $60.640 billion and growth of 37.2%, driven by demand for HBM in AI servers. This is no minor detail: Gartner estimates that HBM already accounted for 23% of the DRAM market in 2025, surpassing $30 billion in sales. In other words, "premium" memory for AI is no longer just a product line; it is a segment that reshapes the entire business.
This phenomenon also explains Micron's jump to fifth place, with $41.487 billion and growth of 50.2%, another sign that the market is rewarding those who can deliver capacity and performance where AI demands it.
Intel Under Pressure: Market Share Halved Since 2021
Meanwhile, Intel stands out as the major exception in this growth picture. According to Gartner, the company closed 2025 with $47.883 billion, a decline of 3.9%, and ended the year with a 6% market share, roughly half of what it held in 2021. That single data point summarizes years of change: the relative weight of traditional CPUs is shrinking as massive investment shifts toward acceleration, advanced memory, and networking, areas where other players have captured the spending cycle.
In parallel, the industry landscape looks increasingly "polycentric": Qualcomm ($37.046 billion), Broadcom ($34.279 billion), and AMD ($32.484 billion) hold significant positions, while Apple ($24.596 billion) and MediaTek ($18.472 billion) maintain their presence in silicon for their own devices and platforms. The map confirms that chips are no longer defined simply by "more," but by what AI actually needs: compute, memory, interconnect, and energy efficiency.
Top 10 Worldwide Semiconductor Revenues (2025)
| 2025 Rank | Company | 2025 Revenue ($M) | 2025 Market Share |
|---|---|---|---|
| 1 | NVIDIA | 125,703 | 15.8% |
| 2 | Samsung Electronics | 72,544 | 9.1% |
| 3 | SK hynix | 60,640 | 7.6% |
| 4 | Intel | 47,883 | 6.0% |
| 5 | Micron Technology | 41,487 | 5.2% |
| 6 | Qualcomm | 37,046 | 4.7% |
| 7 | Broadcom | 34,279 | 4.3% |
| 8 | AMD | 32,484 | 4.1% |
| 9 | Apple | 24,596 | 3.1% |
| 10 | MediaTek | 18,472 | 2.3% |
(Figures in millions of dollars; global share based on a total of $793.449 billion.)
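As a quick sanity check on the arithmetic behind the table, the short Python sketch below recomputes each vendor's share from the revenue figures listed above and Gartner's $793.449 billion total, and works out the NVIDIA-Samsung gap. The script is purely illustrative (the figures are Gartner's, the code is not); small differences from the published shares come from rounding.

```python
# Recompute market shares and the NVIDIA-Samsung gap from the table above.
# Revenue figures are in millions of dollars, as reported by Gartner.

TOTAL_2025 = 793_449  # Gartner's preliminary 2025 industry total, in $M

revenues = {
    "NVIDIA": 125_703,
    "Samsung Electronics": 72_544,
    "SK hynix": 60_640,
    "Intel": 47_883,
    "Micron Technology": 41_487,
    "Qualcomm": 37_046,
    "Broadcom": 34_279,
    "AMD": 32_484,
    "Apple": 24_596,
    "MediaTek": 18_472,
}

# Share of the worldwide market for each of the top 10 vendors
for vendor, revenue in revenues.items():
    share = revenue / TOTAL_2025 * 100
    print(f"{vendor:<20} {revenue:>8,} $M  {share:4.1f}%")

# Lead of the top vendor over the second-placed one (~$53 billion)
gap = revenues["NVIDIA"] - revenues["Samsung Electronics"]
print(f"NVIDIA lead over Samsung: ~${gap / 1_000:.1f} billion")
```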
Beyond Rankings: The "AI Stack" Pulls the Entire Chain
Gartner highlights a key point that helps explain why this growth is no longer merely cyclical: in 2025, AI processors exceeded $200 billion in sales. When spending at that level becomes structural, it pulls everything else along: networking, storage, packaging, testing, and, above all, manufacturing capacity. It is no surprise that HBM has become the sector's most talked-about bottleneck: its manufacturing complexity, performance requirements, and reliance on advanced packaging make it a critical piece of the puzzle.
Indeed, the pressure to scale is not just about producing more chips, but about producing them with the required performance, reliability, and consistency. That opens another battle: investment in new fabs, advanced packaging, and process optimization, areas where Asia continues to lead while the US and Europe seek to build industrial resilience.
Against that backdrop, Gartner offers a projection that could serve as the title of the next chapter: if the current roadmap holds, AI semiconductors could account for more than 50% of total industry revenue by 2029. The semiconductor industry, in other words, is heading toward a market where AI is not "just another segment" but the criterion that determines budgets, production capacity, and global leadership.
Frequently Asked Questions
What does it mean that AI semiconductors already account for nearly one-third of the market?
It indicates that the industry’s growth increasingly depends on components designed specifically for AI data centers (accelerators, HBM, and networks), rather than traditional PC or smartphone cycles.
Why is HBM memory so important in 2025 and 2026?
Because it feeds AI accelerators with much higher bandwidth than conventional DRAM and has become a limiting factor for scaling training and inference servers.
How will AI infrastructure spending of more than $1.3 trillion in 2026 affect companies and governments?
It will intensify competition for chips, energy, manufacturing capacity, and talent, as well as increase supply chain pressures and influence geopolitical decisions on reindustrialization.
What does Intel’s market share decline reveal?
It reflects a shift in investment focus: less emphasis on general-purpose CPUs and more on acceleration, advanced memory, networks, and optimized AI platforms.
via: Gartner

