Micron Begins Shipping HBM4 Memory: 36 GB Capacity and Over 2 TB/s for the New Era of AI

Micron Technology has reached a notable milestone in the memory sector, announcing that it has begun shipping samples of its next-generation HBM4 memory to key customers, with 36 GB of capacity and more than 2 TB/s of bandwidth per stack. The advance matters in the race to accelerate next-generation artificial intelligence (AI) platforms, where moving and processing vast amounts of data quickly is a critical factor.

A Key Advancement for Generative AI

Micron’s HBM4, based on its 1β (1-beta) DRAM manufacturing process and advanced “12-high” packaging technology, is designed for the most demanding AI workloads on the market. The sampled part, a stack of 12 dies with 36 GB of capacity, uses a 2,048-bit interface to reach speeds exceeding 2.0 TB/s, outperforming the previous generation (HBM3E) by more than 60%.
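As a quick sanity check, those headline figures follow from the interface width and the stack height. The per-pin data rate (~7.8 Gb/s) and per-die density (24 Gb) in the sketch below are assumptions chosen to be consistent with the stated numbers, not values taken from Micron's announcement:

```python
# Back-of-the-envelope check of the HBM4 figures above.
# ASSUMPTIONS (not stated by Micron here): per-pin rate of 7.8125 Gb/s
# and 24 Gb DRAM dies, inferred so the totals match the announcement.

INTERFACE_WIDTH_BITS = 2048   # HBM4 interface width per stack
PIN_RATE_GBPS = 7.8125        # assumed per-pin data rate, Gb/s
DIES_PER_STACK = 12           # the "12-high" stack
DIE_DENSITY_GBITS = 24        # assumed die density, gigabits

# Bandwidth in TB/s: width (bits) x per-pin rate (Gb/s) / 8 bits per byte / 1000
bandwidth_tbps = INTERFACE_WIDTH_BITS * PIN_RATE_GBPS / 8 / 1000
print(f"bandwidth ~ {bandwidth_tbps:.1f} TB/s")  # ~ 2.0 TB/s

# Capacity in GB: dies per stack x die density (Gb) / 8 bits per byte
capacity_gb = DIES_PER_STACK * DIE_DENSITY_GBITS / 8
print(f"capacity = {capacity_gb:.0f} GB")        # 36 GB
```

The same arithmetic explains the >60% jump over HBM3E: HBM3E used a 1,024-bit interface, so doubling the bus width alone accounts for most of the gain even at a similar per-pin rate.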

These capabilities let AI accelerators perform complex inference and reasoning faster, easing data movement and speeding up the execution of large language models and other cutting-edge AI systems.

Improved Energy Efficiency and Higher Capacity

One of the key highlights of Micron’s HBM4 is its energy efficiency, which improves by more than 20% over the previous generation. This matters as data centers work to curb electricity consumption, both for sustainability and to control operating costs. The company emphasizes that its solution not only delivers higher performance but also maximizes overall system efficiency.

An Accelerator for Innovation Across Multiple Sectors

The introduction of HBM4 memory is particularly relevant at a time when generative AI is expanding into new uses and sectors, from healthcare and finance to transportation and scientific research. According to Micron, its new technology will accelerate result generation, enhance the reasoning capability of systems, and help artificial intelligence provide tangible value to society.

Roadmap Toward 2026

Micron plans to ramp up HBM4 production throughout 2026, aligning with the rollout of the upcoming AI platforms of its major customers. With nearly five decades of innovation in memory and storage, the company solidifies its role as a strategic partner for the AI industry and one of the key drivers of technological advancement in data centers and edge applications.

A Future Dominated by High Capacity and Speed

With this launch, Micron strengthens its position as a leader in advanced memory, in a context where the demand for capacity, speed, and efficiency for AI continues to grow. HBM4 memory will be a critical component in the new AI accelerators, enabling companies and research centers to tackle increasingly complex challenges with solutions that are not only powerful but also more sustainable.

Source: Micron