
HBM3E

Industry’s fastest, highest-capacity HBM to advance generative AI innovation


Advancing the rate of AI innovation

[Image: boy with glasses]

Generative AI

Generative AI opens up a world of new forms of creativity and expression, like the image above, by using large language models (LLMs) for training and inference. Efficient utilization of compute and memory resources makes the difference in time to deploy and in response time. Micron HBM3E provides higher memory capacity that improves performance and reduces CPU offload, enabling faster training and more responsive queries when running inference on LLMs such as ChatGPT.

Deep learning

Deep learning

AI unlocks new possibilities for businesses, IT, engineering, science, medicine and more. As larger AI models are deployed to accelerate deep learning, maintaining compute and memory efficiency is essential to balancing performance, cost and power. Micron HBM3E improves memory performance with a focus on energy efficiency, increasing performance per watt and lowering the time to train LLMs such as GPT-4 and beyond.

High-performance computing (HPC)

High-performance computing

Scientists, researchers and engineers are challenged to find solutions for climate modeling, curing cancer, and renewable and sustainable energy. High-performance computing (HPC) accelerates time to discovery by executing highly complex algorithms and advanced simulations over large datasets. Micron HBM3E provides higher memory capacity and improves performance by reducing the need to distribute data across multiple nodes, accelerating the pace of innovation.


1 Data rate testing estimates based on shmoo plot of pin speed performed in manufacturing test environment
2 50% more capacity for same stack height
3 Power and performance estimates based on simulation results of workload use cases

4 Based on internal Micron model referencing an ACM Publication, as compared to the current shipping platform (H100)
5 Based on internal Micron model referencing Bernstein’s research report, NVIDIA (NVDA): A bottoms-up approach to sizing the ChatGPT opportunity, February 27, 2023, as compared to the current shipping platform (H100)
6 Based on system measurements using commercially available H100 platform and linear extrapolation

Featured resources

Micron delivers industry's fastest, highest capacity HBM3E to advance generative AI innovations

The 1.2 TB/s bandwidth, 8-high 24GB HBM3E from Micron delivers superior power efficiency, enabled by the advanced 1β process node.

Read HBM3E press release >

The Six Five Insider with Micron's Girish Cherussery

Micron's Girish Cherussery, Sr. Director, High Performance Memory, sits down with Patrick Moorhead and Daniel Newman from Six Five to discuss High Bandwidth Memory (HBM) and Micron's newest HBM3E product.

View video on HBM3E >

Introducing memory built for AI innovation

We are at the dawn of the era of artificial intelligence (AI), where AI is expected to become a central part of our everyday lives. This is fueled by advances in compute and memory technologies. High bandwidth memory (HBM) is at the forefront of AI innovation.
Read HBM3E Product Brief >

1β DRAM Technology

Micron is shipping the industry’s first DRAM manufactured on next-generation 1-beta process technology. It represents state-of-the-art innovation from Micron’s continued investment in R&D and process technology advancement. Micron’s 1-beta process technology allows development of memory products with increased performance, greater capacity, higher density, and lower relative power consumption than prior generations.

Learn more >

Frequently asked questions

How is HBM3E different from HBM3 Gen2?
There is no difference. It is just a name change.
What is the data rate of HBM3E?
Micron’s HBM3E delivers an industry-leading pin speed of > 9.2Gbps and supports data rates backward compatible with first-generation HBM2 devices.
What is the bandwidth of HBM3E?
Micron’s HBM3E delivers an industry-leading bandwidth of > 1.2 TB/s per placement. HBM3E has 1024 IO pins, and at Micron’s pin speed of > 9.2Gbps the interface achieves > 1.2 TB/s.
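The bandwidth figure follows directly from the interface width and the per-pin data rate. A minimal sketch of the arithmetic (assuming ideal peak transfer with no protocol overhead; the numbers are illustrative, taken from the figures quoted above):

```python
# Peak per-placement bandwidth = interface width (bits) x per-pin data rate,
# converted from gigabits to terabytes. Ignores protocol overhead (illustrative).

IO_PINS = 1024         # HBM3E interface width in bits
PIN_SPEED_GBPS = 9.2   # per-pin data rate in Gb/s (Micron quotes > 9.2Gbps)

bandwidth_gbps = IO_PINS * PIN_SPEED_GBPS   # aggregate Gb/s across the interface
bandwidth_tbs = bandwidth_gbps / 8 / 1000   # bits -> bytes, giga -> tera

print(f"{bandwidth_tbs:.3f} TB/s")  # 1.178 TB/s at exactly 9.2Gbps;
                                    # pin speeds above ~9.4Gbps exceed 1.2 TB/s
```

Since the pin speed is quoted as greater than 9.2Gbps, the delivered bandwidth exceeds the 1.2 TB/s figure in practice.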
What is the capacity of HBM3E?
Micron’s industry-leading HBM3E provides 24GB of capacity per placement with an 8-high stack. Micron plans to announce a 36GB 12-high HBM3E device in the future.
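The jump from 24GB to 36GB follows from the stack height. As a sketch, assuming each DRAM die in the stack contributes the same capacity (an illustrative assumption, not a statement about Micron's die design):

```python
# Capacity per placement = per-die capacity x number of stacked dies.
# Assumes equal-capacity dies across the stack (illustrative only).

capacity_8_high_gb = 24
die_capacity_gb = capacity_8_high_gb / 8    # 3 GB per die in the 8-high stack

capacity_12_high_gb = die_capacity_gb * 12  # planned 12-high device
print(f"{capacity_12_high_gb:.0f} GB")      # 36 GB
```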
How fast is HBM3E memory?
Micron’s HBM3E delivers an industry-leading bandwidth of > 1.2 TB/s per placement. HBM3E has 1024 IO pins, and at Micron’s pin speed of > 9.2Gbps the interface achieves > 1.2 TB/s.
What is the difference between HBM2 and HBM3E?​
HBM2 offers 8 independent channels running at 3.6Gbps per pin, delivering up to 410GB/s of bandwidth, and comes in 4GB, 8GB and 16GB capacities. HBM3E offers 16 independent channels and 32 pseudo channels. Micron’s HBM3E delivers a pin speed of > 9.2Gbps and an industry-leading bandwidth of > 1.2 TB/s per placement. Micron’s HBM3E offers 24GB using an 8-high stack, and a 36GB device using a 12-high stack is planned for the future.
What are HBM3E specifications?
Please see our Product Brief.