
24GB HBM3 memory

Posted Date: 2023-07-31

It is built on Micron's 1β (1-beta) DRAM process node, which allows a 24Gb DRAM die to be assembled into an 8-high cube within an industry-standard package size.

A 12-high stack with 36GB capacity will begin sampling in Q1 2024.

The system's performance-to-power ratio and pin speed improvements meet the extreme power demands of AI data centres.

The improved power efficiency is possible thanks to advances such as doubling the through-silicon vias (TSVs), reducing thermal impedance through a fivefold increase in metal density, and an energy-efficient data path design.

Micron is a partner in TSMC's 3DFabric Alliance, helping to shape the future of semiconductor and system innovation.

As part of the HBM3 Gen2 product development effort, the collaboration between Micron and TSMC lays the foundation for a smooth introduction and integration into compute systems for AI and HPC design applications.

TSMC has received samples of Micron's HBM3 Gen2 memory and is working with Micron on further evaluation and testing for next-generation HPC applications.

The Micron HBM3 Gen2 solution addresses the growing demands of generative AI for multimodal, multitrillion-parameter AI models. With 24GB of capacity per cube and more than 9.2Gb/s pin speed, training time for large language models is reduced by more than 30%, resulting in lower TCO.
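The pin speed figure implies a per-cube bandwidth of roughly 1.2 TB/s. A quick sanity check, assuming the standard 1024-bit HBM data bus width (the article itself only quotes the pin speed):

```python
# Back-of-the-envelope per-cube bandwidth from pin speed.
PIN_SPEED_GBPS = 9.2    # Gb/s per pin (figure quoted in the article)
INTERFACE_WIDTH = 1024  # bits; standard HBM data bus width (assumption)

# Multiply pin speed by bus width, convert bits to bytes.
bandwidth_gb_s = PIN_SPEED_GBPS * INTERFACE_WIDTH / 8
print(f"{bandwidth_gb_s:.0f} GB/s per cube")  # 1178 GB/s, i.e. ~1.2 TB/s
```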

Micron's offering unlocks a significant increase in queries per day, enabling trained models to be used more efficiently. Micron HBM3 Gen2 memory's performance per watt drives data centre cost savings:

For an installation of 10 million GPUs, every 5 watts of power savings per HBM cube is estimated to save up to $550 million in operational expenses over five years.
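The arithmetic behind that estimate can be reproduced; the electricity rate below is an assumption on our part (the article does not state one), chosen to show roughly what blended energy cost makes the figure work out:

```python
# Sanity check of the quoted ~$550M operational savings.
GPUS = 10_000_000          # installation size (article figure)
WATTS_SAVED_PER_CUBE = 5   # W saved per HBM cube; one cube per GPU assumed
YEARS = 5
USD_PER_KWH = 0.25         # assumed blended data-centre energy cost, NOT from the article

hours = YEARS * 365 * 24                               # 43,800 hours
kwh_saved = GPUS * WATTS_SAVED_PER_CUBE * hours / 1000  # W·h -> kWh
savings = kwh_saved * USD_PER_KWH
print(f"${savings / 1e6:.0f}M")  # ~$548M, close to the quoted $550M
```

At around $0.25/kWh the savings come to about $548M over five years, consistent with the article's "up to $550 million" claim; a different energy cost scales the result proportionally.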

