<p>➀ Micron has commenced shipments of HBM4 memory, delivering 2.0TB/s of bandwidth per stack over a 2048-bit interface, a 60% performance uplift over HBM3E;</p><p>➁ Initial 36GB stacks target next-generation AI accelerators and are built on Micron's 1-beta process, with memory built-in self-test (MBIST) features for reliability;</p><p>➂ A full production ramp is planned for 2026, aligned with next-generation AI hardware releases, while future designs may pair HBM with LPDDR to expand memory capacity.</p>
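As a quick sanity check on the quoted figures, the per-pin data rate implied by 2.0TB/s over a 2048-bit interface can be derived with simple arithmetic (a back-of-the-envelope sketch using decimal TB; the variable names are illustrative, not from any spec):

```python
# Derive the implied per-pin data rate of an HBM4 stack from
# the quoted aggregate bandwidth and interface width.
interface_bits = 2048      # bus width per stack, as quoted
bandwidth_tb_s = 2.0       # aggregate bandwidth per stack, decimal TB/s

bytes_per_transfer = interface_bits / 8                      # 256 bytes moved per parallel transfer
transfers_per_s = bandwidth_tb_s * 1e12 / bytes_per_transfer # transfers per second
pin_rate_gbps = transfers_per_s / 1e9                        # per-pin rate in Gb/s

print(f"~{pin_rate_gbps:.2f} Gb/s per pin")  # ~7.81 Gb/s per pin
```

This works out to roughly 7.8Gb/s per pin, consistent with HBM4 doubling the interface width over HBM3E's 1024 bits rather than relying on per-pin speed alone for the bandwidth gain.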