Samsung's 8-layer HBM3E chips clear Nvidia's tests for use in AI processors

The company has since reworked its HBM3E design to address those issues, according to the sources who were briefed on the matter. (Photo: Reuters)

Last Updated: Aug 07 2024 | 8:43 AM IST

A version of Samsung Electronics' fifth-generation high bandwidth memory (HBM) chips, known as HBM3E, has passed Nvidia's tests for use in its artificial intelligence (AI) processors, three sources briefed on the results said. The two companies have not yet signed
a supply deal for the approved eight-layer HBM3E chips but will do so soon, the sources said, adding that they expect supplies to follow.
HBM is a type of dynamic random access memory (DRAM) standard, first produced in 2013, in which chips are vertically stacked to save space and
reduce power consumption.
A key component of graphics processing units (GPUs) for AI, HBM helps process the massive amounts of data produced by complex AI applications. The
approval follows Nvidia's recent certification of Samsung's HBM3 chips for use in less sophisticated processors developed for the Chinese market.
HBM3E is expected to become the mainstream HBM product in the market this year, with shipments concentrated in the second half, according to research firm
TrendForce.
SK Hynix, the leading manufacturer, estimates demand for HBM memory chips in general could increase at an annual rate of 82%. Samsung does not provide revenue
breakdowns for specific chip products.
Samsung's total DRAM chip revenue was estimated at 22.5 trillion won ($16.4 billion) for the first six months of this year, according to a
customer it declined to identify.
(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated
feed.)

First Published: Aug 07 2024 | 8:43 AM IST