Nvidia Corp NVDA has successfully tested and approved Samsung Electronics’ fifth-generation high bandwidth memory (HBM) chips, known as HBM3E, for use in its artificial intelligence (AI) processors.
Samsung and Nvidia will likely forge a supply agreement for the newly qualified eight-layer HBM3E chips, with supplies anticipated to begin by the fourth quarter of 2024, Reuters reports.
Also Read: Huawei And Baidu Increase Samsung HBM Purchases As US Curbs Loom
Samsung’s 12-layer HBM3E version is still undergoing Nvidia’s testing.
HBM is an essential component of graphics processing units (GPUs) used for AI, helping them process the vast amounts of data generated by complex applications.
Samsung anticipates that HBM3E chips will account for 60% of its HBM chip sales by the fourth quarter.
Analysts estimate Samsung’s total DRAM chip revenue to be 22.5 trillion won ($16.4 billion) for the first six months of this year, with approximately 10% potentially derived from HBM sales.
Prior reports indicated that Nvidia had approved the use of Samsung’s fourth-generation HBM3 chips in its AI processors designed for the Chinese market.
The HBM market is expected to grow from $4 billion in 2023 to $71 billion in 2027, according to Morgan Stanley estimates cited by Bloomberg.
SK Hynix is expected to maintain its leadership of the HBM market in 2024 with more than 52% market share, according to TrendForce data cited by Nikkei Asia. Samsung is projected to follow closely with 42.4%, while Micron Technology, Inc MU is expected to secure just over 5% of the market.
Price Action: NVDA shares traded lower by 0.52% at $103.71 at last check on Wednesday.
Disclaimer: This content was partially produced with the help of AI tools and was reviewed and published by Benzinga editors.
Image via Shutterstock
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.