According to analysts, Nvidia Corporation NVDA controls roughly 80% of the AI chip market, even when counting the custom processors built by companies such as Alphabet GOOG-owned Google and Microsoft Corporation MSFT. While Nvidia's revenue tripled in its latest reported quarter, with net income rising to $9.24 billion, its rivals are pulling out the big guns in an effort to catch up. This week, Advanced Micro Devices Inc AMD unveiled new AI chips that promise to challenge Nvidia's dominance, while also lifting its forecast for the AI chip market to $400 billion by 2027. Then there's the e-commerce and cloud giant Amazon.com Inc AMZN, which recently unveiled next-generation AWS-designed chips aimed at countering the early-start advantage Microsoft enjoys in generative AI.
AMD Is Aiming High With Its Two New Chips
As it launched a new generation of AI chips on Wednesday, AMD also raised its estimate of the addressable market for data center artificial intelligence processors to $45 billion this year, up from its June estimate of $30 billion. The new AI data center chips belong to the MI300 lineup, with one designed for generative AI applications and the other for supercomputers. The MI300 series is positioned to challenge Nvidia's flagship AI processors.
Amazon Is Going After Microsoft With New AI Chip And Deepened Nvidia Partnership
During its AWS re:Invent event, Amazon announced its new Trainium2 artificial intelligence chip, which it says is twice as energy efficient as its predecessor, and the general-purpose Graviton4 processor, which consumes less energy than comparable chips from AMD and Intel Corporation INTC and promises a 30% performance improvement over the existing Graviton3.
But Amazon is taking a dual approach, aiming to sell both its own Amazon-branded chips and products from others, such as the in-demand GPUs from Nvidia. The goal is to challenge Microsoft by making AWS stand out as a cloud provider with a variety of cost-effective options. By deepening its partnership with Nvidia, Amazon will also offer access to Nvidia's next-generation H200 Tensor Core GPUs. Microsoft has opted for a similar approach: along with unveiling its inaugural AI chip, the Maia 100, it revealed that it will offer access to Nvidia's H200 GPUs through its Azure cloud. The H200 is an upgrade of the H100, which Microsoft-backed OpenAI used to train its most advanced large language model, GPT-4.
By the looks of it, AMD has finally come up with a strategy to catch up to Nvidia, and Amazon's new cloud chips could be its answer to the intense rivalry with Microsoft.
DISCLAIMER: This content is for informational purposes only. It is not intended as investing advice.