Zuckerberg's Meta Launches MTIA Chip To Rival Nvidia's AI Offerings In Cloud Business: Here's What You Need To Know

Meta Platforms Inc. META has just revealed its next-generation AI chip, the MTIA, which is designed to compete with Nvidia Corporation’s NVDA AI chips in the cloud business.

What Happened: On Wednesday, Meta introduced its next-generation custom chip designed for AI workloads. The new Meta Training and Inference Accelerator, or MTIA, chip, an upgrade over the current MTIA v1, is manufactured on Taiwan Semiconductor Manufacturing Co.'s 5nm process node; the first-generation MTIA was built on TSMC's 7nm node.

The original MTIA was the company's first-generation AI inference accelerator, designed in-house for Meta's AI workloads. The chip family's architecture focuses on providing the right balance of compute, memory bandwidth, and memory capacity for serving ranking and recommendation models.

See Also: Sundar Pichai’s Google Grapples With AI Mishaps Amid Worker Criticism, Yet Holds Title Of America’s Most Innovative Company

Meta's long-term goal is to provide the most efficient architecture for its unique workloads. In detailing the next-generation MTIA chip, the tech giant said that as AI workloads become increasingly important for its products and services, the efficiency of its MTIA chips will improve its ability to provide the best experiences for users worldwide.

“This new version of MTIA more than doubles the compute and memory bandwidth of our previous solution while maintaining our close tie-in to our workloads,” Meta stated, adding, “It is designed to efficiently serve the ranking and recommendation models that provide high-quality recommendations to users.”


The announcement comes after reports surfaced about TSMC, a key supplier for both Meta and Nvidia, receiving significant funding from the U.S. government for its Arizona plants. This move is seen as a strategic step to reduce reliance on Asian suppliers and strengthen the U.S. semiconductor industry.

Why It Matters: The AI chip industry has been witnessing a shift in dynamics, with major tech companies like Meta, Amazon.com, Inc., Alphabet Inc., and Microsoft Corporation developing their own AI chips, challenging Nvidia’s market dominance.

In January this year, it was reported that several big tech companies relying heavily on Nvidia's specialized graphics processing units (GPUs) for their AI development are actively working to reduce this dependency.

In fact, in October 2023, it was reported that ChatGPT-parent OpenAI was considering developing its own AI chips. Moreover, the EU has also reportedly been looking closely at the AI chip market for potentially abusive practices, with a particular focus on major players like Nvidia because of its dominance in the market.


Read Next: Elon Musk Hit With Obstruction Of Justice Inquiry After He Calls For Impeachment Of Brazil’s Top Judge: ‘Abuse Of Economic Power’

Disclaimer: This content was partially produced with the help of Benzinga Neuro and was reviewed and published by Benzinga editors.

Image Credits – Shutterstock
