Meta Announces Multi-Billion US Dollar Investment in Nvidia AI Chips for AI Research and Projects

Meta Allocates Billions of US Dollars for Nvidia AI Chips

Meta has announced a multi-billion-dollar investment in Nvidia’s AI chips, a cornerstone of its AI research and development.

In an Instagram Reels post on Thursday, Meta CEO Mark Zuckerberg shed light on the company’s “future roadmap” for AI, emphasizing the need to build a “massive compute infrastructure.” Zuckerberg revealed that by the end of 2024 this infrastructure would include a staggering 350,000 of Nvidia’s H100 graphics cards.

Although Zuckerberg did not disclose how many graphics processing units (GPUs) Meta has already acquired, industry analysts estimate the cost of each H100 at between $25,000 and $30,000, which would put the 350,000-card target at close to $9 billion even at the lower end of that price range.
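For context, here is a minimal back-of-the-envelope sketch of that estimate, assuming, hypothetically, that all 350,000 H100s were bought at the analyst-quoted $25,000–$30,000 per card; Meta has not disclosed its actual pricing or purchase volume:

```python
# Back-of-the-envelope estimate of Meta's potential H100 spend.
# Hypothetical inputs: 350,000 cards at the analyst-estimated
# $25,000-$30,000 each; actual prices and volumes are undisclosed.
units = 350_000
low_price, high_price = 25_000, 30_000

low_total = units * low_price    # 8,750,000,000  -> ~$8.75 billion
high_total = units * high_price  # 10,500,000,000 -> ~$10.5 billion

print(f"Low estimate:  ${low_total / 1e9:.2f} billion")
print(f"High estimate: ${high_total / 1e9:.2f} billion")
```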

Furthermore, Zuckerberg said that Meta’s compute infrastructure will boast “almost 600k H100 equivalents of compute if you include other GPUs.” In December 2023, tech giants including Meta, OpenAI, and Microsoft committed to using AMD’s new Instinct MI300X AI chip.

The substantial investment in these high-performance chips is integral to Meta’s pursuit of artificial general intelligence (AGI), which Zuckerberg described as the company’s “long-term vision.” AGI is a hypothetical form of AI that would match human-level intelligence, a field also pursued by OpenAI and Google’s DeepMind unit.

Meta’s Chief AI Scientist, Yann LeCun, emphasized the pivotal role of GPUs in AGI research, stating, “[If] you think AGI is in, the more GPUs you have to buy.” Referring to Nvidia CEO Jensen Huang, LeCun added, “There is an AI war, and he’s supplying the weapons.”

Meta’s Future Investment in AI Chips

In its third-quarter earnings report, Meta projected total expenses for 2024 of between $94 billion and $99 billion, driven in part by the expansion of computing resources. During the call with analysts, Zuckerberg affirmed that AI would be the company’s “biggest investment area in 2024, both in engineering and compute resources.”

Addressing concerns about responsible development, Zuckerberg announced Meta’s intention to “open source responsibly” its yet-to-be-developed “general intelligence.” This approach aligns with Meta’s strategy for its Llama family of large language models.

In a subsequent post, LeCun clarified that, to expedite progress, the Fundamental AI Research (FAIR) team is now a sister organization of GenAI, Meta’s AI product division.

WRITTEN BY

Team Eela
