
Meta’s Move for Google’s AI Chips Puts New Pressure on Nvidia

As reported earlier this week, Meta Platforms (META) is in talks to bring Google (GOOGL) Tensor Processing Units (TPUs) into its data centers starting in 2027. The plan would follow an earlier step in which Meta rents Google chips via Google Cloud in 2026. The talks signal a move to expand Meta’s supply of high-end AI hardware at a time when demand for these systems keeps rising. Nvidia’s (NVDA) stock slipped after the first report, as Meta is one of its largest buyers of advanced chips. Yet Meta is not moving away from Nvidia today. Instead, the company is looking to add a second source for its long-term AI plan.


Google’s Gain and Nvidia’s Risk

Google stands to gain new reach for its chip line if Meta goes ahead. The company has kept its tensor chips inside its own hubs for years. Now it aims to sell these chips directly to large clients. This shift would place Google in a more direct fight with Nvidia in the market for large AI clusters. It would also help Google spread the cost of its chip work across more buyers.

Nvidia would not lose its core spot in the market right now. The company still leads AI training with strong sales of its high-end Blackwell parts. Even so, Nvidia’s risk lies in future growth. Meta spends many billions each year on AI hubs. If more of that spend goes to Google chips in later years, Nvidia could lose part of its future sales path.

Datacenter sales now drive nearly all of Nvidia’s top line, which shows why any long-term shift by buyers like Meta matters for future growth.

How the Chips Compare

A comparison between the two chips is in order: Nvidia’s Blackwell and Google’s Ironwood aim at the same tasks but approach them in different ways. Blackwell offers strong speed, large memory, and wide support for many AI tools. It can train and serve most large models and works with nearly every major cloud provider. It also offers added gains through its low-precision modes, which help with fast model runs at scale. In Jensen Huang’s own words, Blackwell is “a generation ahead” of its rivals.

Google Ironwood also brings high speed and large memory, and it works in very dense chip pods. It can match Blackwell in some training jobs and offers strong gains in power use and cost. Yet Ironwood runs best with Google tools like TensorFlow and JAX. This helps Google tune its chips for large jobs but can limit how well they fit in mixed hubs that use many kinds of AI tools.

Alphabet’s broad revenue growth shows why Google is expanding its chip push, as rising AI demand across Search, YouTube, and Cloud supports its move to place Ironwood in more data hubs.

Why Meta Is Looking at Ironwood

Ultimately, Meta wants faster growth in its AI plan and needs more chip supply to reach it. Nvidia gear remains in short supply, so Meta is looking at Google chips to add capacity. Meta is also pushing its own custom chips, but those will take time to scale. Ironwood offers a near-term way to add more computing power at a clearer price.

In addition, Meta can place some of its large but steady tasks on Ironwood clusters. These tasks include search, feed rank, and ad rank. Since these jobs run in set ways, they can fit a more fixed chip line like Ironwood. Meanwhile, Meta can keep using Nvidia gear for broad model work that needs more flexible tools.

As a result, Meta is not switching camps. It is adding a second path to keep its AI plan on track as the size of its hubs continues to grow.

We used TipRanks’ Comparison Tool to align all three tech giants side by side, providing an in-depth perspective on each stock and the tech industry as a whole.
