Nvidia (NVDA) is the global AI chip leader, but word that Google (GOOG, GOOGL) could sell some of its own AI chips to Meta (META) has raised concerns that one of its biggest clients is becoming a major competitive threat.
According to a Nov. 24 report by The Information, Google’s deal with Meta could be worth billions of dollars.
That sent Nvidia’s stock price down 2.5% the day after the report. Nvidia also released a statement on X saying that it was happy for Google’s success, but that Nvidia’s own chips are a generation ahead of the search giant’s offerings.
On Tuesday, Amazon (AMZN) announced the public availability of its Trainium 3 chip, saying that it can save up to 50% on training costs for AI software compared to alternatives.
But Google potentially selling its TPUs (tensor processing units) and Amazon providing access to its Trainium 3 chips doesn’t necessarily mean that Nvidia is at risk of losing its lead in the AI race anytime soon.
It does, however, mean that Google and Amazon could end up taking their own share of the broader market as the global AI build-out continues to expand and companies look for alternatives to Nvidia amid a rush for its AI chips.
One of the main things to understand about the Nvidia versus Google and Amazon debate is that they don’t exactly offer the same products. Google’s TPUs and Amazon’s Trainium 3 are types of chips called ASICs, or application-specific integrated circuits, meaning they’re built to accomplish specific tasks very well.
That means Google and Amazon have developed them to handle certain applications efficiently because the chips were made specifically for those purposes.
“[Google knows] the requirements and they know what trade-offs are most efficient for them,” explained Forrester senior analyst Alvin Nguyen.
“They can make something that works better today for them. Now, it doesn’t mean that it’s superior to Nvidia in every aspect. But … at least for Google, it will be superior for their needs,” he added.
Google CEO Sundar Pichai addresses the crowd during Google’s annual I/O developers conference in Mountain View, California, on May 20, 2025. (Camille Cohen/AFP via Getty Images)
Nvidia’s chips, meanwhile, are available across multiple cloud platforms, including those from Google and Amazon, as well as Microsoft. The architecture that underpins the chips can also be transferred to different use cases, whether that’s training AI models, running models on robots, powering video games, or helping bring computing capabilities to self-driving car technologies.
Nvidia also has its own line of networking products that it sells to third parties, including Amazon, which will use the company’s NVLink technology alongside its Trainium 4 and Graviton CPU chips in its servers. Nvidia’s popular CUDA software is another point of differentiation.
Large hyperscalers are able to put up the initial cash necessary to build their own custom chips and use them over time, amortizing some of the cost. Other companies, however, can’t afford to produce their own processors and rely on chips from Nvidia and rival AMD (AMD).
It’s important to note that those same hyperscalers also buy plenty of Nvidia chips. CFO Colette Kress noted during the company’s second quarter earnings that large cloud providers accounted for some 50% of Nvidia’s total data center revenue. The company didn’t mention that percentage in its latest report.
ASICs like Google’s and Amazon’s have one other limiting factor: If the companies change their workloads, they need to rework their chips.
“If your model structures change, you may need to design a new chip,” explained Bernstein analyst Stacy Rasgon, adding that’s why the companies also purchase those Nvidia chips.
But there’s a potential reason to take that risk.
Nvidia CEO Jensen Huang introduces an “Industrial AI Cloud” project during a press conference in Berlin, Germany, on Nov. 4, 2025. (Lisi Niesner/Reuters)
“It’s about … total cost of ownership, performance per watt, performance per dollar,” Rasgon said. “And I’m willing to stipulate that for the workloads that they have designed for an ASIC, in theory, should have better TCO than a GPU or something that’s more general purpose. Otherwise, why are you bothering?”
Third-party companies like Meta (META), meanwhile, could benefit from using their rivals’ chips by simply getting access to more computing power at a time when the world is clamoring to get its hands on Nvidia’s products.
It’s not as though Nvidia is struggling, either. Kress has told investors and analysts the company has visibility toward $500 billion in Blackwell and Rubin AI chip revenue through calendar 2026. The company is currently in fiscal 2026.
And CEO Jensen Huang noted in the company’s most recent earnings announcement that “Blackwell sales are off the charts, and cloud GPUs are sold out.”
From a broader perspective, the growth of competition from Nvidia’s own customers doesn’t mean the company is in danger of seeing its revenue shrink.
According to Rasgon, the more likely scenario is that the AI chip market will continue to expand, making room for both Nvidia and other competitors.
Mizuho analyst Vijay Rakesh offered a similar sentiment, writing in a note to investors that while a TPU deal between Google and Meta is positive for Broadcom, which builds Google’s chips, Nvidia is “still the king.”
That said, the chip industry is evolving at a rapid pace, and there’s no telling where it might go in the months and years ahead.
Email Daniel Howley at dhowley@yahoofinance.com. Follow him on Twitter at @DanielHowley.