Nvidia is moving in on Intel and AMD’s home turf
Nvidia (NVDA) on Tuesday announced an expanded, multi-year data center agreement with Meta (META) that will see the chipmaker supply the social media giant with millions of its Blackwell and Rubin GPUs.
And while that was certainly the splashiest part of the news, the companies said the agreement also calls for Meta to roll out Nvidia Grace CPU-only servers in its data centers, marking the first large-scale deployment of the chips.
Grace is the processor that Nvidia pairs with two Blackwell or two Blackwell Ultra GPUs to form its GB200 and GB300 AI superchips.
The Grace-only servers come as Nvidia angles to capitalize on growing demand for traditional CPUs, with hyperscalers increasingly looking to the chips to help power some AI inference and agentic AI applications.
That spells trouble for Intel (INTC), which has long dominated the data center CPU space, and AMD (AMD), which is working to take market share from Intel.
“Nvidia has been on the path of providing more of the content in the data center for a while,” Gil Luria, managing director and head of technology at D.A. Davidson, told Yahoo Finance.
“The addition of Mellanox [a networking company Nvidia acquired in 2020] put them into the networking category as well. So when they sell into the data center, they’re actually selling almost a vast majority of the value. But it makes sense for them to increase that value even further by adding CPU capacity.”
Nvidia’s move couldn’t come at a worse time for Intel, which is dealing with capacity constraints that are preventing it from producing enough CPUs to meet data center builders’ demand.
It’s not just data centers, though. Nvidia is also reportedly moving in on Intel and AMD’s consumer businesses with its own laptop chip, creating a whole new headache for the PC stalwarts.
Nvidia’s move toward selling CPUs doesn’t mean it’s giving up its massive GPU market advantage. Nor is it a sign that the AI GPU market is on its last legs. Rather, it’s about capitalizing on a growing trend in the AI industry toward using CPUs to power smaller AI models.
Gigantic AI models like the latest and greatest frontier models from OpenAI (OPAI.PVT), Google (GOOG, GOOGL), and Anthropic (ANTH.PVT) will still need the kind of horsepower only a GPU can provide. But CPUs are stealing a bit of the limelight back for those more petite models.
CPUs are also a bottleneck in the AI supply chain, one of many choke points in the ongoing AI build-out that could hurt Nvidia’s sales over time. By bringing its own CPUs to the table, Luria said, Nvidia is doing what it can to keep sales flowing.