For years, Nvidia’s GPUs have been the undisputed engine of the AI boom. But as the technology evolves, a different chip is stepping into the spotlight. At its GTC conference this week, Nvidia is expected to detail a fresh focus on the central processing unit, or CPU—the traditional brain of a computer that once seemed destined to play second fiddle.
The shift is driven by the rise of agentic AI. These systems don’t just answer questions; they perform complex, multi-step tasks, requiring sophisticated coordination and data movement. That work demands powerful general-purpose compute, which is precisely a CPU’s strength. "These agentic systems are spawning different agents working as a team," CEO Jensen Huang noted recently, highlighting the explosion in data tokens that need management.
Nvidia’s CPU journey began with its Grace chip, announced in 2021. Its successor, Vera, is now in production. While often paired with Nvidia's own GPUs in full systems, these CPUs are also going solo. A major deal with Meta involves deploying racks of Grace, and later Vera, CPUs independently. Thousands already power supercomputers at national labs.
The entire CPU market is heating up. Bank of America projects it could surge from $27 billion last year to $60 billion by 2030. Analysts warn of a "quiet supply crisis," with lead times stretching to six months and prices rising. AMD and Intel have reported unprecedented demand.
Nvidia’s approach is distinct. Its CPUs, built on Arm architecture, are engineered not as general-purpose workhorses but as specialized partners for its GPUs. "You're trying to make sure that that very expensive resource, the GPU, isn't sitting there waiting," explained Dion Harris, Nvidia’s head of AI infrastructure. This contrasts with the high-core-count designs from Intel and AMD, which are optimized around different cost and performance trade-offs.
Despite pushing its own silicon, Nvidia is adopting a pragmatic, ecosystem-friendly stance. It has opened its NVLink technology to competitors like Intel and Qualcomm, and even supports the open-source RISC-V architecture. The strategy, as analyst Ben Bajarin puts it, is becoming "soup-to-nuts"—offering a complete suite of solutions for an industry where AI workloads are rapidly diversifying.
Source: CNBC