Musk Calls Nvidia’s Rubin Chips a “Rocket Engine” for AI

By Gayane Tadevosyan

Elon Musk is known for blunt opinions, headline-grabbing moves, and outsized Silicon Valley ambitions. So when he publicly praises a competitor's product, it tends to turn heads.

That’s exactly why Nvidia (NVDA) is drawing so much attention right now.


CES 2026 showcased no shortage of ambitious tech, but Nvidia clearly stood apart with the debut of its new Rubin platform. Made up of six chips, Rubin is positioned as a rack-scale AI supercomputer—faster, cheaper, and more efficient than anything currently on the market.


The announcement quickly caught Musk’s attention. Posting on X, he called Rubin “a rocket engine for AI” and described Nvidia as the “gold standard” for infrastructure. This wasn’t casual praise—it was a signal to the market.


Rubin isn’t just another GPU upgrade. Nvidia is pitching it as a fully integrated AI ecosystem that combines compute, networking, and data movement into a single system. If it delivers as promised, Rubin could lock in Nvidia’s dominance as AI development moves into its next phase.


Under the hood, Rubin is built for massive AI workloads. Its GPU is designed for extreme compute and is paired with a CPU optimized specifically for AI reasoning and data flow. High-speed chip-to-chip connectivity and advanced networking aim to make the entire system operate as one cohesive unit rather than a collection of separate parts. The result, Nvidia says: far fewer GPUs needed to train large models, at sharply reduced cost.


In practical terms, data centers running Rubin should be able to scale AI faster, cheaper, and more efficiently. This isn’t an incremental improvement—it’s a structural shift in how large-scale AI systems are built.


Musk’s endorsement carries weight because he doesn’t hand out compliments easily, especially in areas where he’s competing. With xAI, Musk needs world-class infrastructure, and Rubin appears to fit that need. His comments suggest a recognition that Nvidia’s technology is critical to building autonomous, real-world AI systems at scale.


For investors, the implications are significant. Nvidia says Rubin will cut inference costs substantially compared with current platforms, potentially improving margins for AI companies or allowing them to undercut competitors on price. Hyperscalers like Google, Amazon, and Microsoft could see meaningful cost efficiencies, while Nvidia further entrenches itself at the center of AI infrastructure.


With Rubin systems expected to roll out later this year, Nvidia is reinforcing its grip on the foundation layer of AI—the layer where long-term economic value is likely to accumulate.

Rubin has Musk’s attention. For many investors, that alone is reason to keep Nvidia firmly on the radar.