We're still early in the AI buildout
Jensen emphasized that AI is still in its early stages, making the case that the industry is significantly underestimating AI’s computational requirements. He pointed to the acceleration of scaling laws and the rise of agentic AI, which is driving the adoption of reasoning models.
Scaling Laws:
Scaling laws refer to the observation that AI performance improves smoothly and predictably as data, model size, and compute grow (roughly linearly on log-log axes). While Jensen didn't present new data showing that scaling laws continue to hold, the fact that he reiterated this belief is noteworthy. After the "DeepSeek scare" in late January, OpenAI's Sam Altman and the hyperscalers echoed similar views, reinforcing confidence in scaling laws and lending support to Jensen's stance.
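To make the "smooth, predictable" claim concrete, here is a minimal sketch of a power-law scaling curve. The constants a and b are made up for illustration; they are not drawn from any published scaling study.

```python
import math

# Illustrative only: a power-law scaling curve, loss(C) = a * C**(-b),
# with hypothetical constants a and b chosen for demonstration.
a, b = 10.0, 0.05

for compute in [1e21, 1e22, 1e23, 1e24]:  # training FLOPs (hypothetical)
    loss = a * compute ** (-b)
    print(f"compute = 1e{int(math.log10(compute))} FLOPs -> loss = {loss:.3f}")
```

Each 10× increase in compute cuts loss by the same constant factor (here 10**-0.05, about 0.89), which is why the curve plots as a straight line on log-log axes and why extrapolating it looks predictable.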
Reasoning Models:
Jensen reiterated the point he made on the January earnings call: that we are transitioning from an era of simple, one-shot prompts—where users receive a quick answer in seconds—to an era of reasoning models that simulate cognitive processes. In these models, compute requirements increase exponentially.
He suggested that next-generation models could require hundreds, thousands, or even millions of times more compute than today’s simple prompts. That raises an important question: is this realistic?
I believe the answer is yes.
In the future, we will likely need 100× or more compute to support reasoning models. One simple way to frame this is to use how long a model "thinks" as a proxy for how much compute it consumes. From my own experience, a low-complexity reasoning prompt can take 2–10 minutes to process. Using a midpoint of 5 minutes implies 300 seconds of compute time; against a one-shot prompt that resolves in roughly 1–2 seconds, that is 150–300× longer. This back-of-the-envelope estimate aligns with Jensen's bullish outlook.
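The arithmetic above can be sketched directly. The 1–2 second one-shot baseline is my assumption, not a measured figure:

```python
# Back-of-the-envelope check of the 150-300x figure.
# Assumption: a one-shot prompt resolves in roughly 1-2 seconds.
one_shot_s = (1, 2)       # assumed seconds of compute for a one-shot answer
reasoning_s = 5 * 60      # 5-minute midpoint of the 2-10 minute range

low = reasoning_s / one_shot_s[1]   # 300 / 2 = 150x
high = reasoning_s / one_shot_s[0]  # 300 / 1 = 300x
print(f"reasoning uses {low:.0f}-{high:.0f}x the compute time of a one-shot prompt")
```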
The Big Picture:
The combination of scaling laws and the emergence of reasoning models lays the groundwork for a significant expansion in datacenter buildouts over the next few years. At GTC, Jensen reminded us:
“I’ve said before that I expect data center build-out to reach $1 trillion. And, I am fairly certain we’re going to reach that very soon.”
In 2025, we estimate the build-out will reach $485B. I interpret Jensen's "very soon" to mean 2028.
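Taking the $485B estimate for 2025 and Jensen's $1 trillion target, hitting it by 2028 would imply roughly 27% annual growth. A quick check (the 2028 date is my reading of "very soon," not Jensen's):

```python
# Implied growth rate if the build-out goes from $485B (2025) to $1T (2028).
start, end = 485, 1000   # $B, from the estimates above
years = 2028 - 2025      # 3 years

cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")
```

A ~27% compound rate is aggressive but in the same ballpark as recent hyperscaler CapEx growth, which is why the 2028 reading seems plausible.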
Investor Sentiment:
Despite Jensen's argument that we're still early in the AI cycle, investor skepticism remains, underscored by the fact that Nvidia trades at just 20× estimated CY2026 EPS despite expected 30% earnings growth next year. This view will be tested in the coming weeks, as investor focus shifts to late April, when hyperscalers like Google, Amazon, and Microsoft report earnings. Updated CapEx expectations will be the pressure point for both Nvidia demand and the broader AI trade.