Nvidia (NASDAQ:NVDA) and OpenAI have been central players in the AI boom, and their latest move shows just how closely their futures are becoming linked. Earlier this week, the two companies signed a letter of intent under which OpenAI will deploy at least 10 GW of Nvidia systems to power training and inference for its next-generation models.
Nvidia isn’t just supplying the technology – it’s also putting real money behind the partnership. The company plans to invest up to $100 billion in OpenAI, with funds released in stages as each new block of capacity comes online.
D.A. Davidson’s Gil Luria, an analyst ranked among the top 5% of Street stock experts, says his discussions with frontier lab researchers and engineers suggest that training a current frontier AI model already demands about 1 GW of compute, and that the requirement could rise to around 5 GW for a single model by the end of the decade.
“We believe this announcement by both OpenAI and Nvidia is yet another example of OpenAI securing as much compute capacity as physically possible, as they will require these levels of capacity if not greater by 2030 in order to build future models and support growing inference demand,” said the 5-star analyst.
Deploying 1 GW of compute in 2026 would correspond to roughly 1,667 VR200 NVL144 systems, or about 240,000 GPUs at 144 GPUs per rack. The estimate assumes OpenAI deploys the standard VR200 NVL144 (~600 kW per rack) rather than a mix that includes the recently announced VR200 NVL144 CPX, which is optimized for the prefill stage of inference. That setup should deliver around 6 zettaflops of FP4 compute, equivalent to roughly 430 million tokens per second for inference.
“Regardless,” says Luria, “this announcement of 10GW of capacity will bring more than 2M GPUs online, though the exact timing of the remaining 9GW being deployed is yet to be determined.”
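For readers who want to check the math, here is a minimal back-of-envelope sketch in Python of how the rack, GPU, and compute figures scale with power. The ~600 kW per rack comes from the note above; the 144 GPUs per rack is implied by the NVL144 name, and the ~3.6 exaflops of FP4 per rack is an assumed figure chosen only to reproduce the ~6 zettaflop total, not an official specification.

```python
# Back-of-envelope sizing of Nvidia rack deployments per gigawatt of power.
# Assumptions (not official specs): ~600 kW per standard VR200 NVL144 rack,
# 144 GPUs per rack, and ~3.6 exaflops of FP4 per rack (illustrative only).

KW_PER_GW = 1_000_000        # kilowatts in one gigawatt
RACK_POWER_KW = 600          # assumed draw of a standard VR200 NVL144 rack
GPUS_PER_RACK = 144          # implied by the NVL144 name
FP4_EXAFLOPS_PER_RACK = 3.6  # assumed FP4 throughput per rack

def capacity(gigawatts: float) -> dict:
    """Estimate racks, GPUs, and total FP4 compute for a given power budget."""
    racks = gigawatts * KW_PER_GW / RACK_POWER_KW
    return {
        "racks": round(racks),
        "gpus": round(racks * GPUS_PER_RACK),
        "fp4_zettaflops": racks * FP4_EXAFLOPS_PER_RACK / 1_000,
    }

print(capacity(1))   # ~1,667 racks, ~240,000 GPUs, ~6 ZFLOPS of FP4
print(capacity(10))  # ~16,667 racks, ~2.4M GPUs -> "more than 2M GPUs"
```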
But considering the scale of capital required, Luria is concerned Nvidia has become the “investor of last resort,” bearing the weight of OpenAI’s oversized commitments.
Nvidia has already stepped in to support CoreWeave’s IPO, and while that move has paid off, tripling the investment’s value, Luria would like to see more traditional investors participate in funding the extensive data center expansion.

OpenAI, meanwhile, has been committing resources well beyond its capacity, effectively double-, triple-, or even quadruple-booking compute. Although Microsoft retains the right of first refusal on all of OpenAI’s compute, the company has also made over $300 billion in commitments to Oracle, $100 billion for “backup servers from cloud providers,” $25 billion to CoreWeave, plus additional “unquantified commitments” in Europe and the Gulf. These commitments go far beyond the current 10 GW announcement. OpenAI continues to lose tens of billions annually on roughly $12 billion in revenue and, even by its own optimistic projections, is expected to reach only $200 billion in revenue by 2030.
Still, Luria remains constructive on Nvidia. He reaffirmed his Buy rating with a $210 price target, implying a ~19% upside from current levels. (To watch Luria’s track record, click here)
Overall, the Street boasts plenty of NVDA bulls – 37, in total – while an additional 2 Holds and 1 Sell can’t detract from a Strong Buy consensus rating. The average price target is only slightly higher than Luria’s; at $212, the figure makes room for 12-month returns of ~20%. (See NVDA stock forecast)
To find good ideas for stocks trading at attractive valuations, visit TipRanks’ Best Stocks to Buy, a tool that unites all of TipRanks’ equity insights.
Disclaimer: The opinions expressed in this article are solely those of the featured analyst. The content is intended to be used for informational purposes only. It is very important to do your own analysis before making any investment.