
Big Tech needs to generate $600 billion in annual revenue to justify AI hardware expenditure


The big picture: The tech industry is riding a new high amid an AI-fueled frenzy. Big Tech companies have been plowing huge sums into building out the infrastructure needed to meet the demand they expect for these products in the coming years. One analyst warns, however, that the industry needs to stop and consider whether the actual revenue generated by AI will be enough to support these investments.

Sequoia Capital analyst David Cahn noted last September that there was a significant gap between the revenue expectations implied by the AI infrastructure build-out and the actual revenue growth in the AI ecosystem. He estimated at the time that the annual AI revenue required to pay back these investments was $200 billion.

Fast forward almost a year – a period during which Nvidia has become the most valuable company in the world – and that number has climbed to $600 billion annually.

Here is how Cahn arrived at his conclusion. He started from the premise that for every $1 spent on a GPU, roughly $1 must be spent on the energy needed to run that GPU in a data center. In Q4 2023, Nvidia's data center run-rate revenue forecast was $50 billion. He took that run-rate revenue forecast and doubled it to reflect the total cost of AI data centers – the GPUs plus the energy to run them.

That put the implied data center AI spend at $100 billion. He then doubled that number again to reflect a 50% gross margin for the end user of the GPU – the company selling AI products or services on top of that compute.

The final figure is $200 billion in lifetime revenue that these GPUs need to generate to pay back the upfront capital investment. And that does not include any margin for the cloud vendors, Cahn said – for them to earn a positive return, the total revenue requirement would be even higher.

By Q4 2024, Nvidia's data center run-rate revenue is expected to reach $150 billion, making the implied data center AI spend $300 billion and the AI revenue required for payback $600 billion.
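For concreteness, the whole back-of-the-envelope calculation can be condensed into a few lines of Python. This is only a sketch of the arithmetic described above; the function and variable names are illustrative, not Cahn's own.

```python
# A minimal sketch of Cahn's payback math as described in this article.
# Names are illustrative; figures come from the article itself.

def required_ai_revenue(nvidia_dc_run_rate_billion: float) -> float:
    """Estimate the AI revenue needed to pay back GPU capex (in $B).

    Steps mirror the article: double Nvidia's data center run-rate to
    cover the energy/data center costs, then double again so the GPU
    end user keeps a 50% gross margin.
    """
    implied_data_center_spend = nvidia_dc_run_rate_billion * 2  # GPUs + energy
    revenue_for_payback = implied_data_center_spend * 2         # 50% gross margin
    return revenue_for_payback

print(required_ai_revenue(50))   # Q4 2023 forecast -> $200B required
print(required_ai_revenue(150))  # Q4 2024 forecast -> $600B required
```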

That is a big hole to fill, especially when it is not clear whether the capital expenditure build-out is linked to real end-customer demand or is happening in anticipation of demand that has yet to materialize.

Furthermore, Cahn projects that the AI revenue required for payback will climb even higher, pointing to Nvidia's recently announced B100 chip, which will have 2.5x better performance for only 25% more cost. “I expect this will lead to a final surge in demand for Nvidia chips,” says Cahn. “The B100 represents a dramatic cost vs. performance improvement over the H100, and there will likely be yet another supply shortage as everyone tries to get their hands on B100s later this year.”
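To see why that B100 claim matters, here is a quick check of the price-performance math it implies. The 2.5x and 25% figures are the article's, normalized to an H100 baseline; the variable names are illustrative.

```python
# Price-performance comparison implied by the quoted B100 figures:
# 2.5x the performance of the H100 for roughly 25% more cost.

h100_cost, h100_perf = 1.0, 1.0    # H100 as the baseline
b100_cost, b100_perf = 1.25, 2.5   # +25% cost, 2.5x performance

cost_per_perf_h100 = h100_cost / h100_perf   # 1.0
cost_per_perf_b100 = b100_cost / b100_perf   # 0.5

ratio = cost_per_perf_b100 / cost_per_perf_h100
print(f"B100 cost per unit of performance: {ratio:.0%} of the H100's")
# -> roughly 50%, i.e. about half the cost per unit of performance
```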

Ultimately, Cahn thinks the expenditures will be worth it. GPU capex is like building railroads, he said: eventually the trains will come, along with the destinations.

Certainly, executives at major tech companies have been expressing confidence in AI's potential to drive revenue growth, and Big Tech's reported Q1 growth rates came in well above what was anticipated just over two quarters ago. Microsoft, for example, said AI contributed 7 percentage points to Azure's 31% growth. That said, the analyst urges the industry to consider who wins and who loses as these investments continue to be made.

“There are always winners during periods of excess infrastructure building,” he said. “Founders and company builders will continue to build in AI – and they will be more likely to succeed, because they will benefit both from lower costs and from learnings accrued during this period of experimentation.”

Meanwhile, if his forecast does materialize, it will primarily be investors who are harmed, he said.


