What Will Nvidia And AMD Data Center GPU Ramp Look Like?

Beyond The Hype
Oct 10, 2023

This article was originally published on 8/16/2023.

Summary

  • Traditional consumers of GPUs, such as CSPs, showed minimal to negative AI capex growth, while AI startups emerged as big GPU purchasers. Implications.

  • Training is compute-intensive and requires high-precision math operations, while inference is less compute-intensive and uses lower-precision number formats (see the sketch following the summary). Implications.

  • Inference hardware is likely to move to the edge and client devices, while training hardware will continue to be used in data centers. Implications.

  • The demand and capacity landscape through 2025.

Note: This article uses GPU as a general term to refer to GPUs, TPUs, and other compute accelerators.
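
To make the precision point in the summary concrete, here is a minimal PyTorch-style sketch. It is not from the article; the toy model, layer sizes, and quantization choices are illustrative assumptions. It contrasts a mixed-precision training step, where reduced-precision matmuls still sit on top of FP32 master weights, gradients, and optimizer state, with INT8 dynamic quantization for inference, which is why inference can get by with far less compute and memory per query.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"

# Toy model and sizes are arbitrary illustrations, not anything from the article.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

inputs = torch.randn(64, 512, device=device)
targets = torch.randint(0, 10, (64,), device=device)

# Training step: autocast runs the matmuls in reduced precision (FP16/BF16),
# but master weights, gradients, and optimizer state remain FP32 -- the reason
# training is the heavier compute and memory workload.
with torch.autocast(device_type=device):
    loss = F.cross_entropy(model(inputs), targets)
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
optimizer.zero_grad()

# Inference: the trained Linear layers can be quantized down to INT8
# (dynamic quantization, which runs on CPU), trading a little accuracy for
# much lower memory traffic and compute per query.
int8_model = torch.ao.quantization.quantize_dynamic(
    model.eval().cpu(), {nn.Linear}, dtype=torch.qint8
)
with torch.no_grad():
    predictions = int8_model(inputs.cpu()).argmax(dim=-1)
```

The same asymmetry is what pushes inference toward cheaper, lower-precision hardware at the edge and in client devices, while training stays on large data center GPUs.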

This earnings season, we heard from several players including Intel (INTC), Advanced Micro Devices (AMD), TSMC (TSM), Microsoft (MSFT), Alphabet (GOOG) (GOOGL), Amazon (AMZN), Meta (META), Oracle (ORCL), and Arista Networks (ANET) about the AI ramp that is underway. Interestingly, not a single player provided meaningful clarity on the type of ramp, the size of the ramp, or the timing of the ramp. Thankfully, we have a diverse enough set of data points to develop a picture of the ramp – even if it is a hazy one.

Established CSPs Vs AI Startups

A surprising aspect of this earnings season was that AI capex growth at the traditional consumers of GPUs – the CSPs – was minimal to negative. To some extent, CSPs like Microsoft, Meta, AWS, Oracle, and Google are deploying GPUs by cutting back on CPU spend, but the extent of GPU deployment still seems small – especially when measured against the GPUs that Nvidia (NVDA) is selling and is projected to sell. So, the question arises as to who is buying the GPUs that Nvidia is selling. The answer seems to be that most of them are going to non-traditional players.

While there is widespread demand from the development community for small single-unit, 4-unit, or 8-unit GPU setups, a handful of new names are also popping up as big GPU purchasers – these are all AI startups that are either developing new ML models or building infrastructure for AI applications. The names that come up in this context include OpenAI, CoreWeave, Inflection AI, Lambda, Anthropic, Cohere, Stability AI, and xAI.

Before continuing with this article, please read Nvidia H100 GPUs: Supply and Demand. That GPU Utils article consolidates many data points and rumors from around the industry about the GPU supply-demand situation. While not entirely accurate, it provides a good summary of where things stand.

(Side note: Many of the up-and-coming infrastructure providers are unlikely to survive the bubble phase; a few will get acquired, and even fewer will become stand-alone companies.)
