Odd Lots

Here Are the Actual Mechanics Behind Powering AI

How to build the ultimate GPU cloud to power AI programs

[Photo: An Nvidia Corp. HGX H100 artificial intelligence supercomputing graphics processing unit (GPU) at the showroom of the company's offices in Taipei, Taiwan. Photographer: I-Hwa Cheng/Bloomberg]

Artificial intelligence is all the rage right now, and most of the investor excitement has so far focused on the companies providing the hardware and computing power needed to actually run this new technology. So how does it all work, and what does it actually take to run these complex models? On this episode, we speak with Brannin McBee, co-founder of CoreWeave, which provides cloud computing services based on GPUs, the type of chip pioneered by Nvidia that has become immensely popular for generative AI. He walks us through the infrastructure involved in powering AI, how difficult it is to get chips right now, who has them, and how the landscape might change in the future. This transcript has been lightly edited for clarity.