“Microsoft Swallows OpenAI’s Core Team § Compute Is King”, Dylan Patel, Daniel Nishball, 2023-11-20:

[background: realpolitik] Microsoft had previously placed huge bets on OpenAI, with plans for >$50b in annual datacenter spend to race to AGI and deploy its GPT-4-based GitHub Copilot products. Our data shows that one of OpenAI’s next training supercomputers, in Arizona, was going to have more than 75,000 GPUs at a single site by the middle of next year.

Our data also shows that Microsoft is directly buying >400,000 GPUs next year for both training and Copilot/API inference. Furthermore, Microsoft has tens of thousands of additional GPUs coming in via cloud deals with CoreWeave, Lambda Labs, and Oracle.

There are a few big question marks around what OpenAI has been guaranteed. Most of Microsoft’s investment in OpenAI is in the form of compute credits. While there are agreements on the sizes of the supercomputers that must be delivered, we believe Microsoft was on track to blow way past those goals and deliver OpenAI more than legally required, meaning a rebalancing is possible.

Microsoft can likely claw back, or simply not deliver, quite a bit of what it had planned for OpenAI. These compute resources can be routed to the new internal team. Furthermore, given how killer Microsoft’s legal team is, it’s possible that an even larger portion of what is already delivered or soon to be delivered can be clawed back.

If the new team were to spin out into its own startup, it would have tremendous difficulty acquiring enough compute to build a GPT-5-scale model before Anthropic or Google. Given that there is a sort of runaway escape velocity here, this would put the team at a huge disadvantage in the race to AGI. By joining Microsoft, the former OpenAI team will still have access to the necessary compute resources next year.

It is very likely that this development accelerates spending further, and Microsoft’s GPU orders will have to go up yet again in order to both fulfill the OpenAI contract and give the new internal team everything it needs to build GPT-5 next year.