OpenAI sent a memo to investors this week with the subject line: "Compute is the ballgame." The memo, reviewed by Key Context and confirmed by Bloomberg, lays out a detailed gigawatt-by-gigawatt comparison showing OpenAI's compute capacity ramp outpacing Anthropic's. The numbers are real. The question is what they actually prove.
OpenAI says it has already identified over 8 GW of capacity and is targeting 30 GW by 2030. Its own historical ramp: 0.2 GW in 2023, 0.6 GW in 2024, 1.9 GW in 2025. Against that, OpenAI estimates Anthropic at 1.4 GW for 2025, 3 to 4 GW by end of 2026, and 7 to 8 GW by end of 2027. "Even at the high end of that range, our ramp is materially ahead and widening," the memo states. "That gap matters because compute is now a product constraint."
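The memo's figures imply some aggressive compounding that is worth making explicit. A quick sketch (the GW numbers are from the memo as reported; the growth math is ours):

```python
# Growth rates implied by the memo's capacity figures.
# All GW numbers come from the memo as reported; the compounding math is ours.

openai_ramp = {2023: 0.2, 2024: 0.6, 2025: 1.9}  # GW, per the memo
target_gw, target_year = 30, 2030

# Year-over-year growth so far: roughly 3x per year
yoy = [openai_ramp[y] / openai_ramp[y - 1] for y in (2024, 2025)]

# Growth rate needed to reach 30 GW by 2030, starting from 1.9 GW in 2025
years = target_year - 2025
implied = (target_gw / openai_ramp[2025]) ** (1 / years)

print([round(g, 2) for g in yoy])  # [3.0, 3.17]
print(round(implied, 2))           # 1.74, i.e. ~1.74x per year through 2030
```

In other words, hitting 30 GW actually requires a slower annual growth rate than OpenAI has managed so far; the hard part is that each successive multiple now applies to gigawatts, not fractions of one.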
The framing is tidy. It also leaves out the part where the numbers become hardest to believe.
The structural anchor is the cost math. Jensen Huang has said building 1 GW of AI data center capacity costs $50 to $60 billion, with roughly $35 billion of that flowing to Nvidia for chips and systems. Reuters Breakingviews recently put the floor at $60 billion per GW. OpenAI's 30 GW target implies $1.5 to $1.8 trillion in infrastructure spending. The memo does not explain how that gets financed, beyond suggesting that scale will pay for itself. That is a projection, not a current-state scorecard.
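The implied bill is easy to check against the figures cited above. A back-of-envelope sketch, using Huang's $50 to $60 billion per GW range and his ~$35 billion Nvidia share:

```python
# Back-of-envelope check of the memo's implied infrastructure spend,
# using the per-GW cost range cited in the article.

cost_per_gw_low, cost_per_gw_high = 50e9, 60e9  # Huang's range, USD per GW
nvidia_per_gw = 35e9                            # portion flowing to Nvidia
target_gw = 30

low = target_gw * cost_per_gw_low     # 1.5e12 -> $1.5 trillion
high = target_gw * cost_per_gw_high   # 1.8e12 -> $1.8 trillion
nvidia_share = target_gw * nvidia_per_gw  # 1.05e12 -> ~$1.05 trillion to Nvidia

print(f"${low/1e12:.1f}T to ${high/1e12:.1f}T total; ~${nvidia_share/1e12:.2f}T to Nvidia")
```

Roughly a trillion dollars of that total would flow to a single supplier, which is the context for Anthropic's silicon-diversity bet discussed next.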
Anthropic's counterargument is not in the memo. Anthropic runs a meaningfully different infrastructure strategy: Google TPUs, AWS Trainium2 chips, and Broadcom-designed silicon. This is not a laggard catching up. It is a company that has deliberately avoided single-source dependency on Nvidia, betting it can close the efficiency gap through silicon diversity rather than raw capacity. Whether that bet is right is genuinely contested. But it is the obvious hole in OpenAI's "we're winning" presentation to investors.
Both companies are competing for the same capital while preparing public offerings. Anthropic's Mythos model, announced this week, is being tested by Apple and Amazon under restricted access because Anthropic considers it too effective at finding vulnerabilities in major operating systems and browsers to release broadly. Anthropic says its run-rate revenue has surpassed $30 billion, up from $9 billion at the end of 2025. The revenue trajectory is real. So is the IPO consideration.
OpenAI's memo is an investor document. Its job is to make the bull case. The bull case depends on Stargate closing on schedule, on the capital markets remaining willing to fund $60 billion per GW, and on the premise that compute capacity will translate directly into product advantage rather than just being table stakes in a race where everyone eventually has to buy the same chips from the same company. These are reasonable bets. They are also the kind of bets where the spreadsheet starts looking very different if interest rates move, if hyperscaler debt becomes politically unpopular, or if Anthropic's efficiency approach closes faster than the memo assumes.
What the memo actually shows is that the race is real, the numbers are large, and both companies believe compute scale is the determining factor. Whether that belief survives contact with actual capital markets is the next question.
Sources: Key Context | Bloomberg | SiliconAngle/Nvidia $50-60B per GW | Reuters Breakingviews, April 2026