Anthropic just signed the biggest compute deal in AI history. The real story is the power grid underneath it.
Anthropic signed more compute infrastructure in a single day than most AI labs secure in a year.
On May 6, the company announced a cascade of commitments: up to 5 gigawatts from Amazon with nearly 1 gigawatt online by end of 2026; another 5 gigawatts from Google and Broadcom coming in 2027; $30 billion in Azure capacity from Microsoft and NVIDIA; and a $50 billion American AI infrastructure investment with Fluidstack, according to the company's blog post. The deals were announced together and represent the most aggressive multi-partner compute buildout any single AI lab has disclosed at once.
That physical infrastructure is why Anthropic also removed usage limits for its paying subscribers that same day. It doubled five-hour rate limits for Pro, Max, Team, and seat-based Enterprise plans, eliminated peak-hour caps for Pro and Max accounts, and raised API limits for Claude Opus models, though Anthropic described the Opus increases as substantial without specifying exact figures. The subscriber relief is real and immediate. It is also downstream of the power buildout.
Anthropic's API traffic grew 17 times year-on-year, and its run-rate revenue crossed $30 billion, according to a live blog of the event. That kind of growth has a physical limit: you cannot run that much inference on borrowed cloud capacity. The company needed its own.
The competitive context makes the power grab comprehensible. A Menlo Ventures survey published in December 2025 found Anthropic capturing roughly 40 percent of US enterprise AI spending, while OpenAI has fallen to around 27 percent, a finding that tracks with the trajectory the company described at its developer day. The pressure on that share is concrete: Mercado Libre, the Latin American e-commerce and fintech company with 23,000 engineers, is aiming for 90 percent autonomous coding across its engineering workforce by the third quarter of this year. The math is direct: 23,000 engineers times 90 percent is roughly 20,700 coders, and at a conservative one seat license per coder, typical for AI coding tools, that is more than 20,000 seats from a single customer. That demand is not theoretical. It is a purchase order.
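The seat arithmetic above can be sketched in a few lines. This is a back-of-envelope floor, not a forecast, and it assumes exactly one seat license per autonomous coder:

```python
# Back-of-envelope seat math for the Mercado Libre figures cited above.
# Assumption: one seat license per autonomous coder (a conservative floor).
engineers = 23_000
autonomous_share = 0.90  # 90 percent autonomous coding target

implied_seats = round(engineers * autonomous_share)
print(implied_seats)  # 20700
```

Even this floor ignores non-engineering users and multi-model seats, which is why a single customer's target translates into a sizable standing order.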
The implication is straightforward: enterprise AI demand is no longer bounded by what models can do but by where the computers are. CEO Dario Amodei described the compute constraint at the San Francisco event, telling CNBC that 80-fold annualized growth in the first quarter is why the company has struggled to keep up with demand. What gives Anthropic a moat is not any single deal but the cumulative effect: guaranteed capacity across four cloud partners plus a dedicated data center, spread across different geographies and silicon, lets Anthropic offer enterprise customers something competitors cannot match at this scale. The infrastructure commitments suggest that the real moat in frontier AI is now measured in gigawatts, not benchmark scores.
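One way to read the "80-fold annualized" figure, assuming it means a quarterly multiple compounded over four quarters (an interpretation, not something Anthropic has specified), is to take the fourth root:

```python
# Sketch: if an "80-fold annualized" rate means quarterly growth compounded
# over four quarters, the implied quarter-over-quarter multiple is 80 ** 0.25.
# This reading is an assumption; the company did not define the term.
annualized_multiple = 80
quarterly_multiple = annualized_multiple ** 0.25
print(round(quarterly_multiple, 2))  # roughly 2.99x per quarter
```

On that reading, traffic would roughly triple every quarter, which is the pace that makes borrowed cloud capacity untenable.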
Ryan Mallory, CEO of data center operator Flexential, put the dynamic plainly: serious companies are discussing compute capacity in space because the market is searching aggressively for power and scale. Anthropic has said it is exploring a partnership with SpaceX on orbital AI compute, in theory multiple gigawatts of capacity, though no independent sources have confirmed the plan. The terrestrial side of the SpaceX deal, more than 300 megawatts and 220,000 NVIDIA GPUs at the Colossus 1 data center in Memphis, was reported separately by Reuters. It is also the most politically charged element of the announcements, given Elon Musk's public reversal from calling Anthropic misanthropic in February to calling its team highly competent and its work aligned with humanity's interests.
What to watch next is whether Anthropic can build the physical infrastructure fast enough to meet the demand it has created, and whether the multi-partner compute strategy pressures OpenAI's exclusive Azure partnership as enterprise buyers increasingly shop for capacity across multiple providers. On the second question, analyst commentary has not yet been published.