Google's $20 billion cloud quarter puts pressure on AI labs that rent the stack
Google just gave enterprise buyers a harder question than which model is smartest. If one vendor owns the model, the cloud service that sells it, and the chips underneath both, and that setup is now driving real customer spending, then the AI race looks less like a benchmark fight and more like a fight over who can deliver enough working capacity. That is the pressure behind Google's latest earnings remarks, and the reason Gizmodo framed the result as a fresh shot at OpenAI.
According to Google, Cloud revenue grew 63 percent year over year to more than $20 billion in the first quarter, while revenue from products built on its generative AI models grew nearly 800 percent from a year earlier. Sundar Pichai, chief executive of Google and Alphabet, also said on the earnings call, as quoted by CNBC, that enterprise AI solutions became Google Cloud's primary growth driver for the first time. That matters more than another Gemini benchmark chart because it suggests Google's integrated stack is showing up in paying customer demand.
Google's other numbers point in the same direction. Google said its cloud backlog, meaning signed business it has not yet fully delivered, nearly doubled quarter over quarter to more than $460 billion, according to the same earnings remarks. It also said direct customer use of its first-party models rose to more than 16 billion tokens per minute, up from 10 billion the previous quarter, and that Gemini Enterprise paid monthly active users grew 40 percent quarter over quarter. Those are still company-reported numbers, not independently audited figures.
The catch is narrower than the broad compute-doom narrative now circulating in AI finance chatter. Pichai said Google is "compute constrained in the near term" and that cloud revenue would have been higher if the company had been able to meet demand, according to CNBC. TechCrunch reported that Google expects to work through about half of that backlog over the next 24 months. That does not prove Google cannot satisfy all AI demand in some general sense. It does show that even Google's vertically integrated pitch comes with a near-term delivery limit.
That matters well beyond Google's quarter. OpenAI and Anthropic have spent the past year arguing, in different ways, that model quality and developer adoption will pull enterprise spend their way. Google is now making a blunter claim. Owning the model and the silicon is not just a research advantage. It can become a revenue engine inside cloud. If that holds, labs that rent more of their stack look more exposed, because part of the sales pitch shifts from model quality alone to whether the vendor can deliver the infrastructure underneath it.
Google also used the call to show why this is not a clean victory lap. The company said it has cut the cost of core AI responses by more than 30 percent since moving AI Overviews and AI Mode to Gemini 3, according to Google's remarks. Yet it still raised its 2026 capital expenditure guidance to $180 billion to $190 billion, and expects 2027 capex to increase significantly, CNBC reported. Better models and lower inference costs help, but they do not erase the need to build an enormous amount of capacity just to keep up.
So the useful read on this quarter is not that Google beat OpenAI, or that Gemini finally won an argument on social media. It is that Google put unusually concrete numbers behind a theory the industry keeps hinting at: in enterprise AI, owning more of the stack can turn into demand faster than a model-only story can. The next thing to watch is whether Google can turn that backlog into delivered usage fast enough to prove its stack advantage is durable, not just well instrumented.