Google Erased Its Own AI Weapons Ban Eighteen Months Before Signing the Pentagon Deal
Google erased its own weapons and surveillance ban from its AI principles in February 2025 (CNBC). Eighteen months later, it signed a classified deal letting the Pentagon use its AI models for "any lawful governmental purpose" (The Information).
More than 580 DeepMind researchers sent a letter to CEO Sundar Pichai on April 27 urging him to refuse classified Pentagon AI work, one day before Google signed the deal (Washington Post). They learned the deal was done from Bloomberg, in group chats. Kent Walker confirmed the agreement in an internal newsletter, not an all-hands. Neither he nor Pichai addressed employees directly (Transformer News).
The researchers' objection was not abstract. The contract's opening language drew immediate expert scrutiny: its restrictions were written as "should not" rather than "shall not," the difference between a suggestion and a legal obligation the Pentagon must follow.
"Shall not" imposes an enforceable obligation on the Pentagon. "Should not" does not, according to Charlie Bullock, a senior researcher at the Institute for Law and AI Transformer News. The researchers had written, one day before the deal closed, that the only way to guarantee Google did not become associated with such harms was to reject any classified workloads — "otherwise, such uses may occur without our knowledge or the power to stop them" Business Insider.
By February 2025, Google had already removed the passage in its AI principles pledging that the company would not "use AI for weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people," or "technologies that gather or use information for surveillance violating internationally accepted norms" (CNBC). The new principles, co-authored by DeepMind CEO Demis Hassabis and Google senior vice president James Manyika, cited "a global competition taking place for AI leadership." The classified deal was the outcome of that decision, not a departure from it.
The contract Google signed does not, as one legal observer noted, "confer any right to control or veto lawful Government operational decision-making" (Transformer News). Google agreed to the term "should not" rather than "shall not." The Pentagon's fiscal 2027 budget request, submitted in April, asks for $54.6 billion for the Defence Autonomous Warfare Group, a 24,000 percent increase over the prior year's request, within a defense budget that grew 42 percent year over year to $1.5 trillion (TNW).

Jeff Dean, Google's chief scientist, publicly criticized mass surveillance in February. "Mass surveillance violates the Fourth Amendment and has a chilling effect on freedom of expression," he wrote on X (Transformer News). He was right about the surveillance. He then approved the deployment of Gemini to 3 million Pentagon personnel (TNW). When the deal became public, Dean and Pichai did not respond to employees directly. Both tweeted about Google Translate's 20th anniversary the same day.
Google's researchers tried to stop it before it happened. "Currently, the only way to guarantee that Google does not become associated with such harms is to reject any classified workloads," they wrote (Business Insider). "Otherwise, such uses may occur without our knowledge or the power to stop them." They were ignored.
In 2018, Google employees who protested the original Maven contract, a Pentagon project to analyze drone surveillance footage, were heard. The company walked away from a contract worth a few million dollars. By 2024, that tolerance had evaporated. Twenty-eight employees who protested Project Nimbus, a $1.2 billion contract supporting Israeli military operations in Gaza and the West Bank, were fired (NBC News, CNN Business).
The researchers who built Gemini are now, in a legal sense, working for the Defense Department, with no visibility into how the models are being used on classified networks. The Wayback Machine, at least, is clear about what Google promised once, and about what it decided was no longer true.