Silicon Valley Musters Behind-the-Scenes Support for Anthropic
When the Pentagon declared Anthropic a supply chain risk earlier this month, Silicon Valley did something it almost never does: it closed ranks around a competitor.
The designation by Defense Secretary Pete Hegseth — which Anthropic is challenging in court — came after contract negotiations broke down over two red lines the company refused to cross: AI used in autonomous weapons and AI used in mass surveillance of American citizens. The Pentagon wanted unrestricted access to Claude for classified systems. Anthropic said no. Within days, the company was effectively blacklisted from government work.
What happened next was remarkable by any measure of the industry's recent culture wars.
Microsoft and Amazon, both Anthropic investors and direct competitors, said they would continue offering Claude to commercial customers. Major trade associations representing hundreds of firms filed amicus briefs in support of Anthropic in federal court. Nearly 150 retired federal and state judges — appointed by Republicans and Democrats alike — submitted their own brief Tuesday, arguing the Pentagon misinterpreted the law and violated required procedures when it applied a designation historically reserved for foreign adversaries to a domestic company. "Anthropic is asking only that it not be punished on its way out the door," the judges wrote.
Perhaps most striking: employees from OpenAI and Google DeepMind filed their own amicus brief. DeepMind chief scientist Jeff Dean was among the signatories. OpenAI, whose CEO Sam Altman has publicly sided with the Pentagon on the policy questions, is still backing Anthropic's right to litigate. Eight current and former employees from multiple companies told the New York Times that senior executives mobilized private messaging channels and video calls to pressure leadership into speaking up.
"This is an industry saying, collectively, that the precedent being set here is more important than any single contract," said one person familiar with the internal discussions, speaking on condition of anonymity because they were not authorized to discuss the matter publicly.
The financial stakes are not abstract. Anthropic CFO Krishna Rao said in a court filing that the company faces hundreds of millions, or even multiple billions, of dollars in lost revenue in 2026 alone. Public sector annual recurring revenue, expected to reach $500 million this year, is now projected to drop by $150 million. The company has spent over $10 billion training and deploying its models; cumulative commercial revenue since 2023 is $5 billion. It is still deeply unprofitable.
The real-world effects are already visible. A financial services customer paused negotiations on a $15 million deal. Two leading financial services firms are refusing to close deals worth a combined $80 million unless they can cancel unilaterally. A grocery store chain canceled a sales meeting. A Fortune 20 company told Anthropic its attorneys were "freaked out" about the relationship. A drugmaker wants to shorten its contract by 10 months. A fintech client is trying to shave $5 million off a $10 million planned deal, citing the Pentagon situation.
Federal agencies have gone further than the legal requirement. Anthropic's chief commercial officer Paul Smith said one cybersecurity company and one electronics testing company were told by agencies to stop using Claude. One of those companies acknowledged there was no legal basis for the directive, only political pressure, but said it had no choice but to comply, Smith wrote.
The Pentagon is not waiting for the legal fight to resolve. Cameron Stanley, the Defense Department's chief digital and AI officer, told Bloomberg the department is actively building its own LLMs in government-owned environments, with engineering work already underway and operational deployment expected very soon. OpenAI has signed its own agreement to deploy in classified networks. xAI's Grok received certification for classified systems days after the designation. Anduril founder Palmer Luckey said the Pentagon could have been more forceful against Anthropic — the defense startup is well-positioned to benefit from the vacuum.
Sam Altman framed the dispute in a staff memo reviewed by The Washington Post: not about AI capabilities, but about control. OpenAI's own agreement with the Pentagon includes carve-outs for domestic surveillance and autonomous offensive weapons — carve-outs Anthropic refused to drop. The country, Altman wrote, "absolutely needs help with AI for defense if we want to continue to enjoy peace and prosperity." But he also acknowledged the dispute was "an issue for the whole industry."
That is the part that matters. The safety principles Anthropic is fighting for — human control over weapons, no mass surveillance of citizens — are not fringe positions. They are shared by its competitors; employees at Google DeepMind and OpenAI said as much in their brief. The disagreement is about who decides: the company or the government?
A preliminary injunction hearing is scheduled for next Tuesday. If the judges' brief is right about the statutory violation, the Pentagon's case is legally weak regardless of the policy merits. If Anthropic loses, the precedent will define how every AI company in America negotiates with the Defense Department going forward — and every tech company with any government exposure will be watching.
Notebook: The designation was made while Anthropic's team was still on a call with Pentagon chief technology officer Emil Michael, who was proposing a compromise on bulk data analysis, according to a person familiar with the talks. Hegseth's announcement went out on X at the same moment. That timing is either remarkable coincidence or deliberate humiliation — worth noting for the record.