The FBI Is Buying Your Location Data Without a Warrant. It Never Stopped.

image from FLUX 2.0 Pro
The AI Surveillance Debate Is Missing the Point
While the tech industry has spent weeks debating whether Anthropic or OpenAI would allow their AI to be used for mass surveillance, FBI Director Kash Patel sat before the Senate Intelligence Committee on March 18 and quietly confirmed the practice is already happening — no AI required.
Under oath and in response to direct questioning from Senator Ron Wyden, Patel acknowledged that the FBI is purchasing commercially available location data on Americans from private data brokers, according to The Guardian. "We do purchase commercially available information that is consistent with the constitution and the laws under the Electronic Communications Privacy Act, and it has led to some valuable intelligence for us," Patel said. He did not commit to stopping the practice. Ars Technica also reported on the confirmation.
The statement marks a reversal from 2023, when then-Director Christopher Wray told the same body that the agency had ceased such purchases. The FBI had quietly restarted them.
Wyden's response was blunt: "Doing that without a warrant is an outrageous end-run around the Fourth Amendment."
The exchange is a useful corrective to how the public conversation around AI and surveillance has been framed. For months, the dominant story has been the standoff between Anthropic and the Department of Defense. Anthropic refused Pentagon demands that it allow "any lawful use" of its Claude AI, citing two specific red lines: no domestic mass surveillance and no lethal autonomous weapons, The Guardian reported. Bloomberg and CNBC, citing a senior DoD official, reported that the Pentagon formally informed Anthropic leadership on March 5 that the company and its products are deemed a supply chain risk — effective immediately. Anthropic filed two federal lawsuits challenging the designation as unconstitutional retaliation for protected speech. (A hearing is scheduled for March 24 in California federal court.)
In the aftermath, OpenAI signed a contract with the Pentagon, The Verge reported. Sam Altman framed the deal publicly as consistent with the company's red lines on surveillance and autonomous weapons. But Miles Brundage, OpenAI's former head of policy research, posted on X that "in light of what external lawyers and the Pentagon are saying, OpenAI employees' default assumption here should unfortunately be that OpenAI caved + framed it as not caving, and screwed Anthropic while framing it as helping them." The prohibition on surveillance in the deal applies only to actions that are "intentionally" and "deliberately" targeted at US persons — language that legal experts have noted historically gives the government wide latitude when operations are framed as incidental or bulk collection.
But the deeper problem predates both companies' contracts. Under current US law, the government cannot compel telecom companies to hand over cell phone location records without a warrant — the Supreme Court established that in its 2018 Carpenter v. United States decision. What the court did not address is whether the government can simply buy that same data from private data brokers who aggregate it from apps, browsers, and other online sources. The answer, practically speaking, is yes.
The data broker industry is worth hundreds of billions of dollars globally. Its product is detailed behavioral profiles assembled from location pings, browsing histories, purchasing records, and social network activity. This information is sold continuously to advertisers, but it is equally available to government agencies. Federal law enforcement does not need a judge, probable cause, or even a specific target. It can — and does — purchase bulk datasets that contain the location histories of millions of Americans.
Dario Amodei addressed this directly in a blog post during the Anthropic-Pentagon dispute: "Under current law, the government can purchase detailed records of Americans' movements, web browsing, and associations from public sources without obtaining a warrant. Powerful AI makes it possible to assemble this scattered, individually innocuous data into a comprehensive picture of any person's life — automatically and at massive scale."
That framing is important. AI does not create the surveillance capability — it accelerates it. The data already exists. The pipelines already exist. What AI adds is the ability to process and cross-reference at a scale and speed that was previously impractical. In 2019, the New York Times demonstrated with a leaked smartphone location dataset how easy it was to track and identify nearly any individual from ostensibly anonymized data, in one case identifying a senior Defense Department official and his wife from their daily movements. That was before modern large language models existed.
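The re-identification technique behind that Times investigation is simple enough to sketch. A minimal, hypothetical illustration in Python, using invented device IDs, coordinates, and addresses (none of this is the Times's actual methodology or data): "anonymized" pings still reveal each device's most frequent overnight location, which can be matched against public address records.

```python
from collections import Counter

# Hypothetical illustration only: each ping carries a pseudonymous device ID,
# an hour of day, and coordinates rounded to roughly 100 m. All values invented.
pings = [
    ("device_7f3a", 2, (38.889, -77.035)),   # 2 a.m. pings cluster at home
    ("device_7f3a", 3, (38.889, -77.035)),
    ("device_7f3a", 14, (38.871, -77.056)),  # 2 p.m. ping at a workplace
    ("device_9c1e", 1, (40.713, -74.006)),
    ("device_9c1e", 13, (40.748, -73.986)),
]

# Invented stand-in for public records mapping coordinates to addresses.
address_book = {
    (38.889, -77.035): "123 Example St",
    (40.713, -74.006): "456 Sample Ave",
}

def likely_home(device_pings):
    """Most frequent location seen between midnight and 5 a.m."""
    night = [loc for hour, loc in device_pings if 0 <= hour < 5]
    return Counter(night).most_common(1)[0][0] if night else None

# Group pings by device, then resolve each probable home to an address.
by_device = {}
for device, hour, loc in pings:
    by_device.setdefault(device, []).append((hour, loc))

for device, dp in sorted(by_device.items()):
    home = likely_home(dp)
    if home in address_book:
        print(device, "->", address_book[home])
# Each pseudonymous device resolves to a street address.
```

The point of the sketch is that no individual ping is identifying; the pattern is. Scaling this from five pings to billions is an engineering problem, not a legal one, and it is the part that AI makes cheap.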
The ICE mass deportation campaign is one documented example of this infrastructure in use. According to reporting by 404 Media, ICE used commercially available location data to monitor entire neighborhoods and track people to their homes and workplaces based on phone location signals. In 2024, a company used similar data to log nearly 600 visits to Planned Parenthood locations for a targeted anti-abortion advertising campaign.
Legislation to close the data broker loophole exists but has not moved. The bipartisan Government Surveillance Reform Act — sponsored in the Senate by Wyden and Senator Mike Lee, and in the House by Representatives Warren Davidson and Zoe Lofgren — would prohibit the federal government from purchasing location data, web browsing history, and communications content on Americans without a warrant or a FISA court order. Davidson said on social media after Patel's testimony that the practice is "a clear violation of the fourth amendment and is why I introduced the Government Surveillance Reform Act."
The bill has not passed.
For founders and investors watching the AI and government contracting space: the question of whether AI companies hold the line on surveillance language in their DoD contracts matters, but it is downstream of a more fundamental unresolved legal question. The government's mass surveillance infrastructure does not require AI cooperation because it does not require warrants. It requires data brokers to keep selling, and Congress to keep not acting.
Amodei was right that AI raises the stakes on this problem. He was also right that the judicial interpretation of the Fourth Amendment has not caught up. What neither the Anthropic standoff nor the OpenAI contract language resolves is whether that gap ever gets closed — and Patel's testimony this week suggests the executive branch is not waiting around for the answer.