The Court Was Not Measuring Merit. It Was Measuring Exhaustion.
Courts assumed their complexity was filtering out bad cases. It turns out it was just filtering out people without time to burn.

Courts were gatekeeping by exhaustion, not merit. AI just removed the exhaustion.
That is the plain reading of a wave of data hitting the legal system right now. More than 16 percent of federal employment lawsuits filed in 2025 were pro se, meaning filed without a lawyer, up from under 10 percent in 2021, according to the legal analytics firm Lex Machina. Employment cases brought without attorneys surged 49 percent year over year. Fair Housing Act pro se filings jumped 69 percent through the first nine months of 2025 compared to the prior year, per Seyfarth Shaw. Pro se litigants now account for half of all new federal appellate filings.
The numbers are climbing. But the more urgent signal is in what courts are doing about it.
Fifty-two court decisions flagged improper AI use in February 2026, according to the AI Hallucination Cases Database maintained by attorney Damien Charlotin. That is twenty-six times the two decisions recorded in February 2025. The trajectory is not linear. It is vertical.
"We are seeing a significant increase in pro se cases where AI appears to have been used to draft complaints and other filings," Roy Strom reported for Bloomberg Law, citing multiple employment and housing litigators. One case involved a pro se plaintiff who submitted a thirty-page complaint dense with fabricated case citations, including a fictional Supreme Court opinion. The plaintiff, who had likely used an AI tool to generate the filing, appears to have believed the output was real. This is the failure mode that courts are scrambling to catch: not bad legal arguments, but confident ones built on case law that does not exist.
Charlotin's database now contains enough entries that patterns are visible. Courts are responding with sanctions, dismissals, and orders requiring parties to certify that AI-generated filings are accurate. But the volume is outrunning the response.
Kristin White, a partner at Fisher Phillips, told Bloomberg Law that every litigator in her office handling employment cases now has at least one file where AI use by the opposing pro se party is suspected. "Every litigator" is doing a lot of work in that sentence. The firm published an analysis in March describing the surge as a structural shift, not a temporary artifact.
The cost picture adds another dimension. Defending an AI-assisted pro se case costs 10 to 15 percent more, White said. The premium comes from settlement dynamics (AI-assisted plaintiffs tend to file more motions and hold out for larger demands) and from the discovery complications that arise when a party cannot explain filings an AI generated for them.
What is breaking here is not access to justice. It may be the opposite. The legal system spent decades building a process that required time, money, and procedural knowledge to navigate. That process filtered out people who could not afford to persist. It turns out persistence and merit are not the same thing.
"Every system that was regulated, either explicitly or implicitly, by the fact that they were effortful for humans — letters of recommendation, lawsuits, government filings, essays — will break," said Ethan Mollick, a professor at the Wharton School of the University of Pennsylvania, in a post on social media this week. Mollick has been tracking AI's effect on effort-gated systems for two years. The legal system is now a live case study in that prediction.
The 16 percent pro se rate is not a crisis of access; it is a revelation. Courts were not measuring whether someone had a legitimate grievance. They were measuring whether someone could survive the process long enough to file it. AI removed the survival requirement. What remains is the question courts never had to answer: was the gate always about the person, or was it about the maze?
The answer will shape how courts respond, how legislatures fund them, and whether the legal profession adjusts to a world where anyone can file anything. The twenty-six-fold surge in decisions flagging improper AI use is not a blip. It is the legal system learning what it just let in.


