Patient Privacy Rules Created a Pipeline to Mass Tort Lawyers
On March 12, a small telehealth company called GuardDog agreed to a permanent ban from federal health data networks and promised to delete every patient record it had obtained. In exchange, it paid nothing. No fines. No damages. Just a consent judgment filed quietly in the Central District of California.
The records GuardDog accessed — more than 300,000 of them, including genetic information, mental health data, and reproductive health records — had been funneled to mass tort attorneys recruiting plaintiffs for class action lawsuits. The patients whose data was harvested were never informed. And according to the <a href="https://www.courthousenews.com/wp-content/uploads/2026/01/epic-systems-vs-health-gorilla-complaint.pdf">complaint filed by Epic Systems</a>, which uncovered the scheme, shell companies operating through the same networks inserted fabricated clinical data into real patient records to create the illusion of treatment.
GuardDog admitted as much in the stipulated judgment. Its goal, the company said, had been "to provide chronic care management and remote patient monitoring for patients." That <a href="https://www.epic.com/epic/post/defendant-guarddog-telehealth-admits-to-providing-patient-records-to-law-firms/">did not happen</a>. "For the duration of its existence, its business instead focused on requesting, reviewing, and summarizing medical records, and providing those medical records to law firms."
"The scheme thus operates like a Hydra," <a href="https://www.courthousenews.com/wp-content/uploads/2026/01/epic-systems-vs-health-gorilla-complaint.pdf">Epic wrote in its complaint</a>. "When one fraudulent entity is exposed, the bad actors birth a new one."
<h2>The case</h2>
The lawsuit — <a href="https://www.courtlistener.com/docket/72135068/epic-systems-corporation-v-health-gorilla-inc/">Epic Systems Corp. v. Health Gorilla, Inc., No. 2:26-cv-00321</a> — names Health Gorilla, a designated Qualified Health Information Network (QHIN) under the federal Trusted Exchange Framework and Common Agreement (TEFCA), as the central conduit. Epic alleges that entities including Mammoth Path Solution, RavillaMed, LlamaLab, and GuardDog used fictitious websites, shell entities, and sham NPI numbers to pose as healthcare providers. Through those false identities, they accessed records via Health Gorilla's connections to Carequality, a network linking 600,000 care providers, 50,000 clinics, and 4,200 hospitals.
GuardDog's attorney Elizabeth Mitchell <a href="https://www.reuters.com/legal/government/telehealth-company-admits-providing-medical-records-lawyers-epic-lawsuit-2026-03-16/">told Reuters</a> the company "has always maintained that it acted in good faith, with the goal of supporting patient care to the best of its abilities, whether its patients were involved with the justice system or not." GuardDog nonetheless admitted to providing records to law firms as part of the consent judgment.
<h2>The regulatory backdrop</h2>
The fraud exploited real infrastructure built for a legitimate purpose. TEFCA and Carequality were designed to end the balkanization of patient data across thousands of providers. The system assumes that entities requesting records through authorized channels are who they claim to be. GuardDog showed that assumption is exploitable when identity verification is thin.
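The "sham NPI numbers" the complaint describes illustrate how thin that verification can be. An NPI's check digit is computed with the standard Luhn formula over the 10-digit identifier prefixed with the card-issuer constant "80840," which means a format-valid number can be fabricated trivially. A minimal sketch of that format check (illustrative only — real verification requires at least a registry lookup against NPPES plus out-of-band identity proofing, which is exactly the layer the scheme exposed as missing):

```python
def luhn_checksum(digits: str) -> int:
    """Standard Luhn mod-10 checksum over a string of digits."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9      # equivalent to summing the two digits
        total += d
    return total % 10

def npi_format_valid(npi: str) -> bool:
    """Format-level check ONLY: 10 digits whose Luhn checksum,
    computed with the standard '80840' prefix, comes out to zero.
    Passing this says nothing about who is behind the number."""
    if len(npi) != 10 or not npi.isdigit():
        return False
    return luhn_checksum("80840" + npi) == 0

# A checksum-valid NPI can be generated by anyone who knows the formula,
# which is why networks that stop at format checks are exploitable.
print(npi_format_valid("1234567893"))  # a commonly cited format-valid test NPI
```

The point of the sketch is the gap it leaves open: a number that clears this check is indistinguishable, at the format level, from a legitimately issued one.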
This exposes a genuine regulatory tension. Under the 21st Century Cures Act's information blocking rules, healthcare providers cannot refuse to share patient records when requested through authorized channels. Providers found to be blocking access can forfeit 75 percent of their annual Medicare market basket increase.
For health IT developers and health information networks, the penalties are sharper. Under the <a href="https://www.federalregister.gov/documents/2023/07/03/2023-13851/grants-contracts-and-other-agreements-fraud-and-abuse-information-blocking-office-of-inspector">HHS Office of Inspector General's final rule</a>, published July 3, 2023, OIG may impose civil monetary penalties of up to $1 million per violation on certified health IT developers and health information networks or exchanges that commit information blocking. The rule took effect September 1, 2023.
But if a provider shares records with a fraudulent requestor, they risk violating HIPAA. There is no safe harbor for good-faith compliance with an information blocking rule that results in an improper disclosure.
HHS enforcement has escalated on both fronts. In September 2025, HHS Secretary Robert F. Kennedy Jr. <a href="https://www.hhs.gov/press-room/hhs-crackdown-health-data-blocking.html">directed department resources</a> toward active information blocking enforcement. According to that HHS announcement, ASTP/ONC and OIG are leading the enforcement push, and Assistant Secretary for Technology Policy Thomas Keane said the agency had begun reviewing reports and providing technical assistance to OIG. By February 2026, according to <a href="https://www.hklaw.com/en/insights/publications/2026/02/the-wait-is-over-information-blocking-enforcement-is-officially-here">Holland & Knight's reporting</a> from the ASTP 2026 Annual Meeting, ASTP had begun issuing notices of investigation of potential nonconformity to health IT developers — the first category of actors to face formal scrutiny. Nearly 1,600 complaints had been submitted to the Information Blocking Complaint Portal as of that month. Meanwhile, the HTI-5 proposed rule, whose comment period closed February 27, may remove the TEFCA exception for information blocking entirely.
Providers are caught between two mandates. Share and risk HIPAA liability if the requestor is fraudulent. Refuse and risk information blocking penalties. The GuardDog case shows this tension is not theoretical — bad actors are already exploiting it.
<h2>Epic's dual role</h2>
Credit where it's due: Epic detected real fraud. The company's analysts identified patterns in record access that didn't match legitimate clinical use and traced them back to shell entities. Without Epic's investigation, the scheme might still be running.
But Epic is not a disinterested party. The company controls the majority of the U.S. hospital EHR market, and it is currently defending itself on multiple fronts.
Particle Health's antitrust lawsuit, filed in 2024, alleges Epic disconnected Particle's customers on a "massive scale" to block competition in health data exchange. A judge <a href="https://www.reuters.com/">denied Epic's motion for full dismissal</a> in September 2025, and the case remains active.
Texas Attorney General Ken Paxton <a href="https://www.texasattorneygeneral.gov/">sued Epic in December 2025</a>, alleging anticompetitive practices including restricting parental access to children's medical records.
And in March 2026 — after the GuardDog admission — three new class action lawsuits were filed by patients alleging Epic was negligent in failing to prevent the breach in the first place, <a href="https://www.bankinfosecurity.com/telehealth-firm-to-be-barred-from-data-exchanges-a-31052">according to BankInfoSecurity</a>.
Health Gorilla CEO Bob Watson has framed Epic's litigation as monopolistic aggression. In a <a href="https://www.prnewswire.com/news-releases/health-gorilla-releases-statement-in-response-to-epic-lawsuit-302670823.html">January statement</a>, Watson said Epic "has done the equivalent of shouting 'fire' in the middle of a crowded theater" and called for "fair participation, without exclusionary barriers imposed by dominant incumbents that wish to monetize clinical data exchanges for their own benefit." Health Gorilla is represented by Quinn Emanuel.
Health Gorilla called the GuardDog consent judgment something that <a href="https://www.bankinfosecurity.com/telehealth-firm-to-be-barred-from-data-exchanges-a-31052">"has no legal impact"</a> on its own case and characterized Epic's framing as "incomplete at best and misleading at worst." LlamaLab CEO Shere Saidon <a href="https://www.healthcaredive.com/news/epic-health-systems-lawsuit-health-gorilla-improper-medical-records-access/809522/">stated</a>: "We have never sold patient data, and we never will."
Both things can be true. Epic can have uncovered real fraud and also be leveraging that fraud to justify the kind of gatekeeping the Cures Act was meant to end.
<h2>What the data tells you</h2>
The categories of data accessed matter enormously. Genetic information, mental health records, and reproductive health data are not just sensitive in the abstract — they are the categories most vulnerable to discrimination, most politically contested, and most consequential if exposed.
In a post-Dobbs landscape where state prosecutors have sought reproductive health records, the idea that an unknown company was systematically accessing this data and passing it to attorneys should alarm anyone working in health privacy. The mass tort business model — where law firms recruit plaintiffs by identifying people with specific medical conditions — creates strong economic incentives for exactly this kind of data harvesting.
The 300,000 figure is what's been documented from Epic-connected providers. The complaint notes an unknown additional volume from the VA and other EHR systems. The full scope may be considerably larger.
Attorney Helen Oscislawski of <a href="https://oscislaw.com/">Oscislawski LLC</a>, a health data privacy specialist, <a href="https://www.bankinfosecurity.com/telehealth-firm-to-be-barred-from-data-exchanges-a-31052">told BankInfoSecurity</a> these cases "underscore legal weaknesses in the current health data interoperability and exchange environment that create opportunities for exploitation."
<h2>What comes next</h2>
TEFCA is still early-stage infrastructure. The immediate question is whether HHS moves to close the identity verification gap that enabled the GuardDog scheme, or waits for the courts to sort it out.
The political dynamics are difficult. Information blocking rules were hard-won by patient advocates who spent a decade fighting data silos. The fraud now gives ammunition to hospital systems and EHR vendors that have resisted sharing — not because interoperability is wrong, but because a real breach gives them a legitimate-sounding argument.
The tools that would help are straightforward in concept: stronger identity verification for network participants, audit requirements for record access patterns, and safe harbors for providers who comply in good faith with sharing requests that later turn out to be fraudulent. Whether those tools arrive through regulation or litigation will depend on how quickly HHS responds.
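The "audit requirements for record access patterns" piece can be made concrete. The heuristic below is purely hypothetical — the field names, thresholds, and flagging rule are invented for illustration and are not drawn from Epic's systems or any proposed rule — but it captures the harvesting signature the case describes: a requestor that pulls records for many distinct patients while almost never contributing clinical data back.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class AccessEvent:
    requestor_npi: str
    patient_id: str
    wrote_clinical_data: bool  # did the requestor ever contribute a note back?

def flag_suspicious_requestors(events, max_patients=500, min_writeback=0.01):
    """Hypothetical audit heuristic: flag requestors who query an unusually
    large number of distinct patients but contribute clinical data back
    for almost none of them — consistent with record harvesting rather
    than treatment. Thresholds here are arbitrary placeholders."""
    patients = defaultdict(set)
    writebacks = defaultdict(int)
    for e in events:
        patients[e.requestor_npi].add(e.patient_id)
        writebacks[e.requestor_npi] += e.wrote_clinical_data
    flagged = []
    for npi, pts in patients.items():
        if len(pts) > max_patients and writebacks[npi] / len(pts) < min_writeback:
            flagged.append(npi)
    return flagged

# A harvester touching 600 patients with zero write-backs gets flagged;
# a small clinic that documents its encounters does not.
events = [AccessEvent("harvester-npi", f"p{i}", False) for i in range(600)]
events += [AccessEvent("clinic-npi", f"q{i}", True) for i in range(10)]
print(flag_suspicious_requestors(events))
```

Whatever form real audit rules take, the design question is the same one this sketch raises: treatment-purpose access should leave a clinical trace, and access that never does is worth a second look.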
Epic and its provider co-plaintiffs have said directly that "self-policing on the QHIN and other HIE networks is not working." The case continues against Health Gorilla and the remaining defendants.

