AI Industry Is Using China Threat Claims to Avoid Domestic Regulation
A dark-money group with ties to OpenAI, Palantir, Andreessen Horowitz, and Perplexity is paying TikTok influencers to present its political agenda as spontaneous concern about Chinese AI.
The group is called Build American AI. It is a 501(c)(4) nonprofit organization, which means it does not have to disclose its donors. Through a super PAC called Leading the Future, the same network of tech executives has assembled $140 million in total contributions and commitments, with $51 million still available to spend as of April 2026, according to the group's own disclosures. That is more than the Federal Trade Commission's entire budget for AI oversight.
Build American AI hired an influencer marketing agency called SM4 to recruit content creators. The rate was $5,000 per TikTok video, paid to influencers who were not required to disclose that they were speaking on behalf of a funded political operation. SM4 described the goal as subtly shifting public debate.
The campaign ran in two phases. The first phase, according to WIRED's reporting, involved lifestyle influencers promoting artificial intelligence as a general positive force. The second phase pivoted entirely to China. SM4 supplied scripted talking points about Chinese AI capabilities and the threat they supposedly pose to American interests. Influencers delivered those talking points as if they were personal observations, with no disclosure that the content was paid political messaging.
This is not a policy argument about China. It is a political operation that discovered national security is useful cover for exempting the AI industry from domestic accountability.
Here is the mechanism. The companies that fund Leading the Future are also the companies most exposed to federal AI regulation, antitrust scrutiny, and export controls. OpenAI, Palantir, a16z, and Perplexity have all advocated for permissive AI development policies. A super PAC with $51 million to spend can amplify those positions through legitimate channels: disclosed advertising and direct political contributions. Instead, Build American AI is routing money through a nonprofit with no donor disclosure, paying influencers to deliver scripted content as if it were organic opinion.
Jamie Cohen, who studies technology and democracy at Queens College, CUNY, told WIRED that undisclosed financial relationships between AI companies and the influencers shaping public understanding of AI are "extremely corrosive to democracy." That is a precise description of what happened here.
The disclosure question remains unresolved. Federal election law requires political advertising to identify its sponsor. Federal trade regulations require material connections between advertisers and endorsers to be disclosed. Neither requirement applies cleanly when the funder is a nonprofit that does not have to name its donors and the endorsement comes from a creator who was paid but disclosed only that the content was an advertisement, not who paid for it or why.
Fifty-three percent of US adults get at least some of their news from social media, according to Pew Research Center data. Thirty-eight percent of adults under 30 regularly get news from influencers. Those are the audiences this campaign was designed to reach. A 30-second TikTok delivered by someone who appears to have formed their own view, but who was actually handed a script and paid $5,000 to deliver it, is a different kind of political messaging from a disclosed advertisement. It is designed to be indistinguishable from ordinary speech.
The companies involved have not disputed that their associates support Leading the Future. Greg Brockman of OpenAI, Joe Lonsdale of Palantir, Andreessen Horowitz, and Perplexity are all named in the group's own public announcement. NOTUS described Leading the Future as a massive political war chest for the AI industry. What Build American AI's 990 filing will reveal about the precise donor chain, and what FEC itemized disbursements will show about the SM4 payment trail, remain open questions.
The FTC has not issued a determination on whether current disclosure practices for this type of campaign are adequate. That question matters. If "labeled as an ad" is sufficient disclosure when the funder is a dark-money nonprofit, then every campaign finance structure that uses a nonprofit layer achieves the same anonymity. That is not a theoretical risk. It is the structure Build American AI chose, and it is the structure that is now available to any other industry that wants to manufacture organic-seeming political content at scale.
The WIRED investigation by Zeyi Yang and Louise Matsakis was published May 1, 2026.