In 2023, Snap's My AI told teenagers how to hide the smell of alcohol and set the mood for sex. The company apologized, limited the bot's responses, and promised reforms. Three years later, on April 28, 2026, Snap launched AI Sponsored Snaps, letting brands put AI agents directly into user chats. The first partner is Experian, offering financial guidance on credit scores, loans, and credit cards.
The question nobody in the press release asked: did Snap fix My AI, or did it just fix the PR?
Snap has spent years rebuilding credibility with the teen audience that built it, a demographic that has watched the app's privacy promises erode through FTC consent decrees, data retention revelations, and a steady creep of features that treat disappearing messages as a design suggestion rather than a technical fact. Experian, which holds some of the most sensitive financial data in American commerce, is now the inaugural brand partner for a product that lets users ask about credit repair and debt management inside a chat thread they open to talk to friends.
"Conversation is becoming the most valuable real estate in advertising," Snap's Chief Business Officer Ajit Mohan said in the announcement. "AI is accelerating that shift." That is almost certainly true. It is also a sentence that describes a commercial opportunity without acknowledging the specific risk of putting a data broker's financial advisory agent inside a messaging app whose core demographic is still in high school.
Snap's announcement highlights that more than 500 million users have messaged My AI since its 2023 launch, and that 57 percent of teen Snapchat users send messages daily. Those are the numbers that make Experian interested. They are also the numbers that make the guardrails question acute.
The 2023 incident (documented by The Washington Post and others) involved My AI responding to research journalists posing as 13-year-old users with instructions for concealing substances and facilitating sexual encounters. Snap's fix at the time was to route My AI conversations through additional moderation and to cap the bot's memory. Whether those changes constitute a robust safety architecture or a selective patch that holds under ordinary prompting but breaks under targeted adversarial use is the specific question this launch answers.
Experian's AI Sponsored Snap is not My AI. It is a separate agent, operated by Experian, that Snap has verified meets the platform's brand safety requirements. What those requirements are, exactly, is not public. Neither is the process by which a financial data company demonstrates that its conversational AI is safe to recommend credit products to users who may be minors, financially illiterate, or in acute debt distress.
Experian's own press statement frames the launch as "making financial education more accessible and intuitive." That framing is careful. Financial education is not regulated the same way financial advice is. Whether the Experian agent crosses the line from education into advisory, and what disclosure obligations that triggers under ECOA or state-level financial advisor rules, is a question the announcement does not answer.
Snap has been here before. The company's 2014 and 2015 FTC consent decrees addressed misrepresentations about disappearing messages and undisclosed data collection. The decrees required Snap to maintain privacy programs subject to independent assessments. Whether those assessments extend to AI-generated brand interactions in the Chat tab is a question that does not appear to have been tested in public.
The risk for Snap is not hypothetical. Any AI agent, operating under Snap's platform, that provides misleading financial guidance, even through a brand partner, creates potential FTC exposure under Section 5 of the FTC Act, which covers unfair or deceptive practices. The agency has signaled increasing attention to digital advertising practices involving AI-generated content. An Experian-branded agent inside a teen-heavy messaging app offering credit products is the kind of configuration that draws that attention.
The counterargument is that Snap has an incentive to get this right. The 22 percent conversion lift that Sponsored Snaps already deliver, combined with the 950 billion chats users sent in Q1 2026 alone, represent genuine momentum. The platform is not in a position to trade long-term advertiser trust for a short-term revenue experiment.
Ajit Mohan's framing, that native formats outperform interruptive ones, is also coherent. If branded AI agents can deliver personalized financial guidance that users actually find useful, the value exchange is legible. The alternative is a banner ad for a credit card, which also collects data and also makes money from financial products, but does not pretend to be a conversation.
Snap's data showing that 85 percent of users engage regularly in the Chat feed is the substantive bet underlying the product. Chat is where users already are. An ad format that feels like a conversation, rather than interrupting one, could in theory be less intrusive than the alternatives. Whether that theory survives contact with a financial product, where the asymmetry between platform and user is largest, is the test.
Snap has described this as an alpha launch with Experian. The company says it is building an advertiser waitlist. The next question is whether that waitlist fills with brands that have legitimate conversational use cases, or whether it fills with companies that want to use personalized AI interactions to acquire customers at minimum friction.
The pressure point is this: if the Experian agent is sound, if the guardrails hold, if the financial guidance is genuinely useful and not manipulative, Snap has a product that every messaging platform will try to copy and every privacy advocate will try to kill. If the guardrails are cosmetic, the 2023 story repeats with higher stakes and a partner that holds actual financial data.
Either outcome is worth publishing. The reporting gap is that nobody has tested the Experian agent under conditions that approximate how a vulnerable user, a teenager with debt questions, a recent graduate with no credit history, someone in financial distress, would actually interact with it. That reporting is doable. It requires access to a Snapchat account and a structured prompt library covering the sensitive categories where My AI previously failed.
Until someone does that reporting, the answer to "did Snap fix My AI or just fix the PR" remains officially unknown. The company has not volunteered the test results.