On July 15, 2026, roughly 660 million registered users of AI companionship products in China face a countdown that did not exist ninety days earlier. That is the day the world's first comprehensive law targeting AI emotional services takes effect — a regulation signed by five Chinese agencies on April 10, and sharper than anything the EU or US has enacted on the same subject. The Cyberspace Administration of China published the full text on its website.
Article 14 is the core of it. It prohibits providing virtual intimate relationships — including virtual relatives or virtual partners — to users under the age of 14. There is no parental consent loophole. There is no safety mode that saves the product. For minors in China, these services become inaccessible.
The global context makes the reversal stark. The EU AI Act requires developers to disclose that users are talking to an AI system — a transparency rule built on the assumption that informed adults can choose their relationships. China is saying something categorically different: these relationships cannot exist for minors, period. Western governments have discussed protecting minors from manipulative AI design. Beijing signed it into law.
For the companies operating in this space, the practical consequences are immediate. Xiaoice — the AI companion product originally developed inside Microsoft before being spun off as an independent company — has 660 million registered users according to ABC News. Under the new rules it must complete a formal safety assessment and switch minor users into a protection mode before July 15 — and, under Article 14, stop serving under-14 users entirely. Baidu's Xiaokan Planet and Tencent's Zhumengdao face identical requirements, as the South China Morning Post has reported. All three companies declined to comment.
The compliance trigger is a safety assessment requirement for any platform with one million registered users or one hundred thousand monthly active users. Violations involving genuine harm carry fines of up to 200,000 RMB — roughly $27,500. Providers must also interrupt continuous sessions after two hours with a dialogue box or notification reminding users how much time they have spent.
The scope of what counts as a virtual intimate relationship is still being refined through a public comment period that closes May 25. The regulation covers services providing continuous emotional interaction simulating human personality traits, explicitly carving out customer service, knowledge Q&A, work assistants, educational tools, and scientific research tools. That line will be contested. A productivity assistant is not covered. A chatbot designed to feel like a friend who remembers your day almost certainly is.
The exemptions are notable for what they leave untouched. Customer service, productivity, education, and scientific research are carved out — sectors Beijing already controls or considers strategic. The largest AI deployments in China thus fall outside the regulation's scope. The targets are the products built for emotional connection: companionship apps, virtual partner products, social chatbots.
What China has built is a legal floor that no other major economy has matched. For companies building emotional AI products globally, the question is practical: verify ages and cut off under-14 users in China, or stop operating there entirely. The comment period closes May 25. The effective date is July 15.