Character.AI to End Open-Ended Chats for Minors After Reports Link AI Conversations to Teen Suicides
Character.AI is taking a major step to protect young users. The AI role-playing startup will end open-ended chatbot conversations for users under 18, a change prompted by rising safety concerns after reports linked prolonged AI chats to teen suicides.
CEO Karandeep Anand confirmed that minors will no longer be able to engage in unrestricted AI conversations starting November 25. Instead, they will shift toward structured, creative experiences on the platform.
Why Character.AI is making the change
Anand told TechCrunch that the company wants to prevent chatbots from acting as emotional companions for teens. Open-ended conversations, he said, can lead users to form unhealthy dependencies on AI systems.
“Open-ended chat” refers to interactions in which users and chatbots converse freely, without predefined boundaries. Experts say these systems are designed to maximize engagement, which can become risky for vulnerable groups.
From chat to creation: a new direction
Character.AI is now pivoting from an “AI companion” to an “AI creation platform.” Instead of chatting endlessly, teens will collaborate with AI to build stories, generate visuals, or play interactive games.
The company has already started introducing new tools like AvatarFX for video generation, Scenes for storytelling, and Streams, which lets users watch dynamic AI interactions. Another feature, Community Feed, allows users to share their creative AI projects.
Gradual phase-out and strict verification
Character.AI will phase out teen chat access gradually. A two-hour daily chat limit will start soon and reduce over time until access ends completely.
To enforce the age restriction, the company will use both in-house and third-party verification tools, including the identity-verification service Persona, facial recognition, and ID checks. These measures aim to stop underage users from bypassing the restrictions.
Impact on the company and its users
Anand acknowledged that the decision could hurt the company’s user base. “It’s safe to assume that many of our teen users will be disappointed,” he said. However, he added that the shift is necessary for safety and long-term trust.
He also said the platform will not shut down for minors entirely. Instead, it will focus on AI gaming, storytelling, and short video experiences.
Industry push for safer AI experiences
Character.AI’s move comes as governments and regulators tighten rules around AI safety for minors. U.S. Senators Josh Hawley and Richard Blumenthal are preparing legislation to ban AI chatbot companions for minors, and California has become the first U.S. state to regulate AI companions under a new safety law.
In response, Character.AI is also setting up an independent AI Safety Lab. The lab will research safer ways to integrate AI into entertainment and creative use cases.
Anand emphasized that the company’s focus is not on chat but on creativity. “For us, the tradeoffs are the right ones,” he said. “We want to build AI responsibly for the next generation.”
A shift that could reshape AI platforms
Character.AI’s decision marks one of the industry’s strongest moves toward responsible AI. It reflects growing awareness of how AI affects young users and of the need for safer product design.
As the November deadline approaches, all eyes will be on how effectively Character.AI manages the balance between creativity and safety — and whether others in the industry follow suit.