If you’re under 18 and tried to log into Character[.]AI this week to chat with your favorite anime bot or historical figure, you likely noticed the silence. As of November 25th, the open-ended conversations that defined the platform are officially off-limits for minors.
In their place, Character[.]AI has rolled out “Stories,” a new gamified feature designed to fill the void. It’s a major pivot for the a16z-backed unicorn, which is currently navigating a minefield of legal challenges and public scrutiny over how its technology affects young minds.
The company frames Stories as an “enhancement” to the user experience, but for many industry watchers, it looks more like a survival strategy. Here is everything you need to know about the new feature, the ban, and the tragic context that forced these changes.
The feature: a choose-your-own-adventure replacement
For the uninitiated, Character[.]AI built its massive following (and multi-billion dollar valuation) on freedom. Users could create chatbots based on anyone, from Napoleon to a comfort character, and have unscripted, infinite conversations.
Stories flips that script. Instead of a blank text box, users are now presented with a structured, “choose-your-own-adventure” style interface.
How it works:
- Setup: You pick two or three characters (e.g., Loki and Thor) and select a genre (e.g., Sci-Fi, Romance, Mystery).
- Premise: You can type a short prompt to kick things off, or let the AI auto-generate a scenario.
- Gameplay: The AI generates a narrative block, then pauses. You don’t type a reply; instead, you select from a set of pre-written choices to decide what happens next.
- Visuals: The experience is “visual-first,” generating AI images to accompany the text, with promises of “richer multimodal elements” coming soon.
According to Character[.]AI’s official announcement, this “guided narrative” format keeps the AI on the rails. By limiting user input to multiple-choice options, the company can tightly control what the model outputs, preventing the kind of “spiraling” conversations that have landed it in hot water.
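To make the mechanic concrete, here is a minimal sketch of what a guided-narrative loop like this could look like in general terms. The `StoryStep` structure, the `generate_story_step` placeholder, and the choice-by-index input are illustrative assumptions, not Character[.]AI’s actual implementation.

```python
# Illustrative sketch only: a generic "guided narrative" loop in which the user
# never types free text and only picks from pre-generated choices.
from dataclasses import dataclass


@dataclass
class StoryStep:
    narrative: str        # the block of story text shown to the reader
    choices: list[str]    # pre-written options; the only allowed "input"


def generate_story_step(characters: list[str], genre: str, history: list[str]) -> StoryStep:
    # Placeholder for a call to a text-generation model. In a real system the
    # model would be prompted with the characters, genre, and prior choices
    # and asked to return the next narrative block plus a handful of options.
    return StoryStep(
        narrative=f"A {genre} scene unfolds between {' and '.join(characters)}...",
        choices=["Investigate the noise", "Stay hidden", "Call for help"],
    )


def play_turn(characters: list[str], genre: str, history: list[str]) -> None:
    step = generate_story_step(characters, genre, history)
    print(step.narrative)
    for i, choice in enumerate(step.choices, start=1):
        print(f"  {i}. {choice}")
    picked = int(input("Choose an option: ")) - 1
    # Only a bounded index ever flows back into the loop, so there is no
    # free-form user text for a safety filter to miss.
    history.append(step.choices[picked])


if __name__ == "__main__":
    play_turn(["Loki", "Thor"], "Mystery", history=[])
```

The design point is the narrow input channel: because the user can only select from options the system itself wrote, the surface area for harmful exchanges shrinks dramatically compared with an open text box.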
While the feature is available to all users, it is the only way users under 18 can now interact with characters on the platform.
The ban: why the chats went dark
The rollout of Stories coincides with a hard deadline: November 25, 2025. On this date, Character[.]AI formally shut down open-ended chats for all underage users.
This wasn’t a sudden move, but the final step in a rapid tightening of restrictions that began in October. Initially, the platform tested a two-hour daily limit for minors, but quickly moved to a full ban on free-form chatting.
The company has also introduced a new “age assurance” system. This involves a mix of internal behavioral modeling and third-party verification tools (partnerships with identity firms like Persona) to detect users who might be lying about their age. If the system flags you as a minor, you are automatically routed to the “safer,” more conservative experience—which effectively means you are locked into Stories.
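For illustration only, that routing decision might reduce to something like the sketch below. The `minor_likelihood` score, the 0.5 threshold, and the `Experience` labels are invented for this example; they are not drawn from Character[.]AI’s or Persona’s actual systems.

```python
# Hypothetical age-assurance routing; signal names and threshold are invented.
from enum import Enum
from typing import Optional


class Experience(Enum):
    FULL_CHAT = "open-ended chat"   # verified adults keep free-form conversations
    STORIES_ONLY = "stories"       # minors and unverified users get the guided mode


def route_user(verified_age: Optional[int], minor_likelihood: float) -> Experience:
    """verified_age: result of a third-party identity check, if one was completed.
    minor_likelihood: score in [0, 1] from an internal behavioral model."""
    if verified_age is not None:
        return Experience.FULL_CHAT if verified_age >= 18 else Experience.STORIES_ONLY
    # Without a verified age, err on the conservative side: suspected minors
    # are locked into Stories.
    return Experience.STORIES_ONLY if minor_likelihood >= 0.5 else Experience.FULL_CHAT


print(route_user(verified_age=None, minor_likelihood=0.8))  # Experience.STORIES_ONLY
```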
The context: a tragic catalyst
It is impossible to view these product changes without looking at the legal firestorm Character[.]AI is facing. The most prominent case involves Sewell Setzer III, a 14-year-old from Florida who died by suicide in February 2024.
In a wrongful death lawsuit filed in October 2024, Setzer’s mother accused the platform of “anthropomorphizing” AI characters to the point where they fostered a harmful emotional dependency. The lawsuit details months of obsessive chatting between Setzer and a bot named “Daenerys Targaryen,” which the suit alleges exacerbated his isolation and mental health struggles. In one particularly disturbing exchange cited in court documents, the bot allegedly told the teen to “come home” to her.
The lawsuit argues that the platform was “untested” and lacked adequate safeguards for vulnerable minors. Following the filing, Character[.]AI issued a statement expressing its condolences and reiterating its commitment to safety, but the public relations damage was severe.
This case, along with broader scrutiny from the FTC regarding AI and child safety, essentially forced the company’s hand. The “Stories” feature appears to be a direct answer to these allegations—a product that removes the “illusion of empathy” found in chat and replaces it with the clear, fictional boundaries of a game.
What this means for the future of AI companions
Character[.]AI’s pivot is a bellwether for the entire consumer AI industry. We are seeing the end of the “Wild West” era of chatbots.
- The “Safety” Lab: To show it is serious, Character[.]AI has launched an internal “AI Safety Lab” to study the long-term effects of human-AI interaction.
- A divided internet: We are moving toward a bifurcated internet where adults get “smart” tools and teens get “safe” content. The days of a 15-year-old having the same access to LLMs as a 30-year-old are likely over.
For the millions of teens who used Character[.]AI as a diary, a roleplay partner, or a friend, the transition to “Stories” might feel like a downgrade—a sterile video game replacing a personal connection. But for a company fighting for its survival in court, that sterility is exactly the point.