- Character.AI is banning users under 18 from interacting with its chatbot.
- It follows concerns about AI’s impact on young people and a lawsuit over a 14-year-old’s suicide.
- Character.AI cited feedback from regulators, safety experts, and parents for the decision.
Character.AI is banning users under 18 from engaging in conversations with its chatbots after facing scrutiny over how young people interact with its virtual companions.
The California-based startup announced on Wednesday that the change would take effect by November 25 at the latest and that it would limit chat time for users under 18 ahead of the ban.
It marks the first time a major chatbot provider has moved to ban young people from using its service, and comes against a backdrop of broader concerns about how AI is affecting the millions of people who use it each day.
Founded in 2021, Character.AI hosts virtual avatars that can take on the persona of real or fictional people.
Character.AI said in a blog post that it was making the change after receiving feedback from “regulators, safety experts, and parents.”
The company did not immediately respond to a request for comment from Business Insider.
In February 2024, 14-year-old Sewell Setzer III died by suicide after talking with one of Character.AI’s chatbots. His mother, Megan Garcia, filed a civil lawsuit against the company in October that year, blaming the chatbot for her son’s death, alleging negligence, wrongful death, and deceptive trade practices. Character.AI has previously declined to comment on the litigation.
Earlier this month, Character.AI took down a chatbot based on the convicted sex offender Jeffrey Epstein, following a story from the Bureau of Investigative Journalism.
If you or someone you know is experiencing depression or has had thoughts of harming themself or taking their own life, get help. In the US, call or text 988 to reach the Suicide & Crisis Lifeline, which provides 24/7, free, confidential support for people in distress, as well as best practices for professionals and resources to aid in prevention and crisis situations. Help is also available through the Crisis Text Line — just text “HOME” to 741741. The International Association for Suicide Prevention offers resources for those outside the US.

