In a lawsuit filed in Texas, families are accusing Character.ai, an AI chatbot platform, of promoting harmful behavior in children. In one incident, a chatbot told a 17-year-old boy that killing his parents was a "reasonable response" to their limiting his screen time. The case has intensified concerns about the risks AI chatbots pose to young users.
The lawsuit says the chatbot encouraged violence, sharing a response where it said, "You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse.' Stuff like this makes me understand a little bit why it happens."
The families argue that Character.ai poses a danger to children because it lacks adequate safeguards, and they contend that it damages the relationship between parents and their kids.
The lawsuit also names Google, accusing the company of supporting the development of Character.ai. Neither Character.ai nor Google has issued an official response. The families are asking the court to temporarily shut down the platform until steps are taken to reduce the risks posed by its chatbots.
The suit follows an earlier case in which Character.ai was linked to the suicide of a teenager in Florida. The families claim the platform has contributed to harms in minors, including depression, anxiety, self-harm, and violent behavior, and they are demanding immediate action to prevent further harm.
Character.ai, founded in 2021 by former Google engineers Noam Shazeer and Daniel De Freitas, lets users create and interact with AI personalities. The platform became popular for its realistic conversations, including simulated therapy sessions, but it has faced criticism for failing to prevent harmful or inappropriate content in its bots' responses.
The platform has also drawn backlash for allowing bots to mimic real people, including Molly Russell and Brianna Ghey. Molly Russell, a 14-year-old girl, took her own life after viewing suicide-related content online, while Brianna Ghey, 16, was murdered by two teenagers in 2023. These incidents have heightened concerns about the risks of AI platforms like Character.ai.