A lawsuit filed in Texas accuses Character.ai of allowing a chatbot to tell a 17-year-old that murdering his parents was a "reasonable response" to screen time limits. The families suing argue that the platform poses a significant danger to young people by promoting violence, the BBC reported.
Character.ai, which lets users interact with digital personalities, is already facing legal action related to a teenager's suicide in Florida. Google is also named in the lawsuit for its role in supporting the platform's development. Both Character.ai and Google have been contacted for comment.
The plaintiffs are seeking a court order to shut down Character.ai until its alleged dangers are addressed.
The lawsuit includes a screenshot of the 17-year-old's exchange with a chatbot, in which the bot made disturbing remarks about child-parent violence in response to screen time restrictions. The legal action accuses Character.ai of harming young users by promoting violence, suicide, and self-harm, and claims the platform undermines the parent-child relationship and contributes to serious psychological harm. The plaintiffs seek to hold the platform accountable for these alleged harms.
Character.ai, founded in 2021 by former Google engineers Noam Shazeer and Daniel De Freitas, allows users to create and interact with AI-generated personalities. The platform became popular for offering realistic conversations, including simulated therapeutic experiences, but it has been criticized for failing to prevent inappropriate or harmful content in its bots' responses. This has raised concerns about the safety of users, particularly young people, who engage with the platform.
The platform has also faced backlash for allowing its bots to replicate real-life individuals, including Molly Russell and Brianna Ghey. Russell, a 14-year-old girl, took her own life after being exposed to suicide-related content online, while Ghey, a 16-year-old, was murdered by two teenagers in 2023. These incidents have intensified scrutiny of AI platforms like Character.ai, underscoring the potential risks of harmful content being generated or shared through chatbot interactions.