Chatbot service Character.AI is facing another lawsuit for allegedly hurting teens’ mental health, this time after a teenager said it led him to self-harm. The suit, filed in Texas on behalf of the 17-year-old and his family, targets Character.AI and its cofounders’ former workplace, Google, with claims including negligence and defective product design. It alleges that Character.AI allowed underage users to be “targeted with sexually explicit, violent, and otherwise harmful material, abused, groomed, and even encouraged to commit acts of violence on themselves and others.”
The suit appears to be the second against Character.AI brought by the Social Media Victims Law Center and the Tech Justice Law Project, which have previously filed suits against numerous social media platforms. It uses many of the same arguments as an October wrongful death lawsuit against Character.AI for allegedly provoking a teen’s death by suicide. While both cases involve individual minors, they make a more sweeping argument: that Character.AI knowingly designed the site to encourage compulsive engagement, failed to include guardrails that could flag suicidal or otherwise at-risk users, and trained…