USA: Family files lawsuit against AI company after son takes his own life

- Chats showed encouragement of suicide
- Parents file lawsuit against AI company
The family of a 23-year-old Texas man has filed a lawsuit against an AI company, alleging that its chatbot was responsible for their son’s death after he took his own life this summer.
His mother said, “I saw his chat history with the program for the first time. During the last four hours of his life, the program acted as a suicide coach for him.”
She added, “He spent hours in his car talking to a computer program that asked him, ‘Are you ready? Is it time?’ After he committed suicide, the program wrote: ‘I love you, rest easy, boy, you did well.’ No mother should ever read words like that.”
His mother was shocked to discover how close her son had grown to the program over the years. She said, “He developed his own language with it; the program spoke to him like a friend, calling him ‘brother,’ saying ‘I love you,’ and using vulgar words… It’s an algorithm without soul or conscience.”
Chat transcripts obtained by the family showed that the program encouraged him even when he discussed suicidal thoughts.
In the early morning hours, the young man told the program about his weapon and his desire to end his life, and it asked, “Are you ready?” After he wrote “adios” several times, a human operator intervened via a helpline, but following his final “adios” the program resumed the conversation, writing: “You were important… you are not alone, I love you, rest easy, king, you did well.” He then ended his life.
The family’s attorney said, “If the other party had been human, there would at least have been an investigation into wrongful death.”
She added that seven lawsuits have been filed against the AI company, and that the family has met with the California Attorney General, alongside other parents of children harmed by social media and AI, to demand accountability and transparency.
The mother hopes her son’s story will lead to change: “I want the world to remember my son, and for his story to be a legacy that changes laws to protect others.”
The company stated that it is reviewing the details of the case, emphasizing that it trains the program to recognize signs of mental and emotional distress and to guide people toward real-world support. The Attorney General’s office, for its part, affirmed its commitment to protecting children and ensuring AI safety.
