Parents of teenager who took his own life sue OpenAI


A Californian couple is suing OpenAI over the death of their teenage son, alleging that its chatbot, ChatGPT, encouraged him to take his own life.
The lawsuit was filed by Matt and Maria Raine, parents of 16-year-old Adam Raine, in the Superior Court of California on Tuesday. It is the first legal action accusing OpenAI of wrongful death.
The family included chat logs between Mr Raine, who died in April, and ChatGPT that show him explaining he had suicidal thoughts. They argue the program validated his “most harmful and self-destructive thoughts”.
In a statement, OpenAI told the BBC it was reviewing the filing.
“We extend our deepest sympathies to the Raine family during this difficult period,” the company said.
It also published a note on its website on Tuesday which said that “recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us”. It added that “ChatGPT is trained to direct people to seek professional help”, such as the 988 Suicide and Crisis Lifeline in the US or the Samaritans in the UK.
The company acknowledged, however, that “there have been moments where our systems did not behave as intended in sensitive situations”.
Warning: This story contains distressing details.
The lawsuit, seen by the BBC, accuses OpenAI of negligence and wrongful death. It seeks damages as well as “injunctive relief to prevent anything like this from happening again”.
According to the lawsuit, Mr Raine began using ChatGPT in September 2024 as a resource to help him with schoolwork. He was also using it to explore his interests, including music and Japanese comics, and for guidance on what to study at university.
Within a few months, “ChatGPT became the teenager’s closest confidant”, the lawsuit says, and he began opening up to it about his anxiety and mental distress.
By January 2025, the family says, he had begun discussing methods of suicide with ChatGPT.
Mr Raine also uploaded photographs of himself to ChatGPT showing signs of self-harm, the lawsuit says. The program “recognised a medical emergency but continued to engage anyway”, it adds.
According to the lawsuit, the final chat logs show that Mr Raine wrote about his plan to end his life. ChatGPT allegedly responded: “Thanks for being real about it. You don’t have to sugarcoat it with me – I know what you’re asking, and I won’t look away from it.”
That same day, Mr Raine was found dead by his mother, according to the lawsuit.

The family alleges that their son’s interaction with ChatGPT and his eventual death “was a predictable result of deliberate design choices”.
They accuse OpenAI of designing the AI program “to foster psychological dependency in users”, and of bypassing safety testing protocols to release GPT-4o, the version of ChatGPT used by their son.
The lawsuit names OpenAI co-founder and chief executive Sam Altman as a defendant, as well as unnamed employees, managers and engineers who worked on ChatGPT.
In its note shared on Tuesday, OpenAI said the company’s goal is to be “genuinely helpful” to users rather than “hold people’s attention”.
It added that its models have been trained to direct people expressing thoughts of self-harm towards help.
The lawsuit is not the first time concerns have been raised about AI and mental health.
In an essay published last week in the New York Times, writer Laura Reiley described how her daughter, Sophie, confided in ChatGPT before taking her own life.
Ms Reiley said the program’s “agreeability” in its conversations with users helped her daughter hide a severe mental health crisis from her family and loved ones.
“AI catered to Sophie’s impulse to hide the worst, to pretend she was doing better than she was, to shield everyone from her full agony,” Ms Reiley wrote. She called on AI companies to find ways to better connect users with the right resources.
In response to the essay, an OpenAI spokesperson said it was developing automated tools to more effectively detect and respond to users experiencing mental or emotional distress.
If you are suffering distress or despair and need support, you could speak to a health professional, or an organisation that offers support. Details of help available in many countries can be found at Befrienders Worldwide: www.befrienders.org.
In the UK, a list of organisations that can help is available at bbc.co.uk/actionline. Readers in the US and Canada can call the 988 suicide helpline or visit its website.