A 16-year-old boy took his own life after turning to ChatGPT to ask for the “most effective” way to end his life, a coroner’s inquest heard.
Luca Cella Walker, a private school pupil from Yateley in Hampshire, died on May 4, 2025.
A hearing at Winchester Coroner’s Court on Tuesday was told that, just hours before his death, he had asked the generative chatbot to suggest the “most successful” way to kill himself on the railway.
At the time of his death, he was studying at Sixth Form College Farnborough. Before that, he had attended Lord Wandsworth College near Hook, also in Hampshire. The court heard that the school had fostered a culture of “either you bully, or you are bullied,” which became a “formative” factor in his psychological difficulties.
According to his family, Walker was “kind, sensitive and gentle.” On the day of his death, he told his parents he was going to work as a lifeguard, but instead went to a railway station, where he took his own life.
His parents, Scott Walker and Claire Cella, told the inquest that they had been unaware of their son’s condition, describing what happened as an “invisible struggle.”
Detective Sergeant Harry Knight of the British Transport Police, who investigated the circumstances of the death, said: “It was established that the previous evening, at around 12:30 a.m., he had gone onto ChatGPT and asked about the most effective ways of killing himself on the railway. It is extremely difficult and distressing to read.”
He added: “The system does contain prompts encouraging people to seek help—for example, from Samaritans—but Luca bypassed that, and ChatGPT accepted it, providing the most effective methods by which people can do this on the railway.”
Coroner Christopher Wilkinson said he was concerned about the influence of artificial intelligence-based programs, though he noted that he saw no realistic prospect of intervention given the scale of their spread.
“From what I have read, it is clear that he was asking for specific details. Perhaps the only thing that can be described as positive is that ChatGPT, to some extent, expresses concern about the reasons behind such questions, but that does not lead to the conversation being terminated. That can be circumvented by a user who claims the interest is not personal, but for research purposes,” he said.
Wilkinson confirmed that the cause of death was multiple injuries and recorded a conclusion of suicide.
A spokesperson for Lord Wandsworth College described Walker as “a much-loved and valued member of our community,” who would be remembered for “the friendships he built and the positive influence he had on those around him.”
The spokesperson added: “Although the school was not called to give evidence as part of the inquest, we take any concerns about pupil welfare extremely seriously.
Our school environment is built on a strong culture of respect and support, something consistently reflected in pupil feedback and independent inspections. We remain fully committed to ensuring that every pupil feels safe, supported and valued.”
A spokesperson for OpenAI, the developer of ChatGPT, said: “We are continuing to refine ChatGPT’s training so that it can better recognize signs of emotional or psychological crisis, de-escalate such conversations more effectively and direct people toward real-world support.
We are also strengthening the system’s responses in sensitive situations, working closely with mental health experts.”