An ‘academically gifted’ teenager asked ChatGPT for advice about how to kill himself before taking his own life the next day, an inquest heard.
Luca Walker, 16, asked the AI chatbot about suicide hours before his death on a train track.
Luca – who had recently graduated from a prestigious private school and was described as ‘gentle and kind’ – was able to easily ‘sidestep’ ChatGPT safeguarding protocols by claiming he was asking about suicide for ‘research’ purposes.
A police officer who investigated his death said the conversation Luca had with ChatGPT was ‘chilling and upsetting reading’ and it was heard he was asking for ‘specifics’.
Keen swimmer and cat-lover Luca from Yateley, Hants, died on May 4, 2025, after telling his parents he was going to his job working as a lifeguard.
His parents, Scott Walker and Claire Cella said they had no idea about his mental health struggles and have since described it as an ‘invisible battle’.
At the time of his death Luca was studying at Sixth Form College Farnborough, Hants, having previously attended the Lord Wandsworth College near Hook, Hants, which charges up to £44,100 a year.
The inquest at Winchester Coroner’s Court heard Luca ‘was surrounded and supported by love’. He had a close group of friends and a loving relationship with his girlfriend.
His father worked in IT and would go on runs of up to 10 kilometres with his son, it was heard.
On the morning of May 4 he told his parents he was going to his job working as a lifeguard.
No one was aware of his plans or how poor his mental state had become, the inquest heard.
He left their house in Yateley at 10am and went to a train station in Hampshire.
Luca’s phone was recovered and investigated by a digital forensics team at British Transport Police.
It was found that he had written 14 messages for his family and friends in his notes app to say ‘farewell’ and ‘I love you’.
Luca told his friends and girlfriend that while attending Lord Wandsworth College, he had joined in on a ‘bully or be bullied’ culture and was ‘ashamed of what he had done to survive’.
Luca had confided in his friends that this culture had deeply affected him.
While at the same school, a friend of his had also died on train tracks, almost exactly two years before Luca.
He had told friends that he had not been properly supported through the ordeal by the college, and the coroner said it was ‘clear these experiences of death had affected him’.
It was also discovered that the teen had been using ChatGPT the night before to plan his suicide.
DS Garry Knight from the British Transport Police told the inquest that digital forensics teams had found he had been using ChatGPT at around 12.30am asking for advice on suicide.
‘It makes quite chilling and upsetting reading,’ he said.
‘It is built in to say you can contact organisations for help such as Samaritans, but Luca had sidestepped that which ChatGPT accepted and gave the most effective ways people can do that on the railway.
‘I suppose it’s not specific to ChatGPT as it could be done on Google or even in the library back in the day. It’s upsetting but a part of the modern world unfortunately.’
Christopher Wilkinson, Senior Coroner for Hampshire, said: ‘It’s clear from what I’ve read that he was asking for specifics.
‘Thankfully perhaps the only good thing is that ChatGPT does seem to be applying an element of worry about why these questions are being asked but it certainly doesn’t stop the conversation.
‘It’s sidestepped by the individual saying he’s not looking for himself but he’s looking for research purposes.
‘It’s certainly a concern I have but not one I can solve today on the growing sphere of AI worldwide.
‘I don’t think that is an unusual concern but that is outside of my influence to change even were I to make a prevention of future deaths report.’
ChatGPT is a generative AI chatbot developed by OpenAI which has been criticised for its lack of safeguarding.
It had been used by another 16-year-old, Adam Raine in California, who in April 2025 took his own life after what his family’s lawyers allege in an ongoing lawsuit was ‘months of encouragement’ by the AI chatbot.
At Luca’s inquest, coroner Mr Wilkinson said he was concerned about the impact of AI chatbots but added that he feels unable to act due to its growing scope.
Mr Wilkinson added: ‘In all respects it appears [Luca] was a kind, sensitive and calm young man.
‘He was relaxed and liked by many people around him. He has a loving and supportive family and friends. He was academically gifted, empathetic, a listener and a friend.
‘It’s clear Luca and his personality could well have been affected by subsequent traumatic events in his life. It’s clear he was growing more concerned in his day to day life and work as a lifeguard.
‘He had a gentle nature. Moreover he had been suffering from a low mood and perhaps undiagnosed depression.
‘There was the issue of bullying in his former school and while that isn’t abnormal it seems that did have an impact.
‘The fact he had been subject to bullying and to survive that he had to become somebody he didn’t want to be was a formative factor in his growing feelings of discontent and depression.
‘He had also been affected by the death of another student at his previous college and he said he had not been properly supported afterwards.’
Mr Wilkinson confirmed that the cause of death was multiple traumatic injuries and said that Luca had died by suicide.
Luca’s girlfriend Grace said in a statement: ‘If you weren’t a bully you were bullied. He spoke of incidents when bullying had occurred and hated that he had not been kinder because he was just trying to avoid bullying himself.’
Luca’s mother Ms Cella told the inquest in a statement: ‘He seemed genuinely happy. He was surrounded and supported by love.
‘He cared about supporting those around him and was proud that people could share their struggles with him.’
In a tribute, Luca’s family said: ‘Luca was a kind, sensitive and calm person. Luca’s home life was very stable, we are a very close family.
‘He lived with me, his dad and younger sister and four cats who he adored. Luca was supported and surrounded by love.
‘We were not aware that Luca was struggling in any way with his own mental health, although he did care about supporting those around him with their challenges.’
A Lord Wandsworth College spokesman said Luca was a ‘very well-liked and valued member of our community’ and described him as ‘an affable, kind young man who participated enthusiastically in all aspects of school life’.
‘He is remembered for the friendships he built and the positive impact he had on those around him, not least through his annual long-distance charity swims, which raised money and awareness for good causes,’ the spokesman said.
‘We were shocked and deeply saddened to hear of his death last year, and we remain in close contact with his family. We will continue to do all we can to support them and the wider community.
‘While the school was not called to give evidence in the inquest proceedings, we take any concerns about student wellbeing extremely seriously.
‘Our school community is built on a strong culture of respect and support, reflected consistently in student feedback and independent inspection. We remain fully committed to ensuring every pupil feels safe, supported and valued’.
An OpenAI spokesman said: ‘This is an incredibly heart-breaking situation and our thoughts are with all those impacted.
‘We have continued to improve ChatGPT’s training to recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support.
‘We have also continued to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.’