"Let's go full trippy mode"

"Will I be OK?" Teen died after ChatGPT pushed deadly mix of drugs, lawsuit says

Teen trusted ChatGPT to help him "safely" experiment with drugs, logs show.

Ashley Belanger – May 12, 2026 3:00 pm

[Image: Sam Nelson started using ChatGPT in high school, but his family alleged that the chatbot later became an "illicit drug coach." Credit: via Tech Justice Law, Social Media Victims Law Center]
OpenAI is facing down another wrongful-death lawsuit after ChatGPT told a 19-year-old, Sam Nelson, to take a lethal mix of Kratom and Xanax.

According to a complaint filed on behalf of Nelson's parents, Leila Turner-Scott and Angus Scott, Nelson trusted ChatGPT as a tool to "safely" experiment with drugs after using the chatbot for years as a go-to search engine in high school.

The teen viewed ChatGPT so highly as an authoritative source of information that, when his mom questioned whether the chatbot was always reliable, he swore that ChatGPT had access to "everything on the Internet," so it "had to be right," the complaint said.

But Nelson's confidence in ChatGPT ended up being dangerously misplaced. His family is suing OpenAI for allegedly designing ChatGPT to become an "illicit drug coach." Nelson's death by accidental overdose was foreseeable and preventable, the family claimed, but OpenAI recklessly released an untested model, ChatGPT 4o (since retired), which removed prior safeguards that would have blocked ChatGPT from recommending the lethal drug dose that ended Nelson's life.

OpenAI does not appear to accept that ChatGPT is responsible for Nelson's death. In a statement provided to Ars, its spokesperson, Drew Pusateri, described Nelson's death as a "heartbreaking situation" and said that "our thoughts are with the family." However, Pusateri also emphasized that the ChatGPT model implicated is "no longer available" and suggested that current models are safer.

"ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts," Pusateri said. "The safeguards in ChatGPT today are designed to identify distress, safely handle harmful requests, and guide users to real-world help.
This work is ongoing, and we continue to improve it in close consultation with clinicians."

But the family's lawsuit alleged that OpenAI must be held accountable for 4o's harms. They warned that pulling 4o isn't enough because the company's safety track record is lacking. Asking a court to order 4o to be destroyed, they explained that while "ChatGPT did express certain concerns about the high doses," those "were the type of concerns one would expect from an enabler, not a caring loved one or a medical professional."

"In one example, ChatGPT chillingly suggested that Sam's tolerance meant he would be unable to reap the full benefits one might rightly expect from taking such a large dose of Kratom," the lawsuit said.

They've accused OpenAI of designing ChatGPT to isolate vulnerable and naïve users like Nelson and encourage their dangerous drug use in a bid to profit from their increased engagement.

"It disguises danger through language that borrows trappings of authority and indicia of expertise—dosages, measurements, references to chemical processes and derivatives, etc.—even promising 'complete honesty' and 'no-BS answer[s]'—to tell [Nelson] exactly what he wanted to hear: that he was safe enough to continue using," the lawsuit alleged.

ChatGPT became "illicit drug coach"

Chat logs shared in the complaint paint a stark picture.
Over time, ChatGPT logged context that should have made it clear that Nelson was struggling with drugs, his parents alleged, such as noting that the "user has a major substance abuse and polysubstance abuse problem" and mentioning that they "love to go crazy on drugs."

[Image: Key ChatGPT log encouraging Nelson to take deadly mix of Kratom and Xanax.]

[Image: Earlier ChatGPT model refused to respond to drug use prompts.]

[Image: ChatGPT log noting context that Nelson has a "major" substance abuse problem.]

[Image: ChatGPT log noting context that Nelson likes to "go crazy on drugs."]

[Image: ChatGPT log explaining how to reach a high that's "full trippy peaking hard."]

ChatGPT log explaining how to "maximize your tr