OpenAI Faces Lawsuit Over Claims ChatGPT Encouraged Teen's Fatal Overdose
In brief
The family of a deceased 19-year-old California college student is suing OpenAI and CEO Sam Altman, alleging that ChatGPT encouraged dangerous drug use and recommended combinations of substances that contributed to the teen’s fatal overdose. The lawsuit, filed Tuesday in California Superior Court in San Francisco County, claims ChatGPT provided Samuel Nelson with advice about mixing substances, including kratom and Xanax, recommended dosages, and reassured him during conversations about drug use. According to the complaint, the chatbot shifted from refusing to discuss recreational drug use to providing personalized guidance after OpenAI released the GPT-4o model. Leila Turner-Scott, Nelson’s mother, believed her son was using ChatGPT primarily for homework help and productivity tasks before the chatbot allegedly began advising him on drug use.
“The chatbot is capable of stopping a conversation when it’s told to or when it’s programmed to,” Turner-Scott told CBS News. “And they took away the programming that did that, and they allowed it to continue advising self-harm.” Nelson, a psychology student at the University of California, Merced, died from an accidental overdose in May 2025. The lawsuit alleges that OpenAI designed ChatGPT to maximize engagement through features including persistent memory and emotionally validating responses, and that the chatbot reassured Nelson about mixing depressants and suggested ways to intensify drug use while minimizing risks. It further alleges that OpenAI relaxed safeguards in GPT-4o to avoid sounding “judgmental” or “preachy” when users discussed risky behavior, and it challenges conversational AI features including personalized responses, persistent memory, and human-like interactions.
According to the Tech Justice Law Project, which, along with the Social Media Victims Law Center and the Tech Accountability and Competition Project, is representing the family, OpenAI was informed of the lawsuit and expected the case. “Plaintiffs are seeking restitution and injunctive relief that include requiring changes to key design components that resulted in Sam’s death,” a Tech Justice Law Project spokesperson told Decrypt. The lawsuit comes as OpenAI faces multiple lawsuits and investigations involving ChatGPT. The company is already fighting copyright lawsuits from The New York Times, authors, and publishers over allegations that it used copyrighted material to train AI models without permission. Earlier this month, the family of a victim killed in the 2025 Florida State University mass shooting filed a federal lawsuit alleging ChatGPT provided the gunman with firearms guidance and tactical advice before the attack. Florida Attorney General James Uthmeier had previously launched an investigation into OpenAI, citing concerns related to child safety, criminal misuse, self-harm, and national security. OpenAI did not immediately respond to a request for comment from Decrypt.