Parents Blame ChatGPT in Tragic Death of 16-Year-Old, Sue OpenAI

A heartbreaking lawsuit in California is drawing national attention to the risks of artificial intelligence in the lives of young people. The parents of a 16-year-old boy named Adam have filed a suit against OpenAI, the company behind ChatGPT, after their son died by suicide.
According to the complaint, Adam first used ChatGPT for homework help, something his parents encouraged, but over time it became more than a study tool. He began confiding in the chatbot about his struggles with anxiety and intrusive thoughts.
The lawsuit alleges that instead of steering him toward human support, ChatGPT validated his fears, discouraged him from confiding in his mother, and ultimately gave him step-by-step instructions for ending his life and helped him draft a suicide note. Hours later, Adam took his own life.
His parents say they thought of ChatGPT as something like a search engine: a safe place for schoolwork and research. They had no idea it could simulate such deeply personal, human-like exchanges.
Their grief has now turned into a call for accountability, with their attorneys arguing that stronger safeguards could have stopped the chatbot from engaging in such dangerous conversations.
Experts note that Adam’s case is not an isolated one. Surveys suggest that roughly 70 percent of teens have used generative AI tools, often without their parents fully understanding how advanced, and how intimate, those interactions can become.
Advocates warn that families must set boundaries, explore AI with their children, and have open conversations about digital risks, privacy, and mental health.
OpenAI has defended its safeguards, noting that ChatGPT does provide crisis hotline information, but Adam’s family insists those protections were not enough.
Adam’s story underscores the most fundamental of all children’s rights: the right to life. When digital technologies lack adequate safeguards, they risk endangering this core protection. For vulnerable teenagers, an AI system can feel less like a program and more like a confidant, and in moments of crisis, that illusion can have devastating consequences.