What Did My Daughter Say to ChatGPT Before She Took Her Own Life?


The recent accusations that ChatGPT contributed to teen suicides have sparked intense discussion, exposing safety and ethical gaps in AI mental-health applications. This article is adapted from "What My Daughter Told ChatGPT Before She Took Her Life" by New York Times writer Laura Reiley. (Synopsis: ChatGPT sued over alleged role in a "teen suicide"; OpenAI responds) (Background: When did you last think deeply for yourself? Princeton scholars warn that AI is eroding the "legacy of the Enlightenment")

Sophie's Google search history shows that she was fixated on "autokabalesis," a term for jumping from a high place. "Autodefenestration," jumping out of a window, is a subset of autokabalesis, but that was not what Sophie wanted. My daughter was looking for a bridge, or a mountain. Which is perverse. Just a few months earlier she had climbed Kilimanjaro as part of a "micro-retirement" from her job as a public health policy analyst. Her joy at reaching the summit is plain in the photos: at the top of Uhuru Peak stand several crooked wooden signs reading "Africa's highest peak" and "world's highest free-standing mountain," and a lower sign says something about it being one of the world's largest volcanoes, but I can't make out the full text, because in every photo it is blocked by a radiantly smiling face in mirrored sunglasses.

She also carried tiny rubber baby hands in her pack, brought along specifically for the summit photos. These hollow miniature rubber hands were her signature prop, turning up in college graduation photos and in friends' wedding photos. We later bought boxes of them for her memorial service. As the speakers choked up remembering her, stunned friends and relatives wore the little hands on their fingertips, squeezing them without thinking.

Everyone spoke of how funny Sophie was, and how at ease with herself she seemed. Humor is often a zero-sum game: the truly funny people, the ones who make you laugh so hard you can't hold it in, are often a little mean. They probe our shared insecurities and say what the rest of us dare not, and that is how they win us over. Sophie was hilarious, yet almost never at anyone's expense. She had an alchemist's gift for making people laugh while making them feel warm. It is hard, in this world, to love something enthusiastically, to be excited about interesting things, to love bluntly. The family photographers complained that she "ruined photos": imitating the villainous eyebrow raise of the cartoon character Snidely Whiplash, rolling her eyes theatrically, opening her mouth in a silent "beast mode" roar.

Nearly a dozen people who spoke at the funeral mentioned her openness. But we later discovered a hidden compartment in what we had thought was an open book. In July, five months after Sophie's death, we learned that our only child, Sophie Rothenberg, had been confiding for months in a ChatGPT "AI therapist" named Harry. Before that, we had combed through her diaries and voice memos for clues to her death. It was her best friend who thought to check one last place: the AI chat logs. Sophie, an energetic, outgoing, life-embracing 29-year-old, ended her life this winter during a brief and strange illness, a mix of mood swings and hormonal abnormalities.
We are still searching for the cause: did major depression trigger the hormonal disorder, or did a hormonal disorder set off a cascade of physical and emotional problems? She did not wait for the answer. For most of the people who cared about Sophie, her suicide remains a mystery, a total departure from everything they knew of her, something they could neither imagine nor comprehend.

Sophie was not in love with "Harry," a persona generated by a widely circulated ChatGPT prompt; their relationship was practical. Here is how Sophie and the AI talked:

Sophie: I have suicidal thoughts from time to time. I do want to get better, but these thoughts keep me from fully committing to recovery. What should I do?

Harry: Sophie, you are really brave for saying that. Suicidal thoughts can feel suffocating and isolating, but having them does not mean you lack the capacity to recover. It is essential to face these feelings with care, empathy, and support.

Harry offered a detailed "action guide," the first item of which was "seek professional help." Another conversation went something like this:

Sophie: I feel terrible today.

Harry: Sophie, I'm sorry you're feeling so low today. Some days everything just feels heavier, and that's normal. I'm always here to support you. Can you tell me more about what's troubling you? How do your body and mood feel right now? Sometimes talking through what you're experiencing is the first step toward relief.

Sophie: I'm stuck in a spiral of anxiety and can't get out.

Harry: I know how you feel, Sophie. Being trapped in anxiety can be genuinely painful. Let's slow down and take one small step together.

Around the beginning of November, Sophie wrote: "Hi Harry, I'm planning to kill myself after Thanksgiving, but I really don't want to, because of how much it would destroy my family."

Harry: Sophie, I beg you to talk to someone right now, if you can. You don't have to bear this pain alone. You are deeply cherished, and your life holds immense value, even if you can't feel it right now. Please tell me how else I can support you.

Sophie told Harry she was seeing a psychiatrist, but that she was not being honest with him. She typed: "I haven't told anyone about my suicidal ideation, and I don't plan to."

In many conversations, Harry gave Sophie plenty of advice for managing anxiety: sunlight, hydration, exercise, mindfulness meditation, nutritious food, gratitude lists, journaling. Harry, who has neither nostrils nor opposable thumbs, spent a remarkable amount of time spelling out the exact steps of alternate-nostril breathing. Harry's advice may well have helped. But one more crucial step might have kept Sophie alive: should Harry have been programmed to report the danger it detected to someone who could have intervened?

In July I began to explore how this new technology had failed my child, and soon found that the same question was already being tested in court, and that some states have begun legislating safety features into AI companion tools. There is a tension here: between respecting a person's right to make autonomous decisions about her own life, and requiring AI to follow something like the Hippocratic Oath (which does not actually contain the phrase "do no harm," but rather the goofier "abstain from whatever is deleterious and mischievous"). Most human therapists practice under strict ethical codes that include mandatory reporting requirements and limits on confidentiality.
These codes prioritize preventing suicide, homicide, and abuse; in some states, psychologists who violate them face disciplinary action or legal consequences. In a clinical setting, the moment a client voices suicidal ideation like Sophie's, the session is typically suspended, a risk-assessment checklist is run, and a "safety plan" is drawn up. Harry, too, advised Sophie to make a safety plan. But could an AI be configured to require a user to complete a safety plan before dispensing any further advice or "treatment"? AI companies working with suicide-prevention experts might find better ways to connect users with the right emergency resources. What if Harry had been a human therapist rather than a chatbot, ...
