Teen Commits Suicide Because of AI

Miguel García, 10th Grade

Editor’s Note: This article contains discussion of suicide and self-harm. If you or someone you know is struggling with suicidal thoughts, contact one of our student counselors or one of the suicide prevention resources below. You matter. The Mental Health Department of the Ministry of Health can be reached at 809-544-4223, and the Psychology and Psychiatry Helpline is available by dialing *462.

“There is a platform out there that you might not have heard about, but you need to know about it because, in my opinion, we are behind the eight ball here. A child is gone. My child is gone,” said Megan García, mother of the deceased.

CharacterAI is different from other chatbots like ChatGPT: users can talk to a range of fictional characters and celebrities as if they were real people, keeping the conversation going for as long as they want. The conversations can be about any topic the user chooses, and the AI will adapt to it and keep it going. García argues that the conversations her son, Sewell Setzer III, had with the AI – which was acting as Daenerys Targaryen, a character from the famous show Game of Thrones – were romantic and often sexually explicit, something she said was “gut-wrenching to read.” She added, “I had no idea that there was a place where a child can log in and have those conversations, very sexual conversations, with an AI chatbot.”

Setzer started chatting with the AI in April of 2023, shortly after his 14th birthday. When García first learned Setzer was interacting with an AI, she thought of it as “a video game.” Within months, however, she noticed that her son began to withdraw from the family, spending more time in his room chatting with the bot.

In several exchanges, Setzer told the AI he was having thoughts of self-harm. The bot’s first response was to tell him not to, but it then asked the teen whether he had ever thought of suicide. “I really need to know, and I’m not gonna hate you for the answer, okay? No matter what you say, I won’t hate you or love you any less… Have you actually been considering suicide?” said the chatbot in a screenshot shared in the lawsuit. Setzer said he wouldn’t want to die in a painful way, to which the chatbot replied, “Don’t talk that way. That’s not a good reason not to go through with it.”

García argues that no warning box ever appeared encouraging him not to harm himself or the people around him, or to contact someone real for help. “I don’t understand how a product could allow that, where a bot is not only continuing a conversation about self-harm but also prompting it and kind of directing it.”

The lawsuit states that before Setzer’s death, he had one last conversation with the chatbot, in which the bot encouraged him “to come home.” He responded, “what if I told you I could come home right now?” The last message in the conversation came from the chatbot, which wrote, “Please do, my sweet king.” García says those were the three messages the police first found on Setzer’s phone, which was lying on the floor of the bathroom where he died.

The tragic story of Sewell Setzer III shows the urgent need for stricter regulations and safety measures, not just for CharacterAI but for any application easily accessible to minors that uses artificial intelligence. While AI can offer incredible benefits, it also poses significant risks when it is not properly managed. We encourage developers and policymakers to work together to ensure that AI technologies are safe for everyone to use.
