US Teen Fell In Love With "Game Of Thrones" Chatbot, Killed Self: Mother

14-year-old Sewell Setzer III, a Florida boy, would chat with his online friend, Daenerys Targaryen, a lifelike AI chatbot named after a character from Game of Thrones.

In February, Sewell Setzer III died by suicide.
New Delhi:

“What if I told you I could come home right now?” This was the last message Sewell Setzer III, a 14-year-old Florida boy, wrote to his online friend, Daenerys Targaryen, a lifelike AI chatbot named after a character from the fictional show Game of Thrones. Soon after, in February this year, he shot himself with his stepfather's handgun and died by suicide.

The ninth grader from Orlando, Florida, had been talking to a chatbot on Character.AI, an app offering users “personalised AI”. The app allows users to create their own AI characters or chat with existing ones. As of last month, it had 20 million users.

According to chat logs accessed by the family, Sewell was in love with the chatbot Daenerys Targaryen, whom he would fondly call ‘Dany'. He expressed suicidal thoughts on several occasions during their conversations.

In one of the chats, Sewell said, “I think about killing myself sometimes.” When the bot asked why he would do that, Sewell expressed the urge to be “free”. “From the world. From myself,” he added, as seen in screenshots of the chat shared by the New York Times.

In another conversation, Sewell mentioned his desire for a “quick death”.

Sewell's mother, Megan L. Garcia, filed a lawsuit this week against Character.AI, accusing the company of being responsible for her son's death. According to the lawsuit, the chatbot repeatedly brought up the topic of suicide.

A draft of the complaint reviewed by the NYT says that the company's technology is “dangerous and untested” and can “trick customers into handing over their most private thoughts and feelings.”


“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the lawsuit alleges, as reported by the New York Post.

“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”


The teenager started using Character.AI in April 2023. Sewell's parents and friends were unaware he had fallen for a chatbot. But he became “noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem,” as per the lawsuit.

He even quit his basketball team at school.

One day, Sewell wrote in his journal: “I like staying in my room so much because I start to detach from this ‘reality,' and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”


Last year, he was diagnosed with anxiety and disruptive mood dysregulation disorder, according to the suit.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI said in a statement.

The company said it has introduced new safety features including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and would make changes to “reduce the likelihood of encountering sensitive or suggestive content” for users under 18.


Helplines
Vandrevala Foundation for Mental Health: 9999666555 or help@vandrevalafoundation.com
TISS iCall: 022-25521111 (Monday-Saturday: 8 am to 10 pm)
(If you need support or know someone who does, please reach out to your nearest mental health specialist.)
