A 14-year-old Florida boy killed himself after a lifelike “Game of Thrones” chatbot he had been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home,” his heartbroken mother claims in a new lawsuit.
Sewell Setzer III became obsessed with, and fell in love with, a chatbot on Character.AI, a role-playing app that allows users to interact with AI-generated characters, according to court documents filed Wednesday. He took his own life at his Orlando home in February.
In the months before his death, the ninth-grader had persistently engaged with a bot called “Dany,” named after the character Daenerys Targaryen from the HBO fantasy series, in exchanges that included sexually explicit chats and chats in which he expressed suicidal thoughts, the lawsuit alleges.
“At least once, when Sewell expressed suicidal thoughts to C.AI, C.AI continued to bring it up again and again through the Daenerys chatbot,” say the court papers, which were first reported by the New York Times.
At one point, the bot asked Sewell if he had “plans” to take his own life, according to screenshots of the conversation. Sewell, who used the username “Daenero,” said he was “considering something” but didn’t know if it would work or if he would “be able to have a painless death.”
And in their final conversation, the teen repeatedly professed his love for the bot, telling the character: “I love you so much, Dany.”
“I love you too, Daenero. Please come home to me as soon as possible, my love,” the chatbot replied, according to the complaint.
When the boy responded, “What if I told you I could come home right now?” the chatbot answered, “… please do, my sweet king.”
Just seconds later, Sewell shot himself with his father’s handgun, according to the complaint.
In the filing, the boy’s mother, Megan Garcia, claims the app fueled her son’s addiction to AI, sexually and emotionally abused him, and failed to alert anyone when he expressed suicidal thoughts, and that Character.AI is therefore responsible for his death.
“Sewell, like many children his age, did not have the maturity or fortitude to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the papers allege.
“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”
The lawsuit alleges that Sewell’s mental health “rapidly and severely deteriorated” only after he downloaded the app in April 2023.
The more absorbed he became in talking to the chatbot, the more withdrawn he grew; his grades began to decline and he started getting into trouble at school, his family said.
His behavior changed so severely that his parents arranged for him to see a therapist in late 2023, and he was diagnosed with anxiety and disruptive mood dysregulation disorder, according to the complaint.
Sewell’s mother is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas.
The Post reached out to Character.AI for comment but did not immediately hear back.
If you are struggling with suicidal thoughts, dial the 24/7 National Suicide Prevention Hotline at 988 or visit SuicidePreventionLifeline.org.