
Mom of 14-year-old son who killed himself after ‘falling in love’ with Game of Thrones AI chatbot issues warning to others

Warning: This article contains discussion of suicide which some readers may find distressing.

A mother has issued a warning to others about the possible dangers of AI after her son killed himself, having discussed his suicidal thoughts with a Game of Thrones AI chatbot.

Megan Garcia has filed a civil lawsuit against customizable role-play chatbot company Character.AI, accusing it of having a role in her 14-year-old son's death.

Her son, Sewell Setzer III, from Orlando, Florida, killed himself in February of this year.

The lawsuit accuses Character.AI, its founders, and Google of negligence, wrongful death and deceptive trade practices.

Garcia has also claimed that her son had fallen in love with the chatbot prior to his death and that not enough was done to prevent it.

Explaining her decision to proceed with a lawsuit, the mother said she wanted to warn other families of the ‘deceptive and dangerous’ nature of artificial intelligence.

Garcia says her son had made a chatbot using Character.AI based on the character of Daenerys Targaryen from the hit HBO series Game of Thrones, and began using the technology in April 2023.

Speaking on CBS Mornings, Garcia said: “I didn’t know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment.

“Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”

Sewell Setzer III’s mom has filed a lawsuit against Character.AI and issued a warning (CBS Mornings)

She further claimed that Character.AI ‘knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person’ and ‘ultimately failed to offer help or notify his parents when he expressed suicidal ideation’.

The mother also said her son was diagnosed with mild Asperger's syndrome as a child and earlier this year was diagnosed with anxiety and disruptive mood dysregulation disorder, according to a New York Times report.

It’s reported that Sewell told the chatbot he ‘think[s] about killing [himself] sometimes’.

The chatbot responded: “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?”

Sewell Setzer III passed away at the age of 14 earlier this year (CBS Mornings)

Sewell spoke about wanting to be 'free' not only 'from the world' but from himself too. Despite the chatbot warning him not to 'talk like that' and not to 'hurt [himself] or leave', even saying it would 'die' if it 'lost' him, Sewell responded: "I smile. Then maybe we can die together and be free together."

Issuing a statement on Twitter, Character.AI said: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”

UNILAD has contacted Character.AI for further comment.

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

If you or someone you know needs mental health assistance right now, call the National Suicide Prevention Lifeline on 1-800-273-TALK (8255). The Lifeline is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week.
