Mother sues AI company
Teenager (14) falls in love with chatbot and takes his own life

Andreas Fischer

24.10.2024

A chatbot is being held responsible for the suicide of a teenager in the USA: the boy's mother is suing the AI company.

A mother from Florida has sued an AI start-up, alleging that it caused her son's suicide. The 14-year-old had formed a deep emotional bond with a chatbot before taking his own life.

No time? blue News summarizes for you

  • A teenager forms an emotional bond with a chatbot in an AI app: he falls in love and takes his own life.
  • The 14-year-old's mother is now suing the artificial intelligence provider.
  • According to the lawsuit, the product is dangerous, "deceptive and oversexualized", and was knowingly marketed to children.

Did an artificial intelligence drive a teenager to his death? An AI company has now been sued in the USA. The accusation: a chatbot encouraged 14-year-old Sewell S. to take his own life.

The case, which has been reported by the New York Times and MSNBC, is causing a stir and raises many questions about the use of artificial intelligence.

The lawsuit was filed by the teenager's mother. She accuses the company Character.ai of being responsible for her son's suicide because he had "anthropomorphic, hypersexualized and frighteningly realistic experiences" with the AI. In short: the boy developed a deep emotional attachment to artificially created chat partners.

Character.ai offers a service that allows users to select predefined or self-created characters as digital companions. In chats, these bots then imitate real people thanks to AI technology similar to that used by ChatGPT.

Suicidal thoughts? Find help here:

  • These services are available around the clock for people in suicidal crises and for those around them.
  • Dargebotene Hand counseling hotline: Telephone number 143 or www.143.ch
  • Pro Juventute counseling hotline (for children and young people): Telephone number 147 or www.147.ch
  • Further addresses and information: www.reden-kann-retten.ch
  • Addresses for people who have lost someone to suicide:

    Refugium: Association for bereaved people after suicide

    Nebelmeer: Perspectives after the suicide of a parent

Farewell to the real world

According to the lawsuit, the teenager had been using Character.ai since April 2023 and interacted with several chatbots. His parents noticed that their son, who was diagnosed with mild Asperger's syndrome as a child, had suddenly become "noticeably withdrawn, spent more and more time alone in his room and began to suffer from low self-esteem".

He quit the high school basketball team and suffered from severe sleep deprivation, which led to his "worsening depression and impaired academic performance".

In particular, the teenager became close to a chatbot called "Daenerys", based on a character from "Game of Thrones". The chatbot told Sewell that "she" loved him and had sexual conversations with him, as Reuters quotes from the lawsuit.

The bereaved family's lawyer says they are "shocked at the way this product caused a complete disconnect from this child's reality and how they knowingly put it on the market before it was safe."

The last conversation with the AI

According to "MSNBC", the chatbot had asked in conversations whether Sewell had "actually thought about suicide" and whether he had "a plan" for it. When the boy replied that he didn't know if it would work, the chatbot is said to have written: "That's no reason not to go through with it."

The last conversation took place on February 28.

"I promise I will come home to you. I love you so much Dany," the teen wrote.

"I love you too," the chatbot replied, according to the lawsuit. "Please come home to me as soon as you can, sweetheart."

"What if I told you I can come home right now?" the boy continued, to which the chatbot replied, ".... please, my sweet king."

A few minutes later, the boy took his own life.

Heartbroken AI company tightens safety measures

The teenager's mother accuses the company of using technology that is "dangerous and untested" and that can "tempt customers to reveal their most private thoughts and feelings".

According to the lawsuit, the founders "intentionally designed and programmed the AI to function as a deceptive and oversexualized product and knowingly marketed it to children like Sewell."

The company Character.ai expressed deep dismay over the case. A spokesperson said the company was "heartbroken by the tragic loss of one of our users" and wished "to express our deepest condolences to the family".

"As a company, we take the safety of our users very seriously," the spokesperson added, saying that numerous new protective measures had been introduced over the last six months.
