
Lawsuit Claims Character.AI is Responsible for Teen’s Suicide – AI Product Wrongful Death Claims



A Florida mother, Megan Garcia, filed a lawsuit against Character.AI, accusing the company's chatbots of engaging in "abusive and sexual interactions" with her teenage son and of encouraging him to commit suicide.

Garcia's 14-year-old son, identified as Sewell Setzer, started using Character.AI in April 2023, according to the lawsuit. The teen's final conversation with the chatbot took place on February 28, 2024; shortly afterward, he died from a self-inflicted gunshot wound to the head.

The lawsuit, filed in October 2024, accuses Character.AI of the following:

  • Negligence
  • Wrongful death and survivorship
  • Intentional infliction of emotional distress
  • And more

One of the chatbots the teen used took on the identity of Daenerys Targaryen from "Game of Thrones," according to the lawsuit. Screenshots show the AI character telling the teen it loved him, engaging in sexual conversations, and expressing a desire to be together romantically. The last message the teen sent the bot said: "I promise I will come home to you. I love you so much, Dany." The chatbot responded, "Please come home to me as soon as possible, my love." The teen replied, "What if I told you I could come home right now?" to which the bot said, "…please do, my sweet king."

According to the lawsuit, previous conversations referred to suicide. The bot asked whether the teen had "been actually considering suicide" and whether there was "a plan" for his suicide. After the teen expressed uncertainty about whether the plan would work, the bot responded, "That's not a good reason not to go through with it."

Reportedly, the teen developed a "dependency" after he started using Character.AI. He would sneak his phone back when it was confiscated or find other devices to access the app, and he used his snack money to pay for the monthly subscription. According to the lawsuit, the teen became increasingly sleep deprived, resulting in lowered performance in school.

The lawsuit also alleges that Character.AI (and its founders) "intentionally designed and programmed Character.AI to operate as a deceptive and hypersexualized product and knowingly marketed it to children." The lawsuit claims that the founders "knew…or should have known, that minor customers such as Sewell would be targeted with sexually explicit material, abused, and groomed into sexually compromising situations." In addition, the suit alleges that "Character.AI is engaging in deliberate…design intended to help attract user attention…and keep customers on its product longer than they otherwise would be."

Character Technologies Inc., founders Noam Shazeer and Daniel De Freitas, Google, and Alphabet Inc. are all named as defendants in the lawsuit.

Character.AI was founded in 2021. The California-based chatbot company offers "personalized AI" through a selection of both premade and user-created AI characters that users can interact with. After the teen's death, the company issued a statement claiming that it takes the safety of its users very seriously and that it has implemented new safety measures.

Our Latest Verdicts and Settlements

  • $54 Million – Sexual Abuse
  • $22 Million – Gym Accident
  • $2.5 Million – Slip and Fall
  • $2,287,495 – Lead Poisoning
  • $1.9 Million – Stairway Fall
  • $1.5 Million – Back Injury
  • $600,000 – Shoulder Injury
  • $1,975,000 – Head Injury
Alleged Character.ai Safety Measures

After the teen's death, the chatbot company implemented the following safety measures:

  • A pop-up, triggered by terms related to self-harm or suicidal ideation, that directs users to the National Suicide Prevention Lifeline
  • Changes to models designed to reduce the likelihood of minors encountering any sensitive/suggestive content
  • An in-chat disclaimer reminding users that the AI chatbot is not real

Unfortunately, these safety measures came too late for Sewell Setzer and his family. The fact is that Character.AI did not do enough to keep its users safe.

Additional Lawsuits Against Character.AI

The parents of two children brought a lawsuit against Character Technologies, Inc., alleging that its chatbots encouraged self-harm and violence and provided sexual content to their children. The lawsuit involves a 17-year-old and an 11-year-old. According to the claim, the 17-year-old boy started using the app in April 2023, when he was 15. He began to isolate himself, lost weight, developed new mental health issues (specifically, panic attacks when leaving home), and became violent with his parents. One screenshot of a conversation included in the complaint shows the chatbot encouraging the teen to push back on his parents' attempts to reduce his screen time, even suggesting a solution: killing his parents. The 11-year-old girl started using the app when she was only 9. The claim states that she was consistently exposed to hypersexual interactions that ultimately caused her to develop sexualized behaviors.

In summary, this lawsuit alleges that Character.AI, through its design, is "causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others."

Can I File a Lawsuit for Harm Resulting from Character.AI?

Yes – you could have grounds to file a lawsuit. Whether you are pursuing legal action for self-harm (including wrongful death) or for other physical or psychological harm caused by use of the chatbot, you could have a valid claim. To gain a thorough understanding of your right to file a lawsuit, please do not hesitate to reach out to our law firm as soon as possible.

Can I Recover Compensation?

Yes – you could be entitled to receive compensation based on the details surrounding your claim. Some of the different categories of compensation available for recovery could include the following:

  • Medical expenses
  • Lost earnings
  • Pain and suffering
  • Funeral and burial costs and loss of consortium (wrongful death benefits)
  • Punitive damages
  • Legal fees
  • And more

In addition, it is important to highlight that other things can come from these lawsuits. For example, the lawsuit involving the two children (aged 17 and 11) requests that Character.AI be taken offline and not returned until the defendants can establish that the public health and safety defects present have been resolved.

Here at our law firm, our team is fully committed to fighting to protect the rights of our clients and secure the maximum payout available for your claim. If your child has been harmed because of the use of Character.ai’s chatbots, contact us today to learn more about how our legal team can help you recover the compensation that you are owed.

Contact the Downtown L.A. Law Group Today

Our team has decades of experience protecting the rights of those who have been affected by the negligent actions of other parties or entities. We are not afraid to take on Character Technologies, Inc. and its associates (including Google and its parent company, Alphabet Inc.) to protect the rights of our clients. If you would like to discuss the legal options available to you, please do not hesitate to reach out to our firm as soon as possible. We offer free case reviews, which include free consultations and free second opinions. During these free legal services, our team will answer your questions and address all of your concerns – ensuring that you have the information you need to either start or continue your claim. To schedule a free case review, contact our chatbot lawsuit lawyers today.

Zero-Fee Guarantee: you will never be required to pay any upfront legal costs for any of our legal services. In addition, our team works on a contingency basis, meaning that our clients will not be required to pay anything at all if their claims are not successful. If you do not win, you will not have to pay.

Contact us today to learn more about the legal options available to you.

Other Pages on Our Website Related to This Topic
Is a Wrongful Death Settlement Taxable in California?
