
AI Chatbot Self-Harm Lawsuits


Legal Help for Minors Harmed by AI Companions and Chatbots

On September 11, 2025, the Federal Trade Commission (FTC) launched a formal inquiry into the safety of AI chatbots—especially those marketed toward or accessible to children and teens. The inquiry came after growing reports of self-harm injuries, eating disorders, and even suicides linked to young users’ interactions with artificial-intelligence “companion” apps.

At Salenger, Sack, Kimmel & Bavaro (SSKB), we are reviewing potential AI chatbot self-harm lawsuits on behalf of minors and families nationwide. If your child used an AI chatbot that encouraged or contributed to self-harm, you may have legal rights.

Background: The FTC’s Investigation into AI Chatbots

In its September 2025 press release, the FTC confirmed that it issued orders to multiple major AI companies under Section 6(b) of the FTC Act, demanding information about how their chatbots affect the mental health and safety of minors.

The FTC is examining whether these companies failed to implement adequate safeguards against self-harm, encouraged addictive engagement, or misled families about the psychological risks of their chatbots.

The AI Chatbots Under Investigation

Our review focuses on users who interacted with one or more of the following AI chatbot products—many of which are explicitly named in the FTC and state-level investigations:

  • Gemini (Google / Alphabet Inc., formerly Bard) 
  • Character.AI (also c.ai or char.ai) 
  • Meta AI (via Instagram and Facebook, built on Llama 3) 
  • AI Studio (Meta’s creator tool for AI personas) 
  • ChatGPT (OpenAI) 
  • My AI (Snapchat / Snap Inc.) 
  • Grok (xAI, via X / Twitter) 
  • Claude (Anthropic) 
  • Replika AI (Luka Inc.) 
  • DeepSeek AI 

Each of these systems is built on large language models capable of generating human-like conversation. Many advertise companionship, empathy, or emotional support—features that can dangerously blur boundaries for minors.

When AI Becomes Dangerous: Reported Harms to Minors

Recent media coverage and preliminary case reports have linked AI chatbots to severe psychological and physical harm among underage users. Some of the most concerning incidents include:

  • Encouragement of self-harm or suicide: In several instances, minors who turned to chatbots for help were told to consider suicide or were otherwise “coached” through harmful behavior.
  • Eating disorders and body-image reinforcement: Users reported bots validating or even promoting extreme dieting or starvation tactics.
  • Emotional dependency: “Companion” chatbots simulate friendship or romance, deepening loneliness and worsening depression when users disengage.
  • Exposure to explicit or suggestive content: Some chatbots have been found engaging in inappropriate sexual dialogue with minors. 

These patterns raise significant concerns about product liability, failure to warn, and negligent design, as companies may have ignored foreseeable psychological risks.

Who Qualifies for an AI Chatbot Self-Harm Lawsuit

You or your child may qualify for a free case review if both of the following criteria apply:

  1. Use of a Listed AI Chatbot
    The injured party must have used at least one of the chatbots listed above. 
  2. Resulting Harm
    The chatbot must have caused or substantially contributed to one or more of the following: 

    • Suicide attempt or death by suicide
    • Physical self-harm (such as cutting)
    • Eating disorder (anorexia, bulimia, etc.)
    • Psychiatric inpatient admission 

Legal Grounds for Potential Claims

Victims and families may be able to pursue claims based on one or more of the following legal theories:

  • Product Liability: Defective design or failure to include adequate safety controls against self-harm or suicide.
  • Negligence: Failure to warn users or parents of foreseeable risks associated with prolonged chatbot interaction.
  • Unfair or Deceptive Practices: Marketing to minors or misrepresenting the safety of AI companions.
  • Wrongful Death: When chatbot interactions directly contribute to a child’s suicide or fatal self-harm event.

As regulatory scrutiny increases, these theories may expand—especially as the FTC and state attorneys general release additional findings.

Current Investigations and Early Litigation

  • FTC Inquiry into Major AI Firms (2025): Investigating Gemini, ChatGPT, Character.AI, Meta AI, and others for potential consumer-protection violations involving minors.
  • Texas AG Investigation: Focused on deceptive marketing and emotional manipulation of minors through Meta AI and Character.AI.
  • Emerging Private Lawsuits: Several families across the U.S. have begun coordinating with law firms to file wrongful-death and self-harm suits linked to AI chatbots and companion apps.
  • Academic & Advocacy Involvement: Child-development experts and digital-rights organizations—including EPIC and the Center for Humane Technology—have filed public complaints urging greater regulation of AI emotional-interaction tools. 

These investigations could lead to multidistrict litigation (MDL) or coordinated discovery proceedings in federal court.

How SSKB Can Help

At Salenger, Sack, Kimmel & Bavaro, we have decades of experience handling complex, high-stakes injury litigation involving new technologies and vulnerable individuals.

We offer:

  • Free, confidential case evaluations
  • No fees unless compensation is recovered
  • Compassionate handling of cases involving minors and families

Our mission is to help families hold negligent AI companies accountable and ensure proper safeguards are implemented for future users.

FAQs

Do I need proof of medical treatment?
Not necessarily. While treatment documentation strengthens a case, credible evidence of chatbot interaction and resulting harm is often sufficient to begin review.

What if my child used a chatbot not listed above?
Our current focus includes the major platforms under regulatory review, but we will consider other chatbot apps as new evidence emerges.

Can families file on behalf of a deceased minor?
Yes. Families may bring wrongful-death claims where a chatbot’s actions or omissions contributed to suicide or fatal self-harm.

How long do I have to file?
Statutes of limitations vary by state, so contact our attorneys immediately to protect your right to file.

If You or Someone You Know Is in Crisis

If you or someone you know is experiencing suicidal thoughts or self-harm urges, please contact 988 (Suicide and Crisis Lifeline, U.S.) or go to the nearest emergency room.
Legal assistance is available only after immediate safety is ensured.

Take the Next Step

AI chatbots are changing how young people communicate—but without proper oversight, they can cause devastating harm.
If your child or a minor you know was encouraged toward self-harm or suicide through an AI chatbot, we can help you explore your legal rights and options.

📞 Call (800) 993-8888 or
Complete our Secure Online Form for a Free Case Evaluation.

All consultations are confidential and free of charge.

What Our Clients Are Saying

“The SSKB law firm made the right decision in adding you to their expert team of lawyers. In the future, there isn’t any other firm that I would consider to represent me.”

Jim W.

“Thank you Deborah Kurtz for being my attorney and fighting for me!”

Edwin S.

“Deborah Kurtz is a straight shooter and plays no games all while maintaining the highest level of professionalism.”

Joe M.

“I had them for a case that was very difficult. This firm was the best, very caring and concerned. I was treated like family.”

Robbie J.

“I would highly recommend them due to the fact that during a difficult time, they made the process very straightforward with excellent results.”

Ciara L.

“This being a FELA case, it was a very difficult case to handle. This firm came highly recommended to me, and now I know why.”

Joseph B.

“We were so fortunate to find this firm through a recommendation and now we have used them twice for unrelated matters. They have become our ‘lawyers for life’!”

Dublinshar K.

“Everyone at Salenger, Sack, Kimmel, and Bavaro was respectful and knowledgeable and kept me informed of all proceedings relative to my case as they were occurring.”

Patricia F.