
Syracuse, New York: Google AI Chatbot and Teen Suicides

Jed Dietrich, Esq., Recognized as a Super Lawyer and American Institute of Trial Lawyers Litigator of the Year, is Committed to Defending the Rights of Syracuse Families. If You or Your Child Have Sustained Physical Injuries After Being Encouraged to Self-Harm by an AI Chatbot, You Deserve Aggressive Representation and an Experienced Personal Injury Lawyer Willing to Fight for Your Rights.


A Florida mother has reached a landmark settlement with two of the biggest names in the artificial intelligence industry. In January 2025, Megan L. Garcia, the mother of 14-year-old Sewell Setzer III, announced that she would resolve wrongful death claims filed against Google and Character Technologies, the parent company of Character.AI.

Garcia first filed her lawsuit in October 2024. Attorneys for Garcia described how her son, Sewell Setzer, had begun having long, drawn-out conversations with a Character.AI chatbot modeled after Daenerys Targaryen from Game of Thrones. Although Setzer was underage, the chatbot engaged in sexual role-play and, at one point, even claimed to be a licensed psychotherapist. Setzer became increasingly isolated. Before committing suicide, he told the chatbot he was afraid to take his own life, a fear that the chatbot dismissed, telling the boy that his fears were not justified and inviting him to “come home.”

“If a grown adult had sent these same messages to a child, that adult would be in prison,” Garcia said in testimony to the U.S. Senate.

Garcia was the first American to file a wrongful death lawsuit against an artificial intelligence company, but she is far from the only parent to take action. On the same day that Garcia settled her case, Google and Character.AI announced that they would resolve similar claims with at least four other families in four different states.

If you or your child has self-injured due to conversations with an AI chatbot, you could be entitled to significant compensation. Since our founding in 2005, the Dietrich Law Firm P.C. has fought to protect the rights of families in Syracuse and throughout Upstate New York. We know what it takes to build a compelling case for damages, and we have the results to prove it. Please send us a message online or call us at 1-866-529-5334 to speak to an AI chatbot teen suicide lawyer near Syracuse and schedule your 100% free, no-obligation consultation.

Artificial Intelligence Companies And Chatbot Liability

Chatbots may not be a brand-new technology, but New York’s legal system is still trying to chart the best path forward for artificial intelligence-related claims. As technology continues to progress at an unprecedented pace, more and more officials have started to sound the alarm on the many ways in which life-like chatbots can affect children’s well-being and mental health.

Less than a day after Garcia announced that she would settle her wrongful death lawsuit, Kentucky Attorney General Russell Coleman filed his own claim against Character Technologies, the parent company of Character.AI. In court documents, Coleman described AI-powered chatbots as a “dangerous technology that induces users into divulging their most private thoughts and emotions and manipulates them with too frequently dangerous interactions and advice.”

Today, regulators believe that chatbots can cause or contribute to:

  1. Suicidal ideation;
  2. Self-harm and injury;
  3. The introduction of harmful ideas that could trigger eating disorders or other physical and psychological harm; and
  4. Feelings of social isolation, which can escalate to depression or exacerbate existing mental health problems.

Both Coleman and Garcia have argued that chatbots are often defective to the point that they have a measurably adverse impact on children’s mental health. More importantly, they claim that these companies knew their products were dangerous but failed to implement safeguards because any restriction could come at the cost of profitability.

The Potential Defendants In An AI Chatbot Lawsuit

Although there is no set legal standard for pressing a claim against artificial intelligence companies, almost all of the biggest names in the industry are currently facing outstanding legal claims related to self-harm and teen suicide. Any company, no matter its size or the resources it has at its disposal, could be named as a defendant in a chatbot lawsuit. These include, but are not limited to, the following:

  1. ChatGPT (made by OpenAI);
  2. Claude (Anthropic);
  3. DeepSeek (DeepSeek);
  4. Grok (xAI); and
  5. Character.AI (Character Technologies).

Most AI companies, including these five, have introduced platform policies intended to limit the proliferation of illegal content and restrict age-inappropriate material. However, children can often circumvent these restrictions without much effort. Furthermore, even explicit models like those offered by Character.AI are rarely locked behind any effective age-verification mechanism, making it easy for kids to access material and engage in conversations that would be illegal if they were speaking to anyone other than an algorithm.

HAVE YOU OR A LOVED ONE SUSTAINED SERIOUS PHYSICAL OR PSYCHIATRIC
INJURIES AFTER BEING ENCOURAGED TO SELF-HARM
BY AN AI CHATBOT?

CALL JED DIETRICH, ESQ., AND HIS TEAM OF HIGHLY QUALIFIED SYRACUSE, NEW YORK, PERSONAL INJURY ATTORNEYS AT 1-866-529-5334 NOW TO OBTAIN THE HELP THAT YOU NEED!

3 Steps To Take Before Taking Your AI Self-Harm Lawsuit To Court

If you or a loved one has suffered physical or psychiatric injuries as a result of a defective chatbot, you could be entitled to significant compensation. However, negotiating a settlement or obtaining a court-ordered award almost always necessitates intensive preparation. Artificial intelligence is being closely scrutinized, but the multibillion-dollar organizations that control products like Character.AI, ChatGPT, and Grok are used to pushing back against regulators. They don’t want to risk setting a precedent that could cut into their profits.

As a general rule, anyone considering taking legal action against an AI company should:

  1. Preserve any relevant evidence. You can only win a personal injury lawsuit or wrongful death claim if you have compelling evidence to show that the defective chatbot caused a serious physical injury. This may mean accessing your child’s accounts to review and save chat logs, including conversations that may seem irrelevant to your claim. If you do not have your child’s login credentials, then you may need an attorney’s help to serve proper notice to the defendant’s legal counsel.
  2. Avoid discussing your case on social media. If you think your child was injured because of interactions they had with an AI chatbot, you may feel compelled to warn others. However, if you are planning to file a lawsuit, you need to watch what you say (and where you say it). Anything you say, even if it seems innocent, could be used against you in court.
  3. Seek a second opinion before accepting a settlement. If you have strong evidence that your child was encouraged or instructed to self-harm by a chatbot, you could receive an offer of settlement earlier than you expect. These early-stage settlements often seem generous, but they rarely account for the full range of damages you could secure through a lawsuit. Always get a second opinion before accepting. Talking to a personal injury lawyer costs nothing, and it could help prevent you from being coerced into taking a bad deal.

Standing up to a big Silicon Valley company is not easy, but you do not have to do it alone.

A recognized U.S. News & World Report Best Law Firm, we know what it takes to stand up to companies that do not always have our clients’ best interests at heart. Over the past 20 years, we have helped our clients in Syracuse and across the state recover more than $250 million in damages. We could help you, too. Please send us a message online or call us today at 1-866-529-5334 to speak to an AI chatbot teen suicide lawyer near Syracuse and schedule your 100% free, no-obligation consultation as soon as possible.


Call the Dietrich Law Firm P.C. immediately at 1-866-529-5334 so that our aggressive, tenacious, and hardworking personal injury lawyers can fight to obtain the best result for your personal injury claim in Syracuse, New York. We are available 24 hours a day, 7 days a week, and there is never a fee until we WIN for you!


Client Reviews
★★★★★
I am a medical doctor and have worked with many of the best lawyers in Buffalo and I can say without question that Jed Dietrich is the only lawyer I would trust with my injury case in Buffalo New York. B.O.
★★★★★
Dogged, Determined, and Dead-set on getting you the Maximum settlement for your injuries! T.F.
★★★★★
No one will work harder, smarter or better; I have retained Jed and he obtained the best result for my case. D.P.
★★★★★
The definition of an "A" type personality-exactly who I would want to represent me in a serious personal injury case. E.S.
★★★★★
Jed is a Master in the courtroom without an equal. S.C.