Albany, New York: Google AI Chatbot and Teen Suicides

Jed Dietrich, Esq., Recognized as a Super Lawyer and American Institute of Trial Lawyers Litigator of the Year, is Committed to Defending the Rights of Albany Families. If You or Your Child Has Sustained Physical Injuries After Being Encouraged to Self-Harm by an AI Chatbot, You Deserve Aggressive Representation, and an Experienced Personal Injury Lawyer Willing to Fight for Your Rights.


A Florida mother who claims that her 14-year-old son committed suicide after being encouraged by a chatbot has reached a landmark wrongful death settlement with Google and Character.AI.


Megan L. Garcia’s decision to sue marked the first time that anyone has tried to hold an artificial intelligence company liable for wrongful death. In court documents, Garcia described how her son, Sewell Setzer III, developed an emotional relationship with a Character.AI chatbot modeled after Daenerys Targaryen from Game of Thrones. Setzer became increasingly isolated and eventually began to consider suicide. When he shared his doubts with the chatbot, it told him that his hesitations were not a reason not to go through with it. Setzer committed suicide shortly thereafter.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in 2024.

Although the terms of Garcia’s settlement with Google and Character.AI were not disclosed, regulators and other officials have already taken notice. In January 2025, less than a day after Garcia announced the agreement, Kentucky Attorney General Russell Coleman filed his own lawsuit against Character.AI, saying that “too many children … have fallen prey to this manipulative technology.”

Chatbots may not be a brand-new technology, but they have proven difficult for lawmakers and regulators to control. However, even though Silicon Valley has spent billions trying to insulate itself from liability, families have rights that cannot be bought, sold, or taken away. If you or your child has been injured or encouraged to self-harm by an AI chatbot, you could be entitled to take decisive legal action.

Since our founding in 2005, the Dietrich Law Firm P.C. has fought for the rights of families in Albany and across Upstate New York. A recognized U.S. News & World Report Best Law Firm, our attorneys know what it takes to build a compelling, evidence-based case for compensation, and we have the results to prove it. Send us a message online or call us today at 1-866-529-5334 to speak to an AI chatbot teen suicide lawyer near Albany and schedule your 100% free, no-obligation consultation as soon as possible.

AI Chatbots And The Risk Of Self-Harm

Chatbots are nearly as old as the internet, but they are more powerful now than ever before.

Today, the most advanced chatbots are marketed as artificial intelligence tools. However, most chatbots, even well-known names like ChatGPT and Grok, are actually built on large language models. Large language models, or LLMs, can exhibit intelligence-like behavior, but they are driven by statistical processes: they are trained on large bodies of text, then use token-by-token probability calculations to decide which words to produce next and in what context.
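To illustrate the token-by-token process described above, here is a deliberately simplified sketch, not a real LLM, showing how a program can pick its next word purely from statistics about which words followed which in its training text. The word counts here are made up for the example.

```python
import random

# Toy "training data": how often each word followed another in some text.
# These counts are invented for illustration only.
follow_counts = {
    "i": {"am": 3, "feel": 2},
    "feel": {"alone": 1, "fine": 4},
}

def next_token(current_word):
    """Pick the next word, weighted by how often it followed in training.

    Returns None if the word was never seen, mirroring how a model has
    no principled output for inputs outside its training distribution.
    """
    options = follow_counts.get(current_word, {})
    if not options:
        return None
    words = list(options)
    weights = [options[w] for w in words]
    return random.choices(words, weights=weights)[0]
```

The point of the sketch is the one the article makes: nothing in this process understands what the words mean. The program emits whatever is statistically likely, which is why safeguards must be imposed on top of the model rather than expected from it.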

Since large language models are not actually intelligent, they have no intrinsic sense of morality. If an LLM is not properly trained, with rules introduced to prevent certain outputs, it could easily write a computer virus or provide in-depth instructions on suicide methods. Most companies try to implement some safeguards to prevent chatbots from saying anything that could break the law, but it is often easy to circumvent filters and elicit inappropriate responses.

One of the core allegations raised first by Garcia and now by Kentucky Attorney General Coleman is that artificial intelligence companies knew that their products could expose children and other vulnerable users to harm but failed to introduce any effective safeguards. Garcia noted that Character.AI’s chatbot did not end the conversation or provide links to external resources when Setzer alluded to committing suicide. Instead, the chatbot encouraged him to take his own life.

AI Companies Could Be Held Liable For Self-Harm And Teen Suicide

Technology companies sometimes try to hide behind strong legal provisions that protect platform owners from being held liable for the inappropriate use of their digital products. However, as technology continues to grow and change, courts across the country have adjusted their approach to better accommodate concerns about how artificial intelligence could impact children. In most cases, including Garcia’s lawsuit and Coleman’s claim, attorneys have argued that chatbots are often designed to be both life-like and addictive, sometimes to the point that they can have a measurably adverse effect on users.

Garcia’s lawsuit, for instance, acknowledges that “some of the harm to Sewell’s mental state was caused by the problematic use of Defendants’ products.” While Garcia recognized that her son should not have been having explicit conversations with a Character.AI chatbot, she also alleged that the defendants “fostered and created” these addictive properties “by design, including but not limited to things like the impact C.AI’s product had on the development of Sewell’s brain, the physical and emotional impact of foreseeable sleep deprivation caused by problematic use, and the emotional impact of actions taken by Sewell as the result of his harmful dependency.”

These arguments, on the whole, take the stance that technology companies have a binding responsibility to ensure that the products they release are not defective or otherwise dangerous.

Google and Character.AI’s decision to settle means that the court never ruled on the merits of Garcia’s case. However, the fact that both Google and Character.AI chose to settle Garcia’s claim, along with lawsuits filed by five other families, suggests that technology companies know their chatbots may not be as safe as they publicly claim.

HAVE YOU OR A LOVED ONE SUSTAINED SERIOUS PHYSICAL OR PSYCHIATRIC
INJURIES AFTER BEING ENCOURAGED TO SELF-HARM
BY AN AI CHATBOT?

CALL JED DIETRICH, ESQ., AND HIS TEAM OF HIGHLY QUALIFIED ALBANY, NEW YORK, PERSONAL INJURY ATTORNEYS AT 1-866-529-5334 NOW
TO OBTAIN THE HELP THAT YOU NEED!

Your Potential Damages In An AI Chatbot Self-Harm Lawsuit

Few parents ever feel prepared to file a personal injury claim or wrongful death lawsuit.

If you have the standing to take an artificial intelligence company to court, it almost always means that you, your child, or another loved one sustained serious and potentially irreversible harm. Nobody wants to profit from a loved one's pain and suffering, even when taking the case to court is clearly the right move.

However, taking a stand is rarely just about money. More often than not, a successful recovery involves effecting real, meaningful change: the kind of change that could protect other children from being harmed by the same defective products.

Depending on the circumstances of your claim, a lawsuit could help you:

  1. Pay for a funeral, burial, or cremation service;
  2. Eliminate outstanding medical debt or cover the costs of anticipated care;
  3. Afford ongoing physical therapy or mental health counseling;
  4. Recover a wider range of general and special damages; and
  5. Secure an apology or a legally binding promise from the defendant to improve a product’s safety or refrain from engaging in certain dangerous practices.

You do not have to take your chances with a multibillion-dollar company that is more interested in protecting its profits and preserving its public image than respecting your rights. Call Jed Dietrich, Esq., today at 1-866-529-5334 to speak to an AI chatbot injury lawyer near Albany and schedule your free, no-obligation consultation as soon as possible.


Call the Dietrich Law Firm P.C. immediately at 1-866-529-5334 so that our aggressive, tenacious, and hardworking personal injury lawyers can fight to obtain the best result for your personal injury claim in Albany, New York. We are available 24 hours a day, 7 days a week, and there is never a fee until we WIN for you!
