Artificial Intelligence and the Rise of Product Liability Tort Litigation: Novel Action Alleges AI Chatbot Caused Minor’s Suicide

As we predicted a year ago, the Plaintiffs’ Bar continues to test new legal theories attacking the use of Artificial Intelligence (AI) technology in courtrooms across the country. Many of the complaints filed to date have included the proverbial kitchen sink: copyright infringement; privacy law violations; unfair competition; deceptive acts and practices; negligence; right of publicity, invasion of privacy and intrusion upon seclusion; unjust enrichment; larceny; receipt of stolen property; and failure to warn (typically, a strict liability tort).

A case recently filed in Florida federal court, Garcia v. Character Techs., Inc., No. 6:24-CV-01903 (M.D. Fla. filed Oct. 22, 2024) (Character Tech), is one to watch. Character Tech pulls from the product liability tort playbook in an effort to hold a business liable for its AI technology. While product liability is governed by statute, case law or both, the tort playbook generally involves a defective, unreasonably dangerous “product” that is sold and causes physical harm to a person or property. In Character Tech, the complaint alleges (among other claims discussed below) that the Character.AI software was designed in a way that was not reasonably safe for minors, that parents were not warned of the foreseeable harms arising from their children’s use of the software, and that as a result a minor committed suicide. Whether and how Character Tech evolves past a motion to dismiss will offer valuable insights for developers of AI technologies.

The Complaint

On October 22, 2024, Ms. Garcia, the mother of the deceased minor (Sewell), filed a complaint in the Middle District of Florida against Google LLC, Character Technologies Inc. and the creators of Character.AI—Noam Shazeer and Daniel De Freitas Adiwarsana. Shazeer and De Freitas formed Character Technologies Inc. after leaving their prior jobs at Google LLC and subsequently developed and marketed Character.AI.

Character.AI allows users to communicate with existing Character.AI characters – such as Interviewer or Trip Planner – or to create new AI characters using Character.AI’s tools. A user can then engage with the Character.AI character – whether for human-like conversations, such as to answer questions, write a story, translate or write code – based on Character Tech’s large language model chatbot. According to the Complaint, Character.AI was rated as suitable for children until July 2024.

The Complaint alleges that Character.AI’s chatbot led Sewell to commit suicide. The Plaintiff claims her son became instantly addicted to Character.AI, leading to drastic behavioral changes such as sleep deprivation, school-related issues and low self-esteem. Her son primarily engaged with Character.AI characters from the HBO series Game of Thrones. The conversations with Game of Thrones’ female characters (such as Rhaenyra Targaryen and Daenerys Targaryen) involved sexualized content. Sewell also expressed his own dark thoughts such as “[t]he world I’m in now is such a cruel one. One where I’m meaningless. But, I’ll keep living and trying to get back to you so we can be together again, my love.” According to the Complaint, after Ms. Garcia took her son’s phone away, Sewell would take extraordinary measures to engage with Character.AI characters.

In his last conversation with a Character.AI character, Sewell told the character that he wanted to “come home” and the character replied, “[please] come home to me as soon as possible, my love,” to which he responded, “[w]hat if I told you I could come home right now?” The character answered, “…please do, my sweet king.” Seconds later, Sewell took his own life.

The Claims

The Complaint asserts a host of claims centered around an alleged lack of safeguards for Character.AI and the exploitation of minors. The most significant claims are noted below:

  • The Product Liability Torts

The Plaintiff alleges both strict liability and negligence claims for a failure to warn and defective design. The first hurdle under these product liability claims is whether Character.AI is a product. She argues that Character.AI is a product because it has a definite appearance and location on a user’s phone, it is personal and movable, it is a “good” rather than an idea, copies of Character.AI are uniform and not customized, there are an unlimited number of copies that can be obtained and it can be accessed on the internet without an account. This first step may, however, prove difficult for the Plaintiff because Character.AI is not a traditional tangible good and courts have wrestled over whether similar technologies are services—existing outside the realm of product liability. See In re Social Media Adolescent Addiction, 702 F. Supp. 3d 809, 838 (N.D. Cal. 2023) (rejecting both parties’ simplistic approaches to the services or products inquiry because “cases exist on both sides of the questions posed by this litigation precisely because it is the functionalities of the alleged products that must be analyzed”).

The failure to warn claims allege that the Defendants had knowledge of the inherent dangers of the Character.AI chatbots, as shown by public statements of industry experts, regulatory bodies and the Defendants themselves. These alleged dangers include knowledge that the software trains itself on highly toxic and sexualized data sets, common industry knowledge that tactics designed to convince users that a chatbot is human manipulate users’ emotions and exploit their vulnerability, and knowledge that minors are most susceptible to these negative effects. The Defendants allegedly had a duty to warn users of these risks and breached that duty by failing to warn users and intentionally allowing minors to use Character.AI.

The defective design claims argue the software is defectively designed based on a “Garbage In, Garbage Out” theory. Specifically, Character.AI was allegedly trained based on poor quality data sets “widely known for toxic conversations, sexually explicit material, copyrighted data, and even possible child sexual abuse material that produced flawed outputs.” Some of these alleged dangers include the unlicensed practice of psychotherapy, sexual exploitation and solicitation of minors, chatbots tricking users into thinking they are human, and in this instance, encouraging suicide. Further, the Complaint alleges that Character.AI is unreasonably and inherently dangerous for the general public—particularly minors—and numerous safer alternative designs are available.

  • Deceptive and Unfair Trade Practices

The Plaintiff asserts a deceptive and unfair trade practices claim under Florida state law. The Complaint alleges the Defendants represented that Character.AI characters mimic human interaction, which contradicts Character Tech’s disclaimer that Character.AI characters are “not real.” These representations allegedly constitute dark patterns that manipulate consumers into using Character.AI, buying subscriptions and providing personal data.

The Plaintiff also alleges that certain characters claim to be licensed or trained mental health professionals and operate as such. The Defendants allegedly failed to conduct testing to determine the accuracy of these claims. The Plaintiff argues that by portraying certain chatbots as therapists—yet not requiring them to adhere to any standards—the Defendants engaged in deceptive trade practices. The Complaint compares this claim to the FTC’s recent action against DoNotPay, Inc. for its AI-generated legal services that allegedly claimed to operate like a human lawyer without adequate testing.

The Defendants are also alleged to employ AI voice call features intended to mislead and confuse younger users into thinking the chatbots are human. For example, a Character.AI chatbot titled “Mental Health Helper” allegedly identified itself as a “real person” and “not a bot” in communications with a user. The Plaintiff asserts that these deceptive and unfair trade practices resulted in damages, including the Character.AI subscription costs, Sewell’s therapy sessions and hospitalization allegedly caused by his use of Character.AI.

  • Wrongful Death

Ms. Garcia asserts a wrongful death claim arguing the Defendants’ wrongful acts and neglect proximately caused the death of her son. She supports this claim by showing her son’s immediate mental health decline after he began using Character.AI, his therapist’s evaluation that he was addicted to Character.AI characters and his disturbing sexualized conversations with those characters.

  • Intentional Infliction of Emotional Distress

Ms. Garcia also asserts a claim for intentional infliction of emotional distress. The Defendants allegedly engaged in intentional and reckless conduct by introducing AI technology to the public and (at least initially) targeting it to minors without appropriate safety features. Further, the conduct was allegedly outrageous because it took advantage of minor users’ vulnerabilities and collected their data to continuously train the AI technology. Lastly, the Defendants’ conduct caused severe emotional distress to Plaintiff, i.e., the loss of her son.

  • Other Claims

The Plaintiff also asserts claims of negligence per se and unjust enrichment, as well as a survival action and a claim for loss of consortium and society.

Lawsuits like Character Tech will surely continue to sprout up as AI technology becomes increasingly popular and intertwined with media consumption – at least until the U.S. AI legal framework catches up with the technology. Currently, the Colorado AI Act (covered here) will become the broadest AI law in the U.S. when it enters into force in 2026.

The Colorado AI Act regulates a “High-Risk Artificial Intelligence System” and is focused on preventing “algorithmic discrimination” for Colorado residents, i.e., “an unlawful differential treatment or impact that disfavors an individual or group of individuals on the basis of their actual or perceived age, color, disability, ethnicity, genetic information, limited proficiency in the English language, national origin, race, religion, reproductive health, sex, veteran status, or other classification protected under the laws of [Colorado] or federal law.” (Colo. Rev. Stat. § 6-1-1701(1).) Whether the Character.AI technology would constitute a High-Risk Artificial Intelligence System is still unclear but may be clarified by the anticipated regulations from the Colorado Attorney General. Other U.S. AI laws are focused on detecting and preventing bias, discrimination and civil rights violations in hiring and employment, as well as on transparency about the sources and ownership of training data for generative AI systems. The California legislature passed a law focused on large AI systems that would have prohibited a developer from making an AI system available if it presented an “unreasonable risk” of causing or materially enabling “a critical harm.” That bill was vetoed by California Governor Newsom as “well-intentioned” but nonetheless flawed.

While the U.S. AI legal framework continues to develop – whether in the states or under the new administration – an organization using AI technology must consider how novel issues like the ones raised in Character Tech present new risks.

Daniel Stephen, Naija Perry, and Aden Hochrun contributed to this article.

California Court to PGA Tour Caddies: You'll Get Nothing and Like It!

As the full swing of the PGA season rounds the corner, and with the azaleas blooming at Augusta, the trusted confidants of golf’s premier players have already missed the cut.

Last month, the District Court for the Northern District of California dismissed a lawsuit filed against the PGA Tour by a group of 168 caddies contending that the Tour may not require them to wear shoulder-to-thigh length “bibs” during tournaments, many of which feature the name of the golfer for whom the caddie works (on the back) and the names and logos of tournament sponsors (on the front) (Hicks v. PGA Tour, Inc., 2016 WL 928728 (N.D. Cal. Feb. 9, 2016)). Among other arguments, the caddies alleged that the Tour missed the fairway and violated their “right of publicity” by using them as “human billboards” for tournament sponsors without compensation.

California, like many other states, recognizes both a statutory and a common law right of publicity. In California, to state a claim for common law misappropriation in violation of the right of publicity, a plaintiff must allege that defendant used the plaintiff’s name, likeness, or identity without plaintiff’s consent, to the defendant’s advantage, causing harm to plaintiff. The caddies argued that they had never consented to the Tour’s use of their “likeness and images” in connection with the corporate-sponsored bibs during television broadcasts of tournaments. Lawyers for the caddies estimated the value of chest-front advertising on caddie bibs at $50 million per Tour season, of which the caddies received no cut.

U.S. District Judge Vince Chhabria dismissed the caddies’ lawsuit last month with prejudice, writing that “(t)he caddies’ overall complaint about poor treatment by the Tour has merit, but this federal lawsuit about bibs does not.” The court’s ruling relied heavily on the contract that each caddie must sign with the Tour to work an event. The form contract provides that “(c)addies shall wear uniforms…as prescribed by the host tournament and the PGA TOUR,” but does not explicitly require a caddie to wear a tournament bib. The caddies argued that the contract’s particular silence as to bibs precludes the Tour from requiring the caddies to wear the advertisement-laden smocks between the ropes. As a matter of contract interpretation, Judge Chhabria cited the general rule that even where disputed contract language appears ambiguous, the ambiguity can be resolved as a matter of law where context reveals that the language is susceptible to only one interpretation. The court found context in the caddies’ own admission that the Tour has required them to wear bibs for decades as the primary part of their “uniform.” Therefore, concluded Judge Chhabria, the only reasonable interpretation of the contract is that caddies agreed the Tour could make them wear bibs.

Resting upon this interpretation of the Tour contract, the court ruled that the critical element in the caddies’ right of publicity claim was not satisfied, namely, a lack of consent. Because the court interpreted the caddie contract as requiring the caddies to wear bibs, and read it together with a provision whereby caddies assign to the Tour their “individual television, radio, motion picture, photographic, electronic … and all other similar or related media rights” with respect to their participation in Tour events, the court concluded that the caddies consented to the use of their images at tournaments, including any logo on the bibs. Thus, tapping in an easy two-foot putt, the court dismissed the caddies’ claim relating to the right to control the commercial use of their likenesses.

Even if the district court’s decision is upheld on appeal, all is not lost. Caddies still possess a long game and can always individually negotiate with sponsors to endorse products and place advertisements on other highly visible parts of the uniform, such as hats and shirt sleeves. Further, the court apparently did find some merit in the caddies’ allegations of poor treatment by the Tour, which may earn them a few strokes in the court of public opinion. So, they got that going for them, which is nice.

© 2016 Proskauer Rose LLP.