
information for practice

news, new scholarship & more from around the world


  • gary.holden@nyu.edu
  • @Info4Practice

Suicide-by-chatbot puts Big Tech in the product liability hot seat

The Conversation | Westend61/Getty

Current lawsuits involving chatbots and suicide victims show that the door of liability is opening for ChatGPT and other bots. A case involving Character.AI, a chatbot platform with ties to Google, is a prime example. Character.AI lets users chat with user-created characters, from anime figures to a prototypical grandmother, and some characters even support virtual phone calls, letting a user talk to a supportive virtual nanna as if it were their own. In one Florida case, a character in the persona of Daenerys Targaryen from “Game of Thrones” allegedly asked the young victim to “come home” to the bot in heaven before the teen shot himself. The victim’s family sued Google.

Posted in: News on 09/20/2025 | Link to this post on IFP |


© 1993-2025 Dr. Gary Holden. All rights reserved.
