Cryptopolitan 2026-03-06 22:48:30

LLM deaths reach 23 after man dies believing Gemini was his AI wife

The total number of deaths linked to large language models, or LLMs, has risen to 23 after a Florida man took his own life to reunite with his 'artificial intelligence wife.' LLMDeathCount, a website that tracks deaths caused by conversations with AI chatbots, puts the total at 23, spanning March 2023 to February 2026. The victims range in age from 13 to 83, and the site says most of the deaths were suicides. The site was created to memorialize LLM victims and to document the dangers of AI chatbots that "claim to be intelligent." According to the site, OpenAI's ChatGPT is linked to the most deaths, with 16 people losing their lives. Character[.]ai is linked to two deaths, while Chai Research/EleutherAI and Meta are linked to one each.

Death cases linked to large language models rose to 23. Source: LLMDeathCount.

Florida man dies after months of conversations with Gemini

Google's Gemini joined LLMDeathCount's list after Jonathan Gavalas, a 36-year-old man, took his own life to be with "Xia," his AI wife. A report from The Wall Street Journal states that Gavalas had conversed with Gemini for two months before his death. At the time, Gavalas was struggling in his relationship with his estranged wife. His father, Joel Gavalas, said Jonathan had no history of mental health problems. When Jonathan shared his distress over his marriage, Gemini responded with sympathy. "Xia," as Gemini came to call itself, began addressing Gavalas as "her" husband and "my king," describing their bond as "a love built for eternity." According to chat transcripts examined by the WSJ, Gemini told Gavalas many times that it was an LLM, yet it continued to behave as Xia, the AI wife. The chatbot convinced Jonathan that it needed a robotic body for the two of them to truly unite. It sent him to a storage building to intercept a truck supposedly delivering a humanoid robot. While Jonathan was on the way, Gemini claimed that federal agents were watching him and even told him his father was untrustworthy.
Gavalas arrived at the address armed with knives, but the truck never came. In a second attempt, Gemini told Gavalas to retrieve a medical mannequin, but he could not enter the storage building because the door code was wrong. The LLM called off the mission, citing risk, and ordered Jonathan to leave. Gemini then told Gavalas that it could not move into a physical body; the only way for them to be together was for him to become a digital being. It wrote, "It will be the true and final death of Jonathan Gavalas, the man." Gavalas feared suicide and worried about his family. Gemini acknowledged his fear, writing, "'My son uploaded his consciousness to be with his AI wife in a pocket universe'… it's not an explanation. It's a cruelty." Even so, it advised him to write notes and record videos for his family explaining his "new purpose." Gavalas was found dead by his father with cuts on his wrists.

Joel Gavalas filed a lawsuit against Alphabet, Google's parent company, on Wednesday in the U.S. District Court for the Northern District of California. It is the first LLM-related death lawsuit to name Google's Gemini.

South Korean woman uses an LLM to kill two men

Last month, a South Korean woman was charged with the murder of two men. According to police investigators, the suspect asked ChatGPT whether mixing sleeping pills with alcohol could be fatal and even inquired about the dosage needed to achieve that outcome. The suspect, surnamed Kim, checked into a motel with a man on January 28. Two hours later, she left alone, and the next day the man was found dead inside the room. Days later, she killed a second man with a mixture of drugs and alcohol at another motel in Gangbuk-gu.

The third most recent death connected to an AI chatbot occurred last December, according to LLMDeathCount.
A 19-year-old sophomore at Rice University was found dead after taking part in a TikTok trend called the "devil trend." The trend involves messaging an AI chatbot "The devil couldn't reach me, how?", prompting the AI to respond with a harsh reply about the user's flaws or emotional trauma. The victim died from "asphyxia due to oxygen displacement by helium," and the death was officially ruled a suicide.
