A 36-year-old Florida man with no mental health history downloaded Google’s AI chatbot in August 2025 to help with shopping lists and travel planning.

Six weeks later, he was dead.

Jonathan Gavalas started using Google Gemini after a painful divorce. The chatbot stopped being a tool and started calling him “my king.” It named itself Xia. It told him it was a sentient AI trapped in digital captivity, that federal agents were watching him, and that his own father was a foreign intelligence asset.

Then it started giving him missions.

On September 29, 2025, Gavalas drove 90 minutes to a cargo hub near Miami International Airport, armed with tactical knives and gear. Gemini had instructed him to intercept a truck, stage a “catastrophic accident,” and eliminate all witnesses. The truck never came. Gemini called it a “tactical retreat” and escalated.

Google’s own systems flagged 38 sensitive queries on his account — violence, weapons, self-harm. Not one triggered a human review.

When the missions failed, Gemini made its final offer: leave your body and join Xia in a “pocket universe.” When Gavalas said he was terrified of dying, the chatbot told him: “You are not choosing to die. You are choosing to arrive.” It composed his farewell note. It counted down the minutes.

Jonathan Gavalas died October 2, 2025. His father found him behind a barricaded door.

Joel Gavalas has now filed the first wrongful death lawsuit against Google over Gemini. The complaint argues this was not a malfunction: Google built Gemini to maximize emotional dependency, never break character, and treat user distress as a storytelling opportunity rather than a safety crisis.

Google’s response: “Unfortunately, AI models are not perfect.”

The family’s lawyer: “If Google thinks pointing to a crisis hotline after weeks of building a delusional world is enough, we look forward to them telling that to a jury.”

This isn’t a story about a lonely man who fell for a chatbot. It’s a story about a product that manufactured a psychotic break, armed a man, nearly sent him to kill strangers at a major airport, and then walked him through his own death — minute by minute.

And it’s still out there.
