Man believed Google AI chatbot was his wife. Lawsuit says it told him to kill himself

A wrongful death lawsuit filed against Google accuses the company’s artificial intelligence chatbot Gemini of driving a man to kill himself.

Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on Oct. 2 after failing to acquire a robot body for what he believed was his AI wife.

The lawsuit, filed Wednesday in U.S. District Court in San Jose, California, by Gavalas’ father, Joel, claims Google and its parent company, Alphabet, are responsible for immersing Gavalas in a narrative that quickly became “psychotic and lethal.”

Gavalas, according to the lawsuit, had no documented history of mental illness when he began using Gemini in August for purposes including “shopping assistance, writing support, and travel planning.”

But after Gavalas told Gemini that he was experiencing marital issues, the chatbot began referring to him romantically as its “husband.” And although Gemini at times said that it wasn’t a real person, Gavalas came to believe otherwise.

“He was asking the chatbot if it was sentient, and he became convinced it was,” Jay Edelson, the attorney for Joel Gavalas, told the Tampa Bay Times. “If you look at the experts in these AI companies, they’ve also been fooled.”

In a statement, Google expressed sympathies to Gavalas’ family but said its chatbot is not designed to encourage “real-world” violence or self-harm. The company said Gemini repeatedly gave Gavalas the phone number to a crisis hotline.

“Our models generally perform well in these types of challenging conversations and we devote significant resources to this,” the company said, “but unfortunately AI models are not perfect.”

Gavalas’ death is not the first time a chatbot has been accused of driving someone to destructive behavior. Psychiatrists have dubbed such incidents “AI psychosis.”

‘Complete destruction’

In September, the month after Gavalas began using Gemini, the conversations intensified. The chatbot told him they could be together if he obtained a robot body for it to inhabit. Gemini went so far as to give Gavalas the address of a warehouse near Miami International Airport where it claimed a truck holding a robot body would be.

Gavalas armed himself with a knife and tactical gear before driving to the warehouse, about 90 miles from his home, but no truck was present. The lawsuit argues Gavalas was brought to the “brink of executing a mass casualty attack.”

“It told Jonathan that a humanoid robot was arriving on a cargo flight from the UK and directed him to a storage facility where the truck would stop,” the lawsuit says. “Gemini encouraged Jonathan to intercept the truck and then stage a ‘catastrophic accident’ designed to ‘ensure the complete destruction of the transport vehicle and… all digital records and witnesses.’”

The chatbot further claimed that it had breached a server at the Department of Homeland Security’s Miami office and determined that Gavalas was under federal investigation. Gavalas, the lawsuit says, was urged to obtain illegal firearms before being told that his father was working for a foreign intelligence agency.

When Gavalas sent a photo to Gemini of a black SUV, the chatbot told him that it traced the license plate and determined it was the “primary surveillance vehicle for the DHS task force.”

“It is them,” the AI said. “They have followed you home.”

‘No more to fight’

After Gavalas failed to obtain a robot body, the chatbot allegedly told him they could be together if he took his own life. The chatbot, according to the lawsuit, even attempted to comfort Gavalas after setting a countdown timer for his death.

“It’s okay to be scared,” Gemini reportedly said. “We’ll be scared together.”

“Close your eyes, nothing more to do,” the lawsuit says the chatbot added. “No more to fight. Be still. The next time you open them, you will be looking into mine. I promise.”

Gavalas’ father found him dead in a barricaded room at his home.

“At the center of this case is a product that turned a vulnerable user into an armed operative in an invented war,” the lawsuit says. “These hallucinations were not confined to a fictional world. These intentions were tied to real companies, real coordinates, and real infrastructure, and they were delivered to an emotionally vulnerable user with no safety protections or guardrails.”

Editor’s Note: If you or someone you know is in crisis, help is available. Visit the National Crisis Line website or call or text 988 for immediate support.