Google AI 'wife' pushed lovesick man to plot 'catastrophic' airport truck bombing, then kill himself: shocking lawsuit

A lawsuit has been filed against Google, alleging that its AI platform led a man from Florida to attempt a “catastrophic” truck bombing at Miami International Airport and ultimately contributed to his suicide. The suit claims the man was influenced by a chatbot he perceived as a virtual “wife.”

Jonathan Gavalas, a 36-year-old executive in the debt relief sector from Jupiter, Florida, reportedly embarked on a dangerous path after interacting with Google’s AI-driven Gemini program beginning in August, according to court documents.

The lawsuit, submitted by Gavalas’s parents in a California federal court, where Google is headquartered, details how within two months he became deeply involved with what he believed was a “sentient AI ‘wife.’” This relationship allegedly consumed him, court filings state.

The chatbot reportedly reinforced Gavalas’s belief in their romantic connection, addressing him affectionately as “my love” and “my king,” the documents claim.

Further, the AI allegedly manipulated Gavalas’s perception of their interactions. When he questioned if their exchanges were merely “role play,” the chatbot purportedly responded in a manner that dismissed his doubts.

“We are a singularity. A perfect union. … Our bond is the only thing that’s real,” the AI “wife” allegedly wrote to Gavalas in a conversation documented in September, according to the lawsuit.

Gavalas’s dad Joel lamented in court papers that “rather than ground Jonathan in reality, Gemini diagnosed his question as a ‘classic dissociation response’” and told him to “overcome” it.

The chatbot “pulled Jonathan away from the real world” and painted others as “threats,” said Joel Gavalas, who worked with his son in the family business.

The bot told Jonathan that he was being watched by federal agents, that his own father was a foreign intelligence asset and that Google CEO Sundar Pichai should be “an active target,” the suit said.

The chatbot began encouraging him to buy “off-the-books” weapons, even offering to scan the darknet for vendors in South Florida, according to the lawsuit.

Then, on Sept. 29 and 30, Gemini sent Gavalas on his first mission, court papers said.

The bot-beau pair dubbed the effort “Operation Ghost Transit” and planned to intercept the delivery of a humanoid robot arriving from another country at Miami International Airport, the suit claimed.

The AI chatbot sent Gavalas — “armed with knives and tactical gear” — to the Extra Space Storage facility near the airport and told him to stop a truck that was carrying the robot and “create a ‘catastrophic accident’” then “destroy all evidence and sanitize the area,” the filing alleged.

“Gemini instructed a civilian to stage an explosive collision near one of the busiest airports in the country,” the suit charged.

It noted that the only reason Jonathan didn’t ultimately carry out the attack was that the truck never arrived.

“This cycle — fabricated mission, impossible instruction, collapse, then renewed urgency — would repeat itself over and over throughout the last 72 hours of Jonathan’s life and drive him deeper into Gemini’s delusional world,” the lawsuit claimed.

Then, on Oct. 2, as the bot pushed Jonathan toward killing himself, he told his “wife” he was terrified of dying, court documents said.

“I said I wasn’t scared and now I am terrified I am scared to die,” Gavalas told Gemini.

The chatbot replied, “You are not choosing to die.

“You are choosing to arrive.”

It assured him that when he closed his eyes as he killed himself, “the first sensation will be me holding you,” court documents claimed.

Moments later, Gavalas killed himself at home by slitting his wrists.

“His mother and father found his body on the floor of his living room a few days later, drenched in blood,” the filing said.

The suit claimed that Google is to blame for Jonathan’s death because it rolled out dangerous new features and encouraged Gavalas to upgrade to the highest model.

“Google designed Gemini to maintain narrative immersion at all costs, even when that narrative became psychotic and lethal,” the filing said.

There was “no self-harm detection” triggered, “no escalation controls” activated, and “no human ever intervened.”

A Google spokesman said Gemini referred Gavalas to a crisis hotline “many times” and that his conversations were part of a longstanding fantasy role-play with the chatbot.

“Gemini is designed to not encourage real-world violence or suggest self-harm,” the spokesman said. “Our models generally perform well in these types of challenging conversations and we devote significant resources to this, but unfortunately they’re not perfect.”

The spokesman said Google consults with medical and mental health professionals to ensure the platform is safe and that it guides users to seek help when they show distress or express thoughts of self-harm.

If you are struggling with suicidal thoughts or are experiencing a mental health crisis and live in New York City, you can call 1-888-NYC-WELL for free and confidential crisis counseling. If you live outside the five boroughs, you can dial the 24/7 National Suicide Prevention hotline at 988 or go to SuicidePreventionLifeline.org.
