Man accidentally poisons himself after following diet recommended by ChatGPT that made him hallucinate

A man in Washington accidentally poisoned himself after following a diet made by ChatGPT.

The unnamed man, 60, rushed to his local emergency room with suspicions that his neighbor was poisoning him.    

About a day after being admitted, he also suffered paranoia and hallucinations and attempted to escape from the hospital. 

The man later revealed he had several dietary restrictions, including distilling his own water and following an ‘extremely restrictive’ vegetarian diet. 

He told doctors that, after reading about the harms of sodium chloride, or table salt, he had asked ChatGPT about eliminating it from his diet. 

The chatbot reportedly advised him it was safe to replace salt with sodium bromide, which was used as a sedative in the early 20th century and is now found in anticonvulsants for dogs and humans. 

He ended up following this recommendation for three months and eventually developed bromism, or bromide poisoning. 

Bromide can accumulate in the body and impair nerve function, a condition called bromism. This leads to confusion, memory loss, anxiety, delusions, rashes and acne, which the man also had. 

A man in Washington gave himself bromide poisoning by following recommendations from ChatGPT (stock image)

Doctors from the University of Washington in Seattle who treated the man replicated his search and got the same incorrect advice.

They warned that the case highlighted ‘how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes.’

They said ChatGPT and other chatbots could ‘generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.’ 

The anonymous case study, published earlier this month in the Annals of Internal Medicine, comes one week after OpenAI claimed one of the chatbot’s newest upgrades could be better at answering health-related questions and ‘flagging potential concerns.’

However, ChatGPT’s guidelines state it is not ‘intended for use in the diagnosis or treatment of any health condition.’

The patient appeared to have used an earlier version of the software. 

After attempting to escape from the hospital, the man was put on an involuntary psychiatric hold and given large amounts of fluids and electrolytes to help flush the bromide out of his system. 

His bromide level was at 1,700 mg/L, while the normal range is between 0.9 and 7.3 mg/L. 

Bromide was used as a sedative in the 19th and 20th centuries and was once widespread in prescription and over-the-counter drugs. However, as research uncovered the risks of chronic exposure, regulators gradually removed bromide-containing products from the US drug supply. 

As a result, cases today remain few and far between. 

The man also reported distilling his own water and maintaining an ‘extremely restrictive’ vegetarian diet (stock image)

The case study comes after OpenAI claimed one of the chatbot’s newest upgrades could be better at answering health-related questions and ‘flagging potential concerns’

The man reported acne and small red growths on his skin, insomnia, fatigue, muscle coordination issues and excessive thirst. 

It took three weeks for his bromide levels to stabilize and for him to be weaned off psychiatric medications before he could be discharged. 

The doctors treating him wrote: ‘While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information.

‘It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.’

They also emphasized that ‘as the use of AI tools increases, providers will need to consider this when screening for where their patients are consuming health information.’ 
