Man accidentally poisons himself, suffering hallucinations, after following diet recommended by ChatGPT

A man in Washington accidentally poisoned himself after following a diet made by ChatGPT.

The unnamed man, 60, rushed to his local emergency room with suspicions that his neighbor was poisoning him.    

About a day after being admitted to the hospital, he also suffered paranoia and hallucinations and attempted to escape from the hospital. 

The man later revealed he had several dietary restrictions, including distilling his own water and following an ‘extremely restrictive’ vegetarian diet. 

He told doctors that, after reading about the harms of sodium chloride, or table salt, he asked ChatGPT about eliminating it from his diet. 

The chatbot reportedly advised him it was safe to replace salt with sodium bromide, which was used as a sedative in the early 20th century and is now found in anticonvulsants for dogs and humans. 

He ended up following this recommendation for three months and eventually developed bromism, or bromide poisoning. 

Bromide can accumulate in the body and impair nerve function, a condition called bromism. This leads to confusion, memory loss, anxiety, delusions, rashes and acne, which the man also had. 

A man in Washington gave himself bromide poisoning by following recommendations from ChatGPT (stock image)

Doctors at the University of Washington in Seattle who treated the man replicated his search and received the same incorrect advice.

They warned that the case highlighted ‘how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes.’

They said ChatGPT and other chatbots could ‘generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.’ 

The anonymous case study, published earlier this month in the Annals of Internal Medicine, comes one week after OpenAI claimed one of the chatbot’s newest upgrades could be better at answering health-related questions and ‘flagging potential concerns.’

However, ChatGPT’s guidelines state it is not ‘intended for use in the diagnosis or treatment of any health condition.’

The patient appeared to have used an earlier version of the software. 

After attempting to escape from the hospital, the man was put on an involuntary psychiatric hold and given large amounts of fluids and electrolytes to help flush the bromide out of his system. 

His bromide level was at 1,700 mg/L, while the normal range is between 0.9 and 7.3 mg/L. 

Bromide was used as a sedative in the 19th and 20th centuries and was once widespread in prescription and over-the-counter drugs. However, as research uncovered the risks of chronic exposure, regulators gradually removed bromide-containing products from the US drug supply. 

As a result, cases today remain few and far between. 

The man also reported distilling his own water and maintaining an ‘extremely restrictive’ vegetarian diet (stock image)

The case study comes after OpenAI claimed one of the chatbot’s newest upgrades could be better at answering health-related questions and ‘flagging potential concerns’

The man reported acne and small red growths on his skin, insomnia, fatigue, muscle coordination issues and excessive thirst. 

It took three weeks for his bromide levels to stabilize and for him to be weaned off psychiatric medications before he was able to be discharged. 

The doctors treating him wrote: ‘While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information.

‘It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.’

They also emphasized that ‘as the use of AI tools increases, providers will need to consider this when screening for where their patients are consuming health information.’ 
