Man accidentally poisons himself, suffering hallucinations, after following diet recommended by ChatGPT

A man in Washington accidentally poisoned himself after following a diet made by ChatGPT.

The unnamed man, 60, rushed to his local emergency room with suspicions that his neighbor was poisoning him.    

About a day after being admitted to the hospital, he developed paranoia and hallucinations and attempted to escape from the hospital. 

The man later revealed he had several dietary restrictions, including distilling his own water and following an ‘extremely restrictive’ vegetarian diet. 

He told doctors that, after reading about the harms of sodium chloride, or table salt, he asked ChatGPT about eliminating it from his diet. 

The chatbot reportedly advised him it was safe to replace salt with sodium bromide, which was used as a sedative in the early 20th century and is now found in anticonvulsants for dogs and humans. 

He ended up following this recommendation for three months and eventually developed bromism, or bromide poisoning. 

Bromide can accumulate in the body and impair nerve function, a condition called bromism. It leads to confusion, memory loss, anxiety, delusions, rashes and acne, symptoms the man also had. 

A man in Washington gave himself bromide poisoning by following recommendations from ChatGPT (stock image)

Doctors at the University of Washington in Seattle who treated the man replicated his search and received the same incorrect advice.

They warned that the case highlighted ‘how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes.’

They said ChatGPT and other chatbots could ‘generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.’ 

The anonymous case study, published earlier this month in the Annals of Internal Medicine, comes one week after OpenAI claimed one of the chatbot’s newest upgrades could be better at answering health-related questions and ‘flagging potential concerns.’

However, ChatGPT’s guidelines state it is not ‘intended for use in the diagnosis or treatment of any health condition.’

The patient appeared to have used an earlier version of the software. 

After attempting to escape from the hospital, the man was put on an involuntary psychiatric hold and given large amounts of fluids and electrolytes to help flush the bromide out of his system. 

His bromide level was at 1,700 mg/L, while the normal range is between 0.9 and 7.3 mg/L. 

Bromide was used as a sedative in the 19th and 20th centuries and was once widespread in prescription and over-the-counter drugs. However, as research uncovered the risks of chronic exposure, regulators gradually removed bromide-containing products from the US drug supply. 

As a result, cases today remain few and far between. 

The man also reported distilling his own water and maintaining an ‘extremely restrictive’ vegetarian diet (stock image)

The case study comes after OpenAI claimed one of the chatbot’s newest upgrades could be better at answering health-related questions and ‘flagging potential concerns’

The man reported acne and small red growths on his skin, insomnia, fatigue, muscle coordination issues and excessive thirst. 

It took three weeks for his bromide levels to stabilize and for him to be weaned off psychiatric medication before he could be discharged. 

The doctors treating him wrote: ‘While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information.

‘It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.’

They also emphasized that ‘as the use of AI tools increases, providers will need to consider this when screening for where their patients are consuming health information.’ 
