A MAN was hospitalized after consulting an AI chatbot about removing table salt from his diet.
A report in a new medical journal described a case involving a 60-year-old patient who required psychiatric hospitalization after seeking advice from ChatGPT regarding dietary modifications he intended to make.
When he asked what chloride could be swapped with, the bot suggested bromide.
The journal, Annals of Internal Medicine: Clinical Cases, noted, “Inspired by his background in studying nutrition during college, he embarked on a personal experiment to eliminate chloride from his diet.”
He was “surprised” that advice for removing table salt from a diet focused on reducing sodium intake rather than chloride.
For three months, instead of consuming sodium chloride, he swapped it out with sodium bromide.
ChatGPT’s recommendation was “likely for other purposes, such as cleaning,” the article said.
The man, who had no prior psychiatric history, was admitted after becoming convinced that his neighbor was poisoning him.
He told doctors that although he was thirsty, he was scared to drink the water he was given.
After admission, he reported worsening paranoia and hallucinations.
He then tried to escape the hospital, the report says.
Bromism, or bromide toxicity, was more common in the early 20th century, when many medications contained the compound.
The ailment can cause psychiatric and dermatologic symptoms in patients.
After the US Food and Drug Administration phased bromide out of medications, cases of bromism dropped.
Once doctors pieced together the diagnosis, the man was treated and released from the hospital three weeks later.
The journal explained that this situation underscores the potential impact of artificial intelligence (AI) in contributing to avoidable negative health issues.
The trend of AI psychosis
Researchers are studying the growing problem of “AI psychosis.”
This risk arises when individuals become excessively involved in conversations with AI chatbots. While large language models can offer a sense of comfort to those who seek companionship, they can also result in a perception of friendship, romance, and more intricate connections.
The Cognitive Behavior Institute warns that issues arise when users experience feelings of grandiosity, paranoia, detachment, and compulsive interactions with these bots.
The organization stated, “A digital companion is no replacement for professional therapy, and when the boundary between help and obsession becomes unclear, it is crucial for support to be provided by human hands.”
“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”
Doctors couldn’t get access to his chat logs with the bot, and believe he was using either ChatGPT 3.5 or 4.0.
However, when the researchers asked ChatGPT for a chloride substitute, it suggested bromide to them as well.
“As the use of AI tools increases, providers will need to consider this when screening where their patients are consuming health information.”
Just last week, OpenAI announced major advancements to ChatGPT.
They said the new version would be better at answering health questions.