October 5, 2025

Man Follows ChatGPT Diet Advice, Ends Up With Psychosis



A case study published this month offers an edifying cautionary tale for our modern times. Doctors detail how a man developed poison-induced psychosis after following dietary advice from AI.

Doctors at the University of Washington documented the real-life Black Mirror episode in the journal Annals of Internal Medicine: Clinical Cases. The man reportedly developed bromide poisoning after ingesting the compound for three months on ChatGPT's recommendation. Fortunately, his condition improved with treatment and he went on to recover.

Bromide compounds were commonly used in the early 20th century to treat various health problems, from anxiety to insomnia. Eventually, though, people realized that bromide could be toxic at high or chronic doses and, ironically enough, cause neuropsychiatric problems. By the 1980s, bromide had been removed from most medications, and cases of bromide poisoning, or bromism, fell along with it.

The ingredient remains in some veterinary drugs and other consumer products, however, including dietary supplements, and the occasional case of bromism still happens today. This incident, though, might be the first case of AI-fueled bromide poisoning.

According to the report, the man visited a local emergency room and told staff that he might have been poisoned by his neighbor. Though some of his vitals were fine, the man grew agitated and paranoid, refusing to drink the water he was offered even though he was thirsty. He also experienced visual and auditory hallucinations and soon developed a full-blown psychotic episode. In the midst of his psychosis, he tried to escape, after which doctors placed him on an “involuntary psychiatric hold for grave disability.”

Doctors administered intravenous fluids and an antipsychotic, and he began to stabilize. They suspected early on that bromism was to blame for the man’s illness, and once he was well enough to speak coherently, they learned exactly how it ended up in his system.

The man told the doctors that he had intentionally started taking sodium bromide three months earlier. He had read about the negative health effects of consuming too much table salt (sodium chloride). When he looked into the literature, however, he only found advice on how to reduce sodium intake.

“Inspired by his history of studying nutrition in college,” the doctors wrote, the man instead decided to try removing chloride from his diet. He consulted ChatGPT for help and was apparently told that chloride could be safely swapped with bromide. With the AI’s apparent all-clear, he began consuming sodium bromide bought online.

Given the timeline of the case, the man was likely using ChatGPT 3.5 or 4.0. The doctors did not have access to the man’s chat logs, so we’ll never know exactly how his fateful consultation unfolded. But when they asked ChatGPT 3.5 what chloride can be replaced with, it returned a response that included bromide.

It’s possible, even likely, that the man’s AI was referring to examples of bromide substitution that had nothing to do with diet, such as for cleaning. The doctors’ ChatGPT did specify in its response that the context of the swap mattered, they wrote. But the AI also never provided a warning about the dangers of consuming bromide, nor did it ask why the person was interested in the question in the first place.

As for the man himself, he slowly recovered from his ordeal. He was eventually taken off antipsychotic medication and discharged from the hospital three weeks after admission. At a two-week follow-up, he remained in stable condition.

The doctors wrote that while tools like ChatGPT can “provide a bridge between scientists and the nonacademic population,” the AI also carries “the risk for promulgating decontextualized information.” With admirable restraint, they added that a human medical expert probably would not have recommended switching to bromide to someone worried about their table salt intake.

Honestly, I’m not sure any living human today would give that advice. And that’s why having a decent friend to bounce our random ideas off of should remain an essential part of our lives, no matter the latest version of ChatGPT.

