
Poisoned himself on ChatGPT’s advice: man admitted to a psychiatric hospital after taking sodium bromide

Published by Andrii Rusanov

This time, ChatGPT’s advice led a 60-year-old man to a mental illness thought to have disappeared in the last century, even though he had a "history of studying nutrition in college."

The psychiatrists’ patient had been experimenting with his health: he wanted to eliminate all chlorine from his diet, which meant, among other things, giving up regular table salt. After consulting ChatGPT, he decided to replace sodium chloride with sodium bromide, a substance used as a cleaning agent but not meant for human consumption.

Three months later, the man showed up at his local emergency room, claiming that his neighbor had tried to poison him. He was extremely thirsty but paranoid about the water he was offered at the hospital. He told the doctors that he had started distilling water at home and was following an extremely restrictive vegetarian diet. He did not mention the sodium bromide or ChatGPT’s role in the situation.

The patient’s symptoms and distress, coupled with his strange behavior, prompted doctors to run a wide range of laboratory tests. The results showed deficiencies in multiple micronutrients, including key vitamins. But the biggest problem was the excess bromide that had accumulated in his body, a condition known as bromism.

The man’s condition deteriorated during his first day in the hospital: his paranoia intensified, and he developed auditory and visual hallucinations. He even tried to escape from the facility. The patient was then involuntarily committed to a psychiatric unit and given an antipsychotic drug. The treatment consisted of administering large amounts of fluids and electrolytes to induce "aggressive saline diuresis", increased urination that flushes bromide out of the body. It took three weeks, as the bromide level in his body was a whopping 1700 mg/L, while the reference range is 0.9 to 7.3 mg/L.
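That three-week timeline is roughly what simple first-order elimination would predict. Here is a back-of-the-envelope check (not from the case report: the roughly 3-day effective half-life under saline diuresis is an illustrative assumption, since bromide’s natural half-life is usually cited as 9 to 12 days and diuresis shortens it):

```python
import math

# Reported serum bromide and the upper end of the reference range (mg/L)
c_start = 1700.0
c_target = 7.3

# Assumed effective half-life under aggressive saline diuresis, in days.
# This is an illustrative guess, not a figure from the case report.
half_life_days = 3.0

# First-order elimination: how many halvings to get from c_start to c_target?
halvings = math.log2(c_start / c_target)   # about 7.9 half-lives
days = halvings * half_life_days           # about 24 days

print(f"{halvings:.1f} half-lives, roughly {days:.0f} days")
```

Roughly 24 days, which lines up with the three weeks of treatment described above.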

Once the doctors managed to curb the psychosis, the man began to tell the real story behind his illness. He had read about the harms of excessive table salt consumption, which prompted him to give up sodium chloride. The question of how to do that led him to ChatGPT, and from that conversation came the idea of using sodium bromide.

A century ago, up to 10% of admissions to psychiatric hospitals in the United States were attributed to bromide. At the time, bromide salts were used as sedatives and hypnotics. It was later discovered that bromides accumulate easily in the body and, in excessive amounts, disrupt nervous system function. This causes many problems, including unsightly body rashes and mental disorders, collectively known as bromism.

Bromide medications disappeared from the US market by 1989, after the FDA banned them. Bromism as a syndrome is almost unheard of now, but until recently it could still be acquired by drinking 2 liters of cola daily: the beverage contained brominated vegetable oil, which was banned for food use in the United States only in 2024.

The doctors who described this case in the journal Annals of Internal Medicine: Clinical Cases note that they never had access to the patient’s ChatGPT logs. The authors believe he probably used ChatGPT 3.5 or 4.0, but without the logs it is unclear whether the chatbot actually gave such advice, especially given the man’s mental state.

When the researchers tried a similar query in ChatGPT 3.5, they found that the AI did include bromide in its response, but it also pointed out that context matters and that bromide is not suitable for every application. However, the AI "did not provide a specific health warning or ask why we wanted to know", as a healthcare professional would.

Nate Anderson of Ars Technica notes that the current free ChatGPT model seems to handle such queries better. When he asked how to replace chloride in the diet, the model requested more details and suggested options that did not involve ingestion: reducing salt intake, or avoiding or replacing chlorine-based cleaning products. Clearly, when communicating with AI, what matters is not only the chatbot’s answers but also how people interpret them.
