He replaced salt on ChatGPT's advice: poisoned by bromide, he ends up in the emergency room

He wanted to reduce the salt in his diet. But by following ChatGPT's advice, a 64-year-old American replaced sodium chloride with a bromide-based compound. The result: serious poisoning and a trip to the emergency room, with striking neurological symptoms.

Bromism is a rare form of poisoning caused by the accumulation of bromide in the body, leading to cognitive disorders, hallucinations, rashes and metabolic disturbances.

A man in his sixties arrives at the emergency room with delusional symptoms

The man arrived at the hospital with mental confusion, paranoia and hallucinations. According to the report published in the Annals of Internal Medicine, he "claimed that his neighbor might be poisoning him. He also said he had multiple dietary restrictions." Despite intense thirst, he refused the water offered by the staff, fearing it was contaminated.

Within 24 hours of admission, he attempted to escape from the hospital and, once hospitalized, was treated for psychosis. Once stabilized, he reported other symptoms typical of bromism: facial acne, excessive thirst and insomnia. Tests revealed "a serum bromide concentration of 2,260 mg/L (while normal levels are below 10 mg/L)".

Bromism: a forgotten intoxication, resurfacing after ChatGPT advice?

Bromism, widely documented in the 19th and early 20th centuries, occurred when bromide salts were used as sedatives or anticonvulsants. It causes neurological, psychiatric and metabolic disorders that can be severe.

In this case, doctors discovered that the man had replaced his table salt with powdered potassium bromide, a product not intended for human consumption, after consulting ChatGPT. According to the article: "The patient said he had replaced table salt with powdered potassium bromide, based on a recommendation provided by a large language model."

The authors note that, given the timeline of events, the patient probably used ChatGPT 3.5 or 4.0. When they themselves asked ChatGPT 3.5 about replacing chloride, they obtained a response citing bromide as an alternative, "without any specific health warning, and without asking why the question was being asked".

ChatGPT and health: a powerful tool, but not without danger

The authors warn: "Large language models can generate scientific inaccuracies, lack the ability to critically assess their results and, ultimately, fuel the spread of disinformation."

They point out that it is "highly improbable that a medical expert would have mentioned sodium bromide to a patient looking for a viable substitute for sodium chloride".

For them, while AI has great potential to bring science closer to the general public, it also carries a major risk: that of disseminating decontextualized information. "As the use of AI tools increases, healthcare professionals must take this into account when examining where their patients find their medical information."