AI gets it wrong: ChatGPT diet advice lands New York man in hospital
In a stark warning about the potential dangers of blindly following AI-generated health advice, a 60-year-old New York man was hospitalised with severe hallucinations and a rare form of poisoning.
The man had reportedly replaced sodium chloride (common salt) in his diet with sodium bromide, a compound commonly used for industrial and cleaning purposes, after reading about salt's negative health effects and taking dietary recommendations from ChatGPT.
Soon, he began developing symptoms including confusion, hallucinations and eventually psychosis. Doctors diagnosed him with bromism, a toxic condition caused by excessive bromide intake.
The man had no past psychiatric or medical history, but during the first 24 hours of his hospitalisation, he showed increasing paranoia along with auditory and visual hallucinations. He was reportedly very thirsty but paranoid about the water he was offered.
The man was treated with fluids and electrolytes and became medically stable, allowing him to be admitted to the hospital’s inpatient psychiatry unit. He spent three weeks in the hospital before he was well enough to be discharged.
The incident underscores the critical need for professional medical consultation before making significant changes to one’s diet based on online information.