AI gets it wrong: ChatGPT diet advice lands New York man in hospital

Doctors diagnosed him with bromism

In a stark warning about the potential dangers of blindly following AI-generated health advice, a 60-year-old New York man was hospitalised with severe hallucinations and a rare form of poisoning.


After reading about the negative health effects of sodium chloride (common salt), the man had reportedly replaced it in his diet with sodium bromide, a compound commonly used for industrial and cleaning purposes, acting on dietary recommendations from ChatGPT.

He soon began developing symptoms such as confusion and hallucinations, and eventually psychosis. Doctors diagnosed him with bromism.


The man had no past psychiatric or medical history, but during the first 24 hours of his hospitalisation, he exhibited increasing paranoia along with auditory and visual hallucinations. He was reportedly very thirsty but paranoid about the water he was offered.

The man was treated with fluids and electrolytes and became medically stable, allowing him to be admitted to the hospital’s inpatient psychiatry unit. He spent three weeks in the hospital before he was well enough to be discharged.


The incident underscores the critical need for professional medical consultation before making significant changes to one’s diet based on online information.
