
AI gets it wrong: ChatGPT diet advice lands New York man in hospital

Doctors diagnosed him with bromism


In a stark warning about the dangers of blindly following AI-generated health advice, a 60-year-old New York man was hospitalised with severe hallucinations and a rare form of poisoning.


The man had reportedly replaced sodium chloride (common salt) in his diet with sodium bromide, a compound commonly used for industrial and cleaning purposes, after reading about salt's negative health effects and acting on dietary recommendations from ChatGPT.


Soon after, he developed symptoms including confusion and hallucinations, which eventually progressed to psychosis. Doctors diagnosed him with bromism.

The man had no prior psychiatric or medical history, but during the first 24 hours of his hospitalisation he grew increasingly paranoid and experienced auditory and visual hallucinations. He was reportedly very thirsty, yet paranoid about the water he was offered.

The man was treated with fluids and electrolytes and became medically stable, allowing him to be admitted to the hospital's inpatient psychiatry unit. He spent three weeks there before he was well enough to be discharged.


The incident underscores the critical need for professional medical consultation before making significant changes to one’s diet based on online information.
