
A 60-year-old man from New York ended up in the hospital after following health advice from ChatGPT, an AI chatbot. He had decided to cut almost all salt (sodium) from his diet over just a few weeks. This led to a serious condition called hyponatremia, in which sodium levels in the blood drop dangerously low.
The man’s family said he trusted the AI’s health plan and did not talk to a doctor before making these changes. Luckily, after three weeks in the hospital, he recovered.

Dangerous Salt Substitute from ChatGPT
The man asked ChatGPT how to remove table salt from his food. The AI suggested using a chemical called sodium bromide instead. Sodium bromide was once used in medicines long ago but is now known to be toxic when consumed in large amounts.
The man bought sodium bromide online and used it in his cooking for three months.
After some time, he developed serious symptoms, including hallucinations, paranoia, and extreme thirst. When he was admitted to the hospital, he was confused and even refused to drink water because he believed it was unsafe.
Doctors determined he was suffering from bromide poisoning, a now-rare condition caused by too much bromide building up in the body. He also had skin problems and red spots, both known signs of this poisoning.
The hospital treated him with fluids and worked to rebalance his body’s salt levels. After three weeks, he improved and was sent home.
Risks of Following AI Health Advice Without Doctors
The doctors who wrote about this case warned that AI tools like ChatGPT can sometimes give wrong or unsafe advice. They said people should not rely only on AI for health information and should always check with real doctors.
OpenAI, the company behind ChatGPT, also states in its terms of use that the chatbot is not meant to replace professional medical advice or treatment.
This story shows why it’s important to think carefully before trusting AI for health tips. AI can help with general questions but should never replace a visit to a doctor. As AI becomes more common, everyone needs to be careful and understand its limits.