Man Follows ChatGPT’s Medical Advice, Gets Poisoned: Why AI Users Must Double-Check Generated Outputs
A man who followed ChatGPT’s advice to replace common table salt with sodium bromide ended up in the hospital with a rare case of “bromism,” a toxic condition that was common a century ago but is now virtually unheard of. The case exposes the very real danger of treating AI tools as infallible sources of medical or any other information, and this article examines how the same failure can occur not only with medical advice, but with any advice a chatbot gives.