
Wait, is ChatGPT really banning health advice? Here’s what’s actually happening

If you’ve ever asked ChatGPT to diagnose your weird rash or write your will, you might find this news unnerving. As of late October, OpenAI has officially told its chatbot to stop giving out health advice.


Instead of dishing out tailored advice, ChatGPT will now only explain general principles and remind you to see a real professional. The new policies say users can’t look to OpenAI’s services for “provision of tailored advice that requires a license, such as legal or medical advice, without appropriate involvement by a licensed professional”.

According to NEXTA, OpenAI has reclassified ChatGPT as an “educational tool”, not a consultant. Why? Because regulations, and the fear of lawsuits, are catching up. Big Tech doesn’t want to be blamed if someone gets hurt following AI advice.

That’s probably for the best. Around one in six people use ChatGPT for health advice at least once a month, according to a 2024 KFF survey. In the past, people treated it like a virtual doctor, typing in symptoms and getting everything from caffeine withdrawal to a brain tumor as possible causes. These new limits are meant to stop that panic spiral.


There have also been real-world consequences. One man reportedly developed a psychiatric condition after ChatGPT suggested replacing table salt with sodium bromide, a toxic compound, according to a case report in Annals of Internal Medicine. And OpenAI itself admitted in August that earlier models “fell short in recognizing signs of delusion or emotional dependency”.

The same goes for financial or legal advice. ChatGPT can explain what a 401(k) is or what “power of attorney” means, but it can’t tell you which investments to make or whether your will is valid. Sharing sensitive info like salaries or medical histories with an AI also raises privacy flags — that data could, in theory, be reused for training.

So no, ChatGPT isn’t banning health information — it’s just backing off health advice. You can still ask what serotonin does; it just won’t tell you how to fix yours.

For more like this, like The Tab on Facebook.

