
A lot of people talk to ChatGPT like it’s their best friend, therapist, or life coach. But here’s the thing—it’s not. And OpenAI CEO Sam Altman says you should really think twice before sharing personal stuff with it.
In a podcast with comedian Theo Von, Altman got real. He said people, especially young ones, often open up to ChatGPT about emotional or sensitive issues. But unlike conversations with real doctors or therapists, those chats come with no legal protection.

“People talk about the most personal sh** in their lives to ChatGPT,” Altman said.
That’s not just a comment—it’s a warning.
No Legal Safety Net
When you speak to a therapist, those conversations are protected by law. But with ChatGPT, they’re not. If a court asks for your chat data, OpenAI might have to hand it over. That could mean your most private thoughts are not private at all.
Altman called it out himself.
“We should have the same concept of privacy for your conversations with AI that we do with a therapist,” he said.
Right now, that kind of privacy doesn’t exist. And that’s a big deal.
Why It Matters
People often use ChatGPT for support—relationship advice, family issues, even mental health talks. But Altman admits this legal grey area might stop users from opening up. And that’s a real concern.
OpenAI is already facing pressure. In its legal fight with The New York Times, it's appealing a court order that would force it to preserve user chat data, data that could then be handed over in court. Millions of users could be affected.
Important note: that order does not apply to ChatGPT Enterprise users, who get stronger privacy protections.
Big Tech and Your Secrets
Tech companies, including OpenAI, get asked to hand over user data all the time, whether through court orders or law enforcement requests, and they often have to comply.
Altman even pointed out how people got more serious about privacy after Roe v. Wade was overturned. Many switched to encrypted apps to keep their health data safe. It's the same idea here. Be careful what you share, especially in AI chats.
ChatGPT is a tool. It’s smart, helpful, and fun to talk to. But it’s not a doctor, a therapist, or your best friend. And unless the laws catch up, it’s better to stay cautious.
Don’t treat AI like a diary. Sam Altman himself says so.