OpenAI CEO Sam Altman warns ChatGPT users: AI isn’t covered by doctor-patient confidentiality.
Altman spoke on Theo Von’s podcast, calling out the legal gaps around privacy for sensitive talks with AI. Therapy chats? Relationship problems? Life coaching? They have zero legal protections so far.
The issue stems from the absence of clear rules for AI conversations. Unlike talks with a therapist or lawyer, ChatGPT chats can be subpoenaed: if OpenAI is hit with a legal order, it must hand over those private conversations.
Altman said this could scare users away, especially since OpenAI is already fighting a court order: The New York Times is demanding that OpenAI preserve and turn over chats from hundreds of millions of users, excluding ChatGPT Enterprise.
OpenAI calls the request “an overreach” and is appealing. But the threat is real: user data could be forced out for lawsuits or law enforcement.
The problem worsened after Roe v. Wade was overturned, when people turned to apps with stronger privacy protections, such as Apple Health, to shield personal health data. Altman advises users to hold off on sharing sensitive information with ChatGPT until privacy law catches up.
In Altman's own words from the podcast:
“People talk about the most personal sh** in their lives to ChatGPT.”
“People use it — young people, especially, use it — as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’”
“Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.”
“I think that’s very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever — and no one had to think about that even a year ago.”
“I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot — like the legal clarity.”
Watch the full episode here.