
OpenAI CEO Sam Altman has cautioned users against relying on ChatGPT for therapy or emotional support, citing the absence of legal confidentiality protections when using artificial intelligence for personal matters.
During an appearance on Theo Von's podcast, Altman highlighted a significant gap in current legal frameworks governing AI interactions.
He noted that while traditional relationships with doctors, therapists, or lawyers are protected under legal privilege, no such confidentiality applies when engaging with AI systems like ChatGPT.
“People talk about the most personal stuff in their lives to ChatGPT,” Altman said. “Young people, especially, use it as a therapist or life coach, asking things like ‘what should I do’ in their relationships. But unlike human professionals, there’s no legal protection around that information.”
Altman expressed concern that this lack of privacy could pose risks for users. In legal proceedings, he warned, companies like OpenAI could be compelled to produce user chat data, exposing deeply personal information.
“I think that’s very screwed up,” he said. “We should have the same concept of privacy for your conversations with AI that we do with a therapist or a lawyer.”
The comments come amid growing scrutiny over how AI firms handle user data, especially as models like ChatGPT become more embedded in daily life. While OpenAI has taken steps to protect privacy — including offering more secure services for enterprise users — it is currently embroiled in a legal battle with The New York Times. A recent court order, which OpenAI is appealing, would require the company to preserve and potentially produce user conversations for legal discovery.
On its website, OpenAI described the order as “an overreach,” warning it could set a dangerous precedent allowing broader demands for user data in legal or law enforcement contexts.
Altman noted that current legal uncertainty is affecting public trust in AI platforms, particularly around sensitive topics. He acknowledged podcast host Theo Von’s concerns about privacy, saying, “I think it makes sense to really want the privacy clarity before you use [ChatGPT] a lot — like the legal clarity.”
The issue reflects broader societal debates over digital privacy amid a shifting legal landscape, including the U.S. Supreme Court’s decision overturning Roe v. Wade. That ruling prompted many users to migrate to encrypted apps and services for managing their health data.