Sam Altman Highlights Privacy Gaps in Using ChatGPT for Therapy

Why ChatGPT Isn’t Legally Confidential—Sam Altman’s Warning for Users Seeking Therapy Online
As artificial intelligence becomes increasingly integrated into our daily lives, more users are turning to AI platforms like ChatGPT for therapy and emotional support. However, a recent statement from OpenAI CEO Sam Altman sheds light on critical privacy concerns that anyone considering this use case should understand.
No Doctor-Patient Confidentiality with AI
On a recent episode of Theo Von’s podcast, Altman discussed how the current legal landscape fails to protect conversations with AI in the same way it does for human therapists, lawyers, or doctors. He explained that while people often share personal details and seek advice from ChatGPT, there is no legal privilege or confidentiality protecting those conversations.
- Traditional therapy sessions are protected by doctor-patient confidentiality laws.
- Legal consultations have their own confidentiality protections.
- Currently, AI chats do not enjoy these same legal safeguards.
Potential Risks of Using AI for Sensitive Topics
The absence of confidentiality means that OpenAI could be legally compelled, whether by court order or in the course of a lawsuit, to produce users' conversation data. Altman described this situation as concerning, emphasizing the need for AI privacy norms that match those applied to human professionals:
- Your conversations with ChatGPT could be accessed in legal proceedings.
- There is no established policy or legal framework to prevent this yet.
Growing Legal and Privacy Tensions
OpenAI has already faced demands to produce chat data in ongoing litigation. For instance, the company is appealing a court order in its lawsuit with The New York Times that would require it to preserve user chats globally (excluding those of enterprise customers). OpenAI describes such orders as "overreach" and warns they could set a precedent for further demands, whether from law enforcement or in legal discovery.
Broader Implications for Digital Privacy
This issue is not limited to AI chatbots. After significant legal changes, such as the overturning of Roe v. Wade in the US, many users switched to more privacy-focused health apps to protect sensitive data. The trend reflects broader concerns about how digital data—including AI chat logs—can be subpoenaed or otherwise accessed by third parties.
What Should Users Do?
If you’re considering using ChatGPT for therapy or personal advice, keep the following in mind:
- There is currently no legal confidentiality protecting your conversations.
- Exercise caution when sharing sensitive or private information with AI tools.
- Stay updated on privacy policies and legal developments related to AI.