ChatGPT Isn’t Your Therapist: OpenAI Warns Your Conversations Aren’t Legally Private
OpenAI CEO Sam Altman warns that ChatGPT conversations lack the legal privacy of therapy or legal consultations: sensitive information shared with the chatbot could be disclosed in court, with no confidentiality protections.

OpenAI CEO Sam Altman has highlighted a critical privacy gap in how conversations with ChatGPT are treated under the law. Unlike discussions with therapists, doctors, or lawyers, chats with AI lack legal confidentiality protections. This means that sensitive information shared with ChatGPT could potentially be disclosed in legal proceedings.
ChatGPT Conversations Lack Legal Privilege
Many users treat ChatGPT as a trusted confidant—seeking advice on relationships, sharing emotional struggles, or looking for guidance during tough times. However, Altman cautions that these interactions are not protected by the same legal privileges that apply in professional settings like therapy or legal counsel.
In a podcast appearance with comedian Theo Von, Altman explained that the absence of doctor-patient or attorney-client confidentiality for AI chats means OpenAI could be compelled to provide user conversations in court cases. He described the situation as “very screwed up” and called for urgent legal reforms to address this privacy gap.
Need for Clear Privacy Regulations in AI
Altman argues that privacy standards for AI interactions should eventually match those of human professionals. The rapid rise of generative AI has outpaced existing laws, creating serious legal and ethical challenges.
The lack of clear privacy rules can deter users from fully benefiting from AI tools. Altman acknowledged that many, including Von, hesitate to use ChatGPT extensively without assurance about their data privacy. “It makes sense to want privacy clarity before you use it a lot,” Altman said.
Data Access and Retention Policies
- OpenAI retains conversations from free-tier users for up to 30 days for safety and system improvements.
- Chats may be stored longer for legal reasons.
- Conversations are not end-to-end encrypted like messaging apps such as WhatsApp or Signal.
- OpenAI staff may review user inputs to improve AI or detect misuse.
These policies mean that, by design, user chats are accessible to OpenAI and not fully private.
Legal Battles Highlight Privacy Concerns
OpenAI is currently involved in a lawsuit with The New York Times that has brought its data storage and disclosure practices under scrutiny. A court order reportedly requires OpenAI to retain and potentially produce user conversations, except those from ChatGPT Enterprise customers. OpenAI is appealing the order, calling it an overreach.
Broader Implications for AI and Data Rights
Altman noted that demands for user data from law enforcement and courts are increasing across the tech industry. He compared this to how people shifted to privacy-focused health apps after the U.S. Supreme Court overturned Roe v. Wade, which raised concerns about the digital trail left by personal health choices.
Until legal protections catch up, users should exercise caution when sharing sensitive or confidential information with AI chatbots like ChatGPT.
Legal professionals and organizations should monitor these developments closely as privacy standards for AI tools evolve. For those interested in AI and its legal implications, exploring courses on AI ethics, data privacy, and regulatory compliance can provide valuable insight. Check out resources at Complete AI Training to stay informed on AI-related legal standards and best practices.