OpenAI CEO Sam Altman Warns: ChatGPT Is Not a Therapist, User Conversations Not Legally Private


July 28, 2025

OpenAI CEO Sam Altman has issued a cautionary statement regarding the use of ChatGPT as a therapeutic tool, warning users that conversations with the AI chatbot do not carry legal confidentiality and may be subject to disclosure in legal proceedings.

Speaking during an appearance on the This Past Weekend podcast hosted by Theo Von, Altman addressed growing concerns over users sharing sensitive personal information with ChatGPT. He emphasized that while many people have begun using the tool for therapy-like support, such interactions are not protected by any legal privilege comparable to the confidentiality that governs doctor-patient or attorney-client relationships.

“People are treating ChatGPT like a therapist. But legally, it’s nothing like that,” Altman said. “If there’s a lawsuit, those chats can be used as evidence.”

The warning comes amid an ongoing legal battle between OpenAI and The New York Times, in which the media organization is seeking the preservation of user conversations as part of a copyright infringement lawsuit. OpenAI is currently contesting a court order requiring it to retain all user chats, including those that users have deleted, citing concerns over user privacy and data protection.

Altman described the court’s demand as an “overreach” and voiced strong support for developing privacy protections for AI communications. He argued that the legal system has not yet caught up with the rapid adoption of generative AI and lacks frameworks to safeguard private conversations between users and AI systems.

According to OpenAI’s current policy, chats that users delete from free, Plus, and Pro accounts are permanently removed within 30 days—unless they must be retained for legal, security, or abuse-monitoring purposes. However, Altman warned that this deletion policy could be overridden by a legal mandate, especially if courts begin treating AI chats as discoverable evidence in lawsuits.

“It’s crazy that a conversation you thought was private could suddenly become part of a legal case,” Altman added. “We need new rules—fast.”

Altman urged regulators and lawmakers to create clear legal standards that protect AI-based interactions, noting the potential mental health impact on users who believe they are engaging with a confidential system.

His remarks underscore a growing tension between the convenience and emotional comfort users derive from generative AI tools and the current lack of legal protections surrounding such technology.

As ChatGPT and similar platforms continue to evolve into personal aides and emotional support tools, the issue of AI privacy rights has gained new urgency. Until formal legal confidentiality for AI interactions is established, users are being advised to exercise caution when disclosing personal or sensitive information in chatbot conversations.

