
Sam Altman: ChatGPT Isn’t Confidential for Therapy Use

Summary

– OpenAI CEO Sam Altman warns users against relying on ChatGPT for therapy or emotional support due to the lack of legal confidentiality protections for sensitive conversations.
– Altman highlights that current AI lacks legal frameworks like doctor-patient confidentiality, risking user privacy in legal scenarios where conversations could be subpoenaed.
– OpenAI is fighting a court order that would force it to preserve millions of ChatGPT user chats, arguing the demand is an overreach that could set a harmful precedent for data privacy.
– The absence of privacy safeguards may hinder broader adoption of AI tools, as users hesitate to share personal information without legal protections.
– Altman emphasizes the need for clear privacy standards for AI interactions, similar to those in therapy or legal consultations, to ensure user trust.

Thinking of using ChatGPT as your therapist? OpenAI CEO Sam Altman warns that these conversations lack the confidentiality protections you’d expect from human professionals. Unlike discussions with doctors or lawyers, AI interactions currently have no legal privilege shielding them from disclosure, a gap that raises serious privacy concerns for users sharing personal struggles.

During a recent podcast appearance, Altman highlighted how people, especially younger generations, increasingly turn to ChatGPT for emotional support. “Users discuss deeply private matters: relationship advice, mental health struggles, life coaching. Yet there’s no equivalent to doctor-patient confidentiality,” he explained. Without established legal frameworks, companies like OpenAI could be compelled to hand over chat logs in lawsuits or investigations, exposing intimate details users assumed were private.

The issue isn’t theoretical. OpenAI is already contesting a court order that would require it to preserve millions of ChatGPT conversations as part of its legal battle with The New York Times. The company argues the demand is an overreach, but the case underscores broader tensions between AI innovation and user rights. As Altman put it, “We should extend the same privacy expectations to AI conversations as we do to therapy sessions, but nobody even considered this a year ago.”

Privacy risks aren’t limited to legal disputes. Tech firms routinely face subpoenas for user data in criminal cases, and shifting laws, like those affecting reproductive rights, have heightened scrutiny over digital footprints. After Roe v. Wade was overturned, for instance, many migrated to encrypted health apps to safeguard sensitive information. Altman acknowledged similar hesitancy among ChatGPT users, noting, “It’s reasonable to wait for clearer privacy guarantees before relying heavily on AI for personal matters.”

For now, the absence of safeguards means venturing into AI therapy comes with unseen risks. While tools like ChatGPT offer convenience, their inability to guarantee confidentiality could deter adoption until stronger protections emerge. The industry’s next challenge? Balancing cutting-edge capabilities with the trust users place in them.

(Source: TechCrunch)

