
Sam Altman Warns – ChatGPT Conversations Lack Legal Privacy Protections


OpenAI CEO Sam Altman has raised alarms about the lack of legal privilege for conversations with ChatGPT, warning that personal and sensitive information shared with the AI could be used as evidence in court. Altman made the comments while speaking on a podcast with comedian Theo Von three days ago. Von opened with the question, "What does that look like when you think about AI, with so much new information coming online, right, and so much data being collected and information being carpooled, maybe, whichever is the term?" After some conversation around it, he asked, "What legal system does AI have to work by? Are there laws, like we have laws in the human world? Does AI have to work by any legal laws?"

Altman highlighted the risks of users treating ChatGPT as a confidant and called for urgent action to address this privacy gap. Here's what this means for AI users, particularly those sharing intimate details.

"I think we will certainly need a legal or a policy framework for AI. One example that we have been talking about a lot, maybe not quite what you are asking, this is like a very human-centric version of that question: people talk about the most personal shit in their lives to ChatGPT. Especially young people use it as a therapist, a life coach." He stated that many seek advice on relationship problems or other sensitive issues, asking, "What should I do?" He made clear in the interview that if a person talks to a lawyer, a doctor, or a therapist about the problems they might be facing, there is legal privilege for it, such as doctor-patient confidentiality and attorney-client confidentiality. "And we haven't figured that out yet for when you talk to ChatGPT," Altman explained.

This legal grey area poses serious risks. “If you go talk to ChatGPT about the most sensitive stuff and then there’s a lawsuit or whatever, we could be required to produce that,” he warned, adding, “If someone confides their most personal issues to ChatGPT, and that ends up in legal proceedings, we could be compelled to hand that over. And that’s a real problem.” The potential for private conversations to be exposed in court underscores a critical gap in AI privacy laws.

A Call for Change
Altman stressed the need for immediate action, comparing AI conversations to those with therapists. "I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever. No one had to think about this even a year ago, and now I think it's this huge issue of how we are going to treat the laws around this," he concluded.

His remarks align with growing concerns about AI ethics as users increasingly rely on tools like ChatGPT for emotional support. Online discussions reflect similar worries, with users debating whether AI companies should prioritize data privacy.

With ChatGPT boasting over 200 million weekly active users as of June 2025, its role as a conversational tool extends beyond casual queries to deeply personal interactions. The absence of legal protections could deter users from seeking AI support or expose them to unintended consequences in legal settings. Altman's call for privacy reforms highlights the need for updated laws as AI integrates further into daily life.

What do you think about this privacy gap? Share your thoughts in the comments.

You can watch the full podcast here.

