Project for Privacy and Surveillance Accountability (PPSA)

 NEWS & UPDATES

Is Your AI Therapist a Mole for the Surveillance State?

5/16/2025

 

“It’s Delusional Not to be Paranoid”

With few exceptions, conversations with mental health professionals are protected as privileged (and therefore private) communication.
 
Unless your therapist is a chatbot. In that case, conversations are no more sacrosanct than a web search or any other AI chat log; with a warrant, law enforcement can access them for specific investigations. And of course, agencies like the NSA don’t even feel compelled to bother with the warrant part.
 
And if you think you’re protected by encryption, think again, says Adi Robertson in The Verge. Chatting with friends over an encrypted app is one thing. Chatting with an AI on a major platform is another: the company sits at the other end of the conversation, and encryption won’t protect you from algorithms designed to alert it to sensitive topics.
 
In the current age of endless fascination with AI, Robertson asks, what would prevent a government agency from redefining what counts as “sensitive” on political grounds alone? Broach the wrong topics with your chatbot therapist and you might find your conversation leaked to social media for public shaming. Or the FBI at your door at 4 a.m. with a battering ram.
 
Chatbots are no more truly private than email. Recall the conventional wisdom of the 1990s that advised treating electronic communication like a postcard: if you wouldn’t write something on a postcard for fear of it being read, it shouldn’t go in an email, or, in this case, a chat. We would all do well to heed Robertson’s admonition that when it comes to privacy, we have an alarming level of learned helplessness:
 
“The private and personal nature of chatbots makes them a massive, emerging privacy threat … At a certain point, it’s delusional not to be paranoid.”
 
But there’s another key difference between AI therapists and carbon-based ones: AI therapists aren’t real. They are merely a way for profit-driven companies to learn more about us. Yes, Virginia, they’re in it for the money. To quote Mark Zuckerberg himself: “As the personalization loop kicks in and the AI starts to get to know you better and better, that will just be really compelling.” And anyone who thinks “compelling” isn’t code for “profitable” in that sentence should consider getting a therapist.
 
A real one.


