Project for Privacy and Surveillance Accountability (PPSA)
  • Issues
  • Solutions
  • SCORECARD
    • Congressional Scorecard Rubric
  • News
  • About
  • TAKE ACTION
    • Section 702 Reform
    • PRESS Act
    • DONATE

 NEWS & UPDATES

AI, Facecrime, and the Growing Risk of Emotional Surveillance

1/27/2026

 
Are you having a good day? I certainly am! When I got to work this morning I could barely contain my excitement at seeing such a full inbox of wonderful things to do! I swear, at times it seems almost criminal to accept pay for doing work I love so much!

[Smile in the direction of the workplace surveillance camera.]

Anyway, I’d love to join you in the breakroom, but I really can’t wait to get back to my workstation! Toodles!

Artificial intelligence is getting better at reading human emotion. Commercial tools already use it to perform “sentiment analysis,” reading the emotional tone of written communications – a valuable capability for HR departments, advertisers, and customer-engagement consultants.
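To make “sentiment analysis” concrete: at its simplest, it scores text by counting emotionally loaded words. The toy lexicon-based scorer below is a deliberately simplified sketch (the word lists and function name are illustrative, not any vendor’s actual system), but it shows the basic idea behind tools that rate the “tone” of an email or chat message.

```python
# Toy lexicon-based sentiment scorer -- an illustrative sketch only,
# not any real product's method. Word lists are hypothetical.
POSITIVE = {"wonderful", "excitement", "love", "good", "happy"}
NEGATIVE = {"criminal", "guilty", "nervous", "hate", "bored"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]; positive means an upbeat tone."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    # No emotional words at all reads as neutral.
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("I love this wonderful job!"))  # 1.0
print(sentiment_score("I feel nervous and guilty."))  # -1.0
```

Real systems use machine-learned models rather than fixed word lists, but the output is the same kind of thing: a single number purporting to summarize how you feel – which is exactly why the technique invites the misreadings discussed below.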

The next bold step is already at the threshold: AI that can read emotions in our voices, the fleeting micro-expressions on our faces, and our body language. This technology will certainly expand into policing, hiring, and education. Are you acting guilty? Did you hide something in your job interview? Are you bored by the teacher’s lecture?

As biometric corridors become commonplace in U.S. airports, AI is being tested to read facial expressions and body language that could identify potential terrorists – based on the tidy theory that people who plan to blow themselves up at 35,000 feet tend to be nervous. But so are people who are running late for their connection, who just had an argument with a spouse, got fired, or are jet-lagged.

Emine Akar, in a blog post for the Institute for the Future of Work, enumerated the potential pitfalls of emotional surveillance: “Emotions are not simply reflexes. They are complex, contextual, and culturally shaped experiences. A tear can mean grief, joy, manipulation, or even boredom.”

The other risk is that AI, which improves by the day, will read our emotions all too well. Pervasive emotional surveillance may force us to put on a happy face at work, school, and the airport. To frown may be to risk questioning, detention, or delay. We could even risk committing “facecrime,” to name just one of the clever neologisms of George Orwell’s 1984.

That novel’s protagonist, Winston Smith, was well acquainted with facecrime. One had to always have an expression of love when watching Big Brother on the telescreen. One had to have an expression of rage when engaging in the mandatory two minutes of hate. Smith knew that the “smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself – anything that carried with it the suggestion of abnormality, of having something to hide.”

When we allow machines to read our emotions, we risk giving them power over us. “The danger here is not just that machines fail to understand us,” Akar wrote. “It’s that they may begin to discipline us – nudging our expressions, altering our behavior, shaping our emotional lives in invisible ways.”

This kind of emotional manipulation was well captured in the movie Her, in which a man falls in love with an AI (not hard to do when the voice belongs to Scarlett Johansson). Pope Leo XIV was not being prescient – he was simply being current – when he warned us over the weekend about getting involved with “overly affectionate” chatbots, lest they become “hidden architects of our emotional states.”
We need to be more concerned about the implications of emotionless minds that can read, exploit, and manipulate our emotions. The European Union’s AI Act, which restricts emotional surveillance in schools, workplaces, and other sensitive settings, offers one model. It is time for Congress, the states, and technology leaders to put proper guardrails on the emotional surveillance of Americans as well.


