Project for Privacy and Surveillance Accountability (PPSA)

 NEWS & UPDATES

Watching the Watchers: On Its Own, AI Isn’t Watching, Or Thinking

9/2/2025

 
Image: Citizen website.
Joseph Cox of 404 Media reminds us of three things that we know to be true about the new era of generative artificial intelligence:

  1. AI isn’t a substitute for people.
  2. AI isn’t a substitute for people.
  3. AI isn’t… well, you get the picture.

As we’ve written before, AI works best when there’s a human in the loop. Take the case of Citizen.com, whose app is increasingly taking an AI-only approach to crime fighting. Because, really, what could possibly go wrong?

Plenty, as you can imagine. Without further ado, here’s 404 Media’s report on what happens when AI is left to its own devices, Citizen-style. It is prone to:

  • Mistranslating “motor vehicle accident” as “murder vehicle accident.”
  • Misinterpreting addresses.
  • Publishing incorrect locations.
  • Adding gory or sensitive details that violate Citizen’s guidelines.
  • Sending notifications about police officers spotting a stolen vehicle or homicide suspect, potentially putting operations at risk.
  • Writing alerts as if officers had already arrived on the scene, when in fact the dispatcher was only providing supplemental information while officers were en route.
  • Duplicating incidents, failing to recognize that two pieces of dispatch audio relate to the same event. This was especially common with police chases, where dispatch continually provided new addresses. The “AI would just go nuts and enter something at every address it would get and we would sometimes have 5-10 incidents clustered on the app that all pertain to the same thing,” one source said.
  • Omitting important details, such as whether a person was armed with a weapon.

The stakes are as strategic as they are tactical. One of Cox’s sources told him, “This could skew the perception of crime in a particular area,” as AI-created incidents proliferated.
 
By the way, the original name of Citizen – both the app and the company – was, perhaps tellingly, Vigilante. But that’s a story for another day.
