Project for Privacy and Surveillance Accountability (PPSA)
  • Issues
  • Solutions
  • SCORECARD
    • Congressional Scorecard Rubric
  • News
  • About
  • TAKE ACTION
    • Section 702 Reform
    • PRESS Act
    • DONATE

 NEWS & UPDATES

Coming Soon: Cars that Decide If You Should Drive

3/30/2026

 

“Think Minority Report, But for Your Morning Commute”

“Zero crash fatalities” was how some advocates touted the vehicle safety mandates authorized by the infrastructure package that Joe Biden signed in 2021. As admirable as such goals sound, the mandates are an ill-conceived, undefined approach that, from a privacy standpoint, has more holes in it than a cocktail strainer.

Now, three years past its original deadline, the NHTSA is barreling ahead with a model-year 2027 implementation without having posted a draft rule. The possible design architecture is a nightmare – including AI-powered infrared cameras that actively monitor biometrics (e.g., pupil dilation) to determine whether a driver is “impaired.”

“Your car simply watches and decides whether you’re fit to drive,” Gadget Review contributor C. Da Costa writes – “Think Minority Report, but for your morning commute.”

Lauren Fix warns that, unlike drunk driving laws that already exist and work, the vagueness of these mandates takes them beyond traditional constitutional safeguards:

“No breath test is required. No police officer is involved. The judgment is made by software. Once flagged, the vehicle can refuse to start or restrict operation – and here is the critical issue: there are no federal rules defining how a driver gets out of that lockout. No required appeal process. No mandated reset timeline. No human review. Drivers are placed into what critics now call ‘kill switch jail,’ with no clear exit. This is not targeted enforcement. It applies to every driver, every time, regardless of driving history.”

“Advanced impaired driving prevention technology” (in the words of the original mandate) seems unlikely to work as advertised. Instead of saving perhaps 10,000 lives annually, it will merely make already too-expensive vehicles even more expensive as reluctant manufacturers pass these costs on to consumers.
From a privacy standpoint, it will create a massive public-private database of biometric data that will be the envy of government agents and hackers alike. In doing so, it will permanently end one of the few remaining bastions of American personal freedom, and one that is already under serious threat – the privacy we enjoy behind the wheel.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US DEFEND YOUR FOURTH AMENDMENT RIGHTS

How to Hide Your Heartbeat

3/24/2026

 
Researchers at Rice University have worked out how to camouflage your heartbeats from unwanted surveillance with “biometric decoys.” Wait, what? Excuse me, you ask, why might I soon want to camouflage my heartbeat?

Remote heart rate monitoring is just one of many threats to privacy emerging from the mushrooming field of biometric tracking. This common, everyday technology ranges from radar-based imaging used for facial authentication to wearables that monitor signals like heart rate variability, respiration, temperature, steps, calories ingested, and the quality of your sleep cycles. Biometric tracking is designed to make everyday life safer and easier, telling you how much of your last night was spent in deep, light, and REM sleep, or whether your heartbeat is showing signs of arrhythmia.

In today’s world, however, no good data feed goes unexploited. Off-the-shelf devices such as millimeter-wave radars can be used to eavesdrop on phone conversations and monitor daily movement patterns. They can also be used to monitor subtler signals like breathing and heart rate to gauge your stress, activity, or emotional state.

“Sensing technologies are becoming higher resolution and more pervasive, and concerns around what that means for privacy should be taken seriously,” said Edward Knightly, the senior researcher on the study. “It is important to explore potential vulnerabilities and think about how we might address them.”

Despite the benefits of biometric monitoring, as with almost all new technologies it comes with a privacy downside. Without policy or legal guardrails, employers might soon monitor your heart rate as soon as you log into your work computer. Or imagine how a negotiator might exploit the knowledge that the person on the other side of the table had a terrible night’s sleep.

The complete study was published in the journal Computer Communications via ScienceDirect. And none too soon, given that the market for biometric systems (and their highly desirable data) is expected to roughly double between now and 2030. So it is not too early to worry about such things – as technology can change in a heartbeat.


New York City Debates Biometric Ban in Businesses – “You Can Cancel a Credit Card But You Cannot Cancel Your Face”

3/23/2026

 
After the end of the pandemic, retail theft became rampant in New York City, as it did in San Francisco, Los Angeles, and elsewhere.

Retail theft has evolved into a multibillion-dollar industry for highly organized criminal gangs. Last year, Queens District Attorney Melinda Katz charged a theft ring with hitting Home Depot outlets up to four times a day, only taking breaks from larceny for team lunches. New York Gov. Kathy Hochul said that the state, after toughening laws and putting money behind enforcement, had driven down retail theft crimes in New York City and the state with double-digit reductions.

Yet retail theft continues to eat away at the profits of stores, from big chains to mom-and-pop shops. It is understandable that businesses would turn to biometric identifiers to spot serial offenders and block them before they can enter a store.

But there is a cost to such surveillance – one that we all pay.

“Many of us know the feeling of discovering our credit card information has been stolen,” said New York Councilmember Shahana Hanif. “It’s invasive and frightening, but you can cancel a credit card and get a new one. You cannot cancel your face. You cannot cancel your iris.”

Hanif is sponsoring legislation that would prohibit biometric identifying technology in “public accommodation” spaces such as concerts and grocery stores. (Hat tip to Liam Quigley of Gothamist.)

The city already requires stores to post notice to customers that they collect biometric data. Is this a simple case of caveat emptor? Or is the better question: should we give up our privacy just to buy groceries?

There is more at stake than just what store managers see. It is what happens to this biometric data after it is collected. Hanif’s legislation would stop businesses from selling, leasing, or trading biometric data for profit. It would also require written consent from customers who wish to share their data, including in stores where biometrics are accepted for payment.
At the very least, protecting our biometric data – and blocking its sale to other businesses, as well as preventing it from being sold or given to government agencies – would be a reasonable guardrail for New York City and other municipalities to adopt.


AI, Facecrime, and the Growing Risk of Emotional Surveillance

1/27/2026

 
Are you having a good day! I certainly am! When I got to work this morning I could barely contain my excitement at seeing such a full inbox of wonderful things to do! I swear, at times it seems almost criminal to accept pay for doing work I love so much!

[Smile in the direction of the workplace surveillance camera.]

Anyway, I’d love to join you in the breakroom, but I really can’t wait to get back to my workstation! Toodles!

Artificial intelligence is getting better at reading human emotion. Commercial technology already uses it to perform “sentiment analysis,” reading the emotional tone of written communications – a valuable tool for HR departments, advertisers, and customer-engagement consultants.

The next bold step is already at the threshold: AI that can read emotions in our voices, the fleeting micro-expressions on our faces, and our body language. This technology will certainly expand into policing, hiring, and education. Are you acting guilty? Did you hide something in your job interview? Are you bored by the teacher’s lecture?

As biometric corridors become commonplace in U.S. airports, AI is being tested to read facial expressions and body language that could identify potential terrorists – based on the tidy theory that people who plan to blow themselves up at 35,000 feet tend to be nervous. But so are people who are running late for a connection, just had an argument with a spouse, got fired, or are jet-lagged.

Emine Akar in a blog for the Institute for the Future of Work enumerated the potential pitfalls of emotional surveillance: “Emotions are not simply reflexes. They are complex, contextual, and culturally shaped experiences. A tear can mean grief, joy, manipulation, or even boredom.”

The other risk is that AI, which improves by the day, will read our emotions all too well. Pervasive emotional surveillance may force us to put on a happy face at work, school, and the airport. To frown may be to risk detention or delay. We could even risk committing “facecrime,” to name just one of the clever neologisms of George Orwell’s 1984.

That novel’s protagonist, Winston Smith, was well acquainted with facecrime. One had to always have an expression of love when watching Big Brother on the telescreen. One had to have an expression of rage when engaging in the mandatory two minutes of hate. Smith knew that the “smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself – anything that carried with it the suggestion of abnormality, of having something to hide.”

When we allow machines to read our emotions, we risk giving them power over us. “The danger here is not just that machines fail to understand us,” Akar wrote. “It’s that they may begin to discipline us – nudging our expressions, altering our behavior, shaping our emotional lives in invisible ways.”

This kind of emotional manipulation was well captured in the movie Her, in which a man falls in love with an AI (not hard to do when the voice belongs to Scarlett Johansson). Pope Leo XIV was not being prescient – he was simply being current – when he warned us over the weekend about getting involved with “overly affectionate” chatbots, lest they become “hidden architects of our emotional states.”
We need to be more concerned about the implications of emotionless minds that can read, exploit, and manipulate our emotions. The European Union’s AI Act is one example of how to restrict emotional surveillance at school, work, and other sensitive areas. It is time for Congress, states, and technology leaders to put proper guardrails on emotional surveillance of Americans as well.


Trial Lawyers Buy Personal Medical Files to Shop for Clients

1/26/2026

 
The next time you get a letter asking you to join a class-action lawsuit for something that is in fact relevant to you … it’s probably not a coincidence.

Epic Systems is the largest vendor of electronic health records (EHR) in the United States. A few years ago, its engineers noticed that some of its customers were behaving suspiciously. Their internal investigation revealed what they allege are “organized syndicates” that purchased records under false pretenses in order to use the data for non-treatment purposes – mostly to generate client leads for law firms.

It’s all laid out in a new federal lawsuit against Health Gorilla and its customers, filed by Epic and various healthcare partners, including UMass Memorial, and detailed by Daniel Gilbert in The Washington Post last week (the story is paywalled).

Among other things, Epic’s investigation revealed that as many as thirty law firms appeared to have accessed patient records. Though no firms are named in the litigation, Epic says they don’t need to be. The suit alleges that, as gatekeeper, Health Gorilla was knowingly “in league with its connections’ misuse of health information as a commodity.”

Epic also claims that Health Gorilla’s customers went to great lengths to disguise themselves as healthcare providers to hide their true intent. These tactics included adding junk data to patient charts to “give the false impression they are treating patients.” Fictitious websites, shell companies, and the use of sham National Provider Identification numbers are cited as additional evidence of malfeasance mentioned in the complaint.

The lawsuit suggests that the schemers operate like a Hydra: “When one fraudulent entity is exposed, the bad actors birth a new one.” If Epic asked one company about unusual patterns in its records requests, submissions would abruptly stop only to be restarted by another.

As Brittany Trang of STAT News notes, the current lawsuit “raises fresh questions about how to guarantee patient records are only shared with legitimate medical providers.” Industry expert Don Rucker agrees, calling it “a fight over who controls access to clinical data and how those data are governed once they move outside the provider's EHR.”

Rucker and others point out that the HIPAA Privacy Rule – like most federal statutes on the matter – poorly defines “purpose of use,” leaving room for broad secondary categories that include, among other things, marketing.

The legitimate use of anonymized patient data is beyond dispute, especially when combined with responsible AI practices. Meta-analyses, for example, can lead to scientific breakthroughs including lifesaving treatments and cures. Anonymized data can improve quality standards and innovations in both practice and research methods.
In order for that to happen, HIPAA needs to be updated to protect privacy. A good first step would be for Congress to put guardrails on data brokers’ selling of Americans’ personal digital data.


Watching the Watchers: “Un-Personing People,” or How To Control a Population in Three Easy Steps

1/20/2026

 
The ACLU’s Jay Stanley just published a critique of the increasing push by states to adopt digital ID systems. It’s his fifth admonition in as many months, and the message is more urgent than ever: the digital ID bandwagon is becoming a rush job that threatens to discard privacy guardrails.

Of the many possible pitfalls, the greatest may be the ability of authorities to “un-person” someone. In the parlance of Orwell and his novel 1984, an “unperson” simply vanishes as every last record of that person’s existence is expunged.

Stanley’s Orwellian scenario hinges on what happens when authorities revoke an ID that exists only in digital form. In his new essay, “How to Give the Government New Power to ‘Un-Person’ Someone, in Three Easy Steps,” Stanley unmasks the underlying features of digital IDs that can be revoked at will:

  • It’s about control: “The big push for state digital driver's licenses that we’ve been warning about is effectively a movement to increase the power of big companies and government to control individuals.”

  • It’s about power: “The power to revoke people’s IDs, cutting off in a single stroke their ability to access their accounts, visit much of the Internet, access government services, start a new job, obtain healthcare, and who knows what else. In short, kneecap their ability to function effectively in society.”

  • It’s about dependency: “Make it frictionless to present an ID, which will make it easy for every business to demand your ID.”

  • It’s a universal ID: “Build a new digital identity system, such as digital driver’s licenses, that comes to serve as the proof of identity (and age, and residency, and other characteristics) for the vast majority of the population in the vast majority of use cases.”

  • It’s an at-will free-for-all: “People could be un-personed because of simple errors, because they have unfairly been accused of wrongdoing, or out of abusive targeting for political reasons.”

  • It’s too easy: “With digital licenses, state governments can create a system that allows them to … reach into your digital wallet in your phone and remotely deactivate your ID.”

  • It avoids protections for individuals: “Don’t create protections for individuals (such as those that some states have erected around the scanning of barcodes on physical licenses).”

  • It puts plastic and paper on the road to extinction: “Before long, physical IDs may be treated as a second class after-thought as digital IDs become de facto mandatory.”

Stanley recommends that lawmakers impose statutory limits on the revocation of state-issued IDs, along with strong due-process protections. He also recommends adding technical guardrails against abusive revocation.
Stanley’s original piece goes into much more detail. We also recommend Government Technology reporter Nikki Davidson’s recent interview with Stanley – it is more than worth ten minutes of your time.


A Class-Action Lawsuit Against San Francisco Details How “Vehicle Fingerprints” Are Used in the Mass Surveillance of Drivers

1/5/2026

 
Michael Moore is a retired public-school teacher living in San Francisco. Nearly every day, as he drives to the store, to his sons’ schools, or to meet friends and family, his movements are watched and recorded at every turn. But he is not being tailed by a private detective or by the police.

Moore, like every other driver in San Francisco, is being tracked because he must navigate through the city’s network of almost 500 automated license plate readers (ALPRs).

These devices, operated by the San Francisco Police Department (SFPD), constitute a major link in the national surveillance network that the vendor Flock Safety is providing to state and local law enforcement. Moore has had enough. At the end of December, he filed a class-action lawsuit in federal court, on his own behalf and on behalf of his fellow San Franciscans, against the city and its police department over this continuous violation of their Fourth Amendment rights.

In his suit, Moore states that Flock ALPRs “make it functionally impossible to drive anywhere in the City without having one’s movement tracked, photographed, and stored in an AI-assisted database that enables the warrantless surveillance of one’s movements.”

Here are some of the topline revelations from Moore’s lawsuit:

Suspicionless surveillance: Of the over 1 billion license plate scans collected by 82 agencies nationwide in 2019, “99.9 percent of this surveillance data was not actively related to any criminal investigation when it was collected.”

Creates “vehicle fingerprints”: “When Flock Cameras capture an image of a car, Flock’s software uses machine learning to create what Flock calls a ‘Vehicle Fingerprint.’ The ‘fingerprint’ includes the color and make and model of the car and any distinctive features, like an anti-Trump bumper sticker or roof rack. Flock’s software converts each of those details into text and stores them into an organized database.”

Tracks social networks: “Flock provides advanced search and artificial intelligence functions that SFPD officers can use to output a list of locations a car has been captured, create lists of cars that have visited specific locations, and even track cars that are seen together.”

Data stored indefinitely: “The data that Flock Cameras collect belong to the SFPD but Flock retains data on a rolling 30-day basis. Nothing, however, prevents the SFPD or its officers from downloading and saving the data for longer than SFPD’s 365-day retention period.”

Flock doesn’t just see and record – it thinks and analyzes:

“ALPR technology is a powerful surveillance tool that is used to invade the privacy of individuals and violate the rights of entire communities. ALPR systems collect and store location [data] about drivers whose vehicles pass through ALPR cameras’ fields of view, which, along with the date and time of capture, can be organized by a database that develops a driver profile revealing sensitive details about where individuals work, live, associate, worship, protest and travel.”

Moore’s lawsuit poses a profound constitutional question: Can a city turn every resident into a perpetual suspect simply for driving on public roads?

The Fourth Amendment was written to forbid dragnet surveillance untethered to suspicion, warrants, or individualized cause. Yet San Francisco has quietly constructed a system that records nearly every movement of its citizens, not because they are suspected of wrongdoing, but because technology makes it easy. If this practice is allowed to stand, the right to move freely without government monitoring may become a relic – honored in theory, but surrendered in practice to cameras, algorithms, and convenience.


Utah’s Respect for Personal Identity Puts ICE Arrogance to Shame

12/1/2025

 
When your identity is confirmed by a string of numbers in a computer, are you still yourself if the algorithm determines you (the person) are not you (the digital ID)?

One state, Utah, is leading the nation in answering this question with policies that safeguard humans, while Washington, D.C., is heading down the path of reducing humans to algorithms.

Consider the ACLU’s Jay Stanley, who praised Utah for its “State-Endorsed Digital Identity” (SEDI), the state’s new framework for digital ID systems. In an approach that should be the norm rather than the notable exception, the Beehive State puts privacy first.

Utah begins with the conviction that identity “is not something bestowed by the state, but that inherently belongs to the individual; the state merely ‘endorses’ a person’s ID.” In other words, our identities belong to us. We are born with them. We own them. With that realization comes new-found respect for privacy and other forms of personal freedom. 

This view of identity stands in sharp contrast to the definition Stanley found in the data-driven world of federal law enforcement. With the feds, identity is becoming something only the state can grant, defaulting to incomplete or faulty digital verification of citizenship.

To be clear, both Utah’s SEDI platform and the federal approach utilize digital ID systems, but one is a case study in digital due diligence while the other illustrates the dangers of slapdash digital recklessness. The federal system is based on incomplete databases, poorly designed architecture, evolving (meaning, far from perfect) technology, and an utter disregard for the constitutional rights of individuals.

Utah’s approach differs from the federal approach in very important ways:
  1. Being “user-centric” to ensure that government identification systems are used to empower individuals, not control them.
  2. Being free from surveillance, visibility, tracking, or monitoring by any entity – including private companies and unauthorized government agencies and staff – other than the party that is solely authorized to check the ID.  
  3. Making factors like security, completeness, and accuracy a top priority, in contrast to the unreliability of the facial recognition technology that underlies many of today’s digital verification systems.
  4. Enforcing a user’s “right to paper” (or plastic), including continued and unfettered access to essential government services, even when using only non-digital, physical ID methods.  
  5. Adhering strictly to constitutional rights, particularly Fourth Amendment protections against warrantless searches and dragnet-style fishing expeditions conducted without probable cause.

Stanley goes on to quote the Ranking Member of the House Homeland Security Committee, who reports that an app (called Mobile Fortify) used by Immigration and Customs Enforcement (ICE) now constitutes “definitive” determination of a person’s status “and that an ICE officer may ignore evidence of American citizenship – including a birth certificate.”

That’s bad enough on its own, of course, but along the way, the government now sweeps up Americans’ biometric identifiers en masse. The databases Mobile Fortify accesses contain not only our photographs but enough records to constitute a permanent digital dossier.

Congress did not get to review, much less approve, any of this. The American people never voted on it. In fact, the whole thing leaves us wondering what happened to the Privacy Act, signed into law by President Ford in 1974. It has been described as “the American Bill of Rights on data.”  

By declaring that identity is solely digital, determined by stealthy algorithms and policies, and deniable to those whose data is non-existent, incomplete or inaccurate, the federal standard – in sharp contrast to Utah’s – subverts 250 years of traditional, constitutional practice. Remember: Our founders built the world’s most vibrant democracy on pieces of parchment copied by hand.

In any truly free society, identities are personal possessions that help secure individual rights and facilitate voluntary participation in society. Identities bestowed by the state ultimately serve only the state.

That we even need to ponder the nature of identity reveals the absurdity of these abuses of our personhood and privacy. Nevertheless, here we are. Without transparent conversations and healthy debate, we face a future in which we are whomever the state says we are, made of malleable 0s and 1s, with nothing grounded in the physical world.
It's a discussion that, as of now, Utah alone seems committed to having.


Watching the Watchers: If You Are Stopped by ICE, Your Biometric Data Will Be Held for a Generation

11/18/2025

 
Robert Frommer, a senior attorney with the Institute for Justice, tells the harrowing story of George Retes, a U.S. citizen and Army veteran of the Iraq War, who was stopped in his car during an immigration sweep.

He was on his way to work when he encountered an Immigration and Customs Enforcement (ICE) roadblock. A melee broke out between protesters and ICE agents. Retes’s car was engulfed in tear gas.

The Institute for Justice reports that agents smashed Retes’s car window, dragged him out, and forced him to the ground with knees on his neck and back – even though he was not resisting.

Despite Retes presenting proof of his citizenship, ICE agents detained him for three days without charges, strip-searched him, and forced him to provide DNA samples. He was not allowed to call a lawyer or given a hearing before a judge. Because Retes was held incommunicado, his family was left to frantically search for him.

Writing in MSN, Frommer explores what happens to the biometric data ICE collected on Retes.

“In addition to our DNA, the Department of Homeland Security (DHS) has recently and quietly authorized ICE officers to forcibly collect and retain intimate identifiers: our fingerprints and digital images of our faces. Combined with other technologies, the department is creating a general warrant for our persons, the kind of abuse that ignited the American Revolution.

“A DHS document, meant to ensure our privacy, lays out the facts. An app called Mobile Fortify allows ICE and Customs and Border Protection (CBP) officers to photograph and scan anyone they ‘encounter’ in the field, regardless of citizenship or immigration status. If there isn’t a photo match, officers can collect people’s fingerprints, which are then checked against DHS biometric records. Once DHS has that sensitive data, the app feeds it into CBP’s Automated Targeting System – an enormous watch list that merges border records, passport photos and prior ‘encounter’ images. CBP retains every nonmatch photograph for 15 years, meaning that even if you’re an American citizen mistakenly stopped on the street, the government has your biometric records for (almost) a generation.”

Congress should investigate and debate this retention of Americans’ biometric records before reauthorizing a single surveillance authority. And PPSA is hopeful that ICE will be forced to explain its unconstitutional detention of George Retes when it faces his lawsuit under the Federal Tort Claims Act.


© COPYRIGHT 2026. ALL RIGHTS RESERVED. | PRIVACY STATEMENT