Project for Privacy and Surveillance Accountability (PPSA)
  • Issues
  • Solutions
  • SCORECARD
    • Congressional Scorecard Rubric
  • News
  • About
  • TAKE ACTION
    • Section 702 Reform
    • PRESS Act
    • DONATE

NEWS & UPDATES

PPSA to Supreme Court: Geofence Warrants Threaten Religious Liberty

3/13/2026

 
The Project for Privacy & Surveillance Accountability has filed an amicus brief in the U.S. Supreme Court case United States v. Chatrie, warning that geofence warrants threaten not only Americans’ Fourth Amendment rights, but also our religious liberty and freedom of association.

PPSA previously urged the Court to hear this case and rein in geofence warrants as modern digital general warrants. These warrants compel technology companies to turn over location data for every device within a defined geographic area. Investigators then sift through the movements of potentially hundreds – sometimes thousands – of people in hopes of identifying a suspect.
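
Mechanically, a geofence request reduces to a radius-and-time query over a provider’s location-history database. The sketch below is a simplified illustration with invented device IDs, field names, and coordinates – not any provider’s actual system – showing how a single query sweeps in every device that pinged inside the boundary:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def geofence_hits(pings, center, radius_m, t_start, t_end):
    """Return every (device, timestamp) that pinged inside the fence during the window."""
    lat0, lon0 = center
    return [
        (p["device"], p["t"])
        for p in pings
        if t_start <= p["t"] <= t_end
        and haversine_m(p["lat"], p["lon"], lat0, lon0) <= radius_m
    ]

# Hypothetical pings: two bystanders inside the fence, one device miles away.
pings = [
    {"device": "A", "lat": 37.5050, "lon": -77.6010, "t": 100},
    {"device": "B", "lat": 37.5052, "lon": -77.6008, "t": 120},
    {"device": "C", "lat": 37.6000, "lon": -77.4000, "t": 110},
]
hits = geofence_hits(pings, center=(37.5051, -77.6009), radius_m=150, t_start=0, t_end=200)
print(hits)  # devices A and B are swept in; C is not
```

The point of the sketch is that the query is indifferent to suspicion: anyone inside the circle during the window is returned.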

Now that the Court has granted review, PPSA explains in its amicus brief that this dragnet surveillance exposes something far more sensitive than physical location. Location data can reveal belief, identity, and association.

“Geofence warrants also threaten core First Amendment freedoms by enabling surreptitious mass intrusions into sensitive spaces like places of worship,” the PPSA brief explains. 

A geofence warrant could easily capture the identities of everyone attending a church service, synagogue gathering, mosque prayer, or religious conference. In practice, that means the government could obtain what amounts to a list of worshippers.

The facts of the case illustrate the danger. The geofence search used by investigators in Chatrie encompassed Journey Christian Church in Midlothian, Virginia, capturing the location data of anyone present at the church at that time who carried a smartphone with Google location services enabled. 

That possibility raises profound First Amendment concerns. Location data can expose deeply personal religious information, including “faith affiliation; sacrament participation; belief shifts via changing attendance or visiting a new church; or involvement in recovery ministries.” 

The Supreme Court has long recognized that government surveillance of association can chill constitutional rights. Americans who believe their religious participation may be quietly recorded by the government may think twice before attending services or participating in religious life.

That chilling effect is precisely what the First Amendment was designed to prevent.

PPSA’s brief urges the Court to recognize that geofence warrants do more than raise Fourth Amendment questions about search and seizure. They also threaten the First Amendment freedoms that protect Americans’ ability to worship, gather, and associate without government monitoring.

After all, in the digital age, tracking where people go can reveal who they are, what they believe, and whom they stand beside.
​
The Supreme Court now has the opportunity to make clear that the Constitution protects those freedoms from the reach of dragnet surveillance.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US DEFEND YOUR FOURTH AMENDMENT RIGHTS

New Threat: Using AI to Hack AI

3/12/2026

 
In the Terminator movies, the grand finale is often a robot-on-robot fight to the death. That is happening in real life as well – except it is not always the good robot that wins.

Artificial intelligence is the most powerful digital tool ever created. Now a disturbing breakthrough in criminal enterprise has emerged: using one AI system to hack another. At stake is the security of nearly everything – personal identities, bank accounts, and perhaps soon every commercial and government activity secured by blockchain, not to mention trillions of dollars of value stored in cryptocurrency.

Nilesh Christopher of The Los Angeles Times reports that Gambit, an Israeli cybersecurity firm, revealed last month that hackers used Anthropic’s Claude AI system to steal 150 gigabytes of data from Mexican government computers. The heist exposed the personal information associated with roughly 195 million identities (some duplicates) drawn from nine Mexican agencies – including tax records, vehicle registrations, birth certificates, and property ownership data.

Claude is designed to resist exactly this kind of abuse. Anthropic, like other AI companies, maintains teams dedicated to stress-testing their chatbots and probing them for weaknesses. But AI can do almost anything faster and better – including hacking. Gambit found that the attackers were able to “jailbreak” Claude with the help of another AI: OpenAI’s ChatGPT. The second system reportedly analyzed Claude and helped reveal the credentials needed to weaponize it.

This development threatens the foundations of emerging AI-driven and blockchain-based systems. Curtis Simpson told Christopher that because AI “doesn’t sleep … it collapses the cost of sophistication to near zero.”

In other words, cybercrime no longer requires a digital army of hackers hunched over laptops in Shanghai or Tirana, fueled by endless supplies of Club-Mate and Cheetos. With the right prompts, AI can attack a problem relentlessly – probing, testing, and refining its methods until it succeeds.

And the target surface is growing. With the consolidation of Americans’ personal data from dozens of federal agencies under the Trump administration, AI-enabled hackers may soon be able to dip into one enormous resource instead of many smaller ones. As blockchain systems spread across finance and government, expect AI tools to become not just powerful allies – but dangerous adversaries to one another.

This development suggests a growing need for startups with deeper expertise in the cyberdefense of AI. It also suggests that for all the contributions of the Ph.D. philosopher hired by Anthropic to instill a sense of ethics in Claude, gaps still remain.   
​
Companies might want to look to the world of science fiction and devise commandments as strict as Isaac Asimov’s “Three Laws of Robotics,” designed to prevent robots from harming humans. Only in this case, such rules would prevent AI from harming other AI systems – and the rest of us in the process.


PPSA Commends Rep. Andy Biggs for the Reintroduction of the Protect Liberty and End Warrantless Surveillance Act

3/10/2026

 
Rep. Andy Biggs, PHOTO CREDIT: Gage Skidmore
“National security and civil liberties are not mutually exclusive,” said Rep. Andy Biggs (R-AZ). “We can give our intelligence professionals the tools they need to target foreign threats while ensuring that Americans are not subjected to unconstitutional surveillance.”
 
Rep. Biggs last week underscored that philosophy by reintroducing the Protect Liberty and End Warrantless Surveillance Act. His bill would bring powerful reforms to Section 702, which authorizes federal intelligence agencies to spy on foreign targets on foreign soil but has often been used by the FBI to spy on Americans. This authority must be reauthorized by April 20 or expire.
 
Among its many provisions, the Protect Liberty Act would:
 
  • Require a warrant before information collected under Section 702 could be used to inspect the communications of people inside the United States.
 
  • Bar the government from purchasing or obtaining Americans’ personal digital data – a practice currently carried out by more than a dozen federal agencies, from the FBI to the IRS.
 
  • Sunset the “Make Everyone a Spy” provision slipped into the last reauthorization that requires virtually every business or house of worship to secretly facilitate spying on its customers and congregants.
 
  • Require the secret FISA courts to appoint legal experts (amici curiae) with security clearances to represent the civil liberties of the American people.
 
  • Reauthorize Section 702 for two years from this April, ensuring that the next reauthorization does not arise during a budget or election season, when Congress will be too busy to take a careful look at how this authority has worked and how it might need to be adjusted.
 
Despite talk on the Hill of a “clean” reauthorization of Section 702, Rep. Biggs’ bill should get the attention of civil liberties champions across the ideological spectrum, from the House Freedom Caucus to Demand Progress.
 
Polls show that vast majorities of Americans in both parties are deeply concerned about government agencies that treat privacy as a luxury and the Fourth Amendment as a nuisance.
 
“The Protect Liberty Act is the most important government surveillance reform measure in several generations – protecting Americans’ constitutional rights while leaving in place important authorities to keep the American people safe from foreign threats,” said Bob Goodlatte, former Chairman of the House Judiciary Committee and Senior Policy Advisor to PPSA.
​
“FISA Section 702 was enacted by Congress to enable the surveillance of foreign threats on foreign soil, but has been used in recent years by the FBI for domestic spying,” Goodlatte said. “It has been abused to spy on millions of Americans, including judges, sitting Members of Congress, 19,000 donors to a congressional campaign, and countless others.
 
“PPSA commends Subcommittee Chairman Andy Biggs for bringing this reform into the debate over the reauthorization of Section 702,” Goodlatte said. “We are hopeful that Republicans and Democrats on the House Judiciary Committee will once again pass it and that President Trump will sign it into law.”


How Hackers Can Use Tire Sensors to Track Your Driving Habits

3/9/2026

 
The Internet of Things (IoT) strikes again. Most modern vehicles have a tire pressure monitoring system (TPMS), a legal requirement since 2007. A recent study shows that it is possible to capture the unencrypted wireless messages sent by TPMS sensors. Each sensor sends a unique ID number, which makes tracking specific vehicles child’s play for a hacker.

Think about this for a moment – the average car or truck is broadcasting four such unique IDs (one per tire), with no need for license plate readers with high-tech cameras and AI software. That, says the IMDEA Networks Institute, “makes TPMS-based tracking cheaper, harder to detect, and more difficult to avoid than camera-based surveillance, and therefore a stronger privacy threat.”

A motivated hacker need only place a series of low-cost receivers near the appropriate parking lots and roads. Within weeks:

“These tire sensor signals can be used to follow vehicles and learn their movement patterns. This means a network of inexpensive wireless receivers could quietly monitor the patterns of cars in real-world environments. Such information could reveal daily routines, such as work arrival times or travel habits.”
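
The linking step the researchers describe requires almost no sophistication: because each sensor broadcasts a stable unique ID, an eavesdropper only needs to group sightings by that ID and sort them in time. A minimal sketch with invented sensor IDs and receiver names (the radio capture itself is out of scope):

```python
from collections import defaultdict

def reconstruct_routes(sightings):
    """Group TPMS sightings by sensor ID and order each group in time.

    Each sighting is a (sensor_id, receiver_location, timestamp) tuple."""
    routes = defaultdict(list)
    for sensor_id, receiver, t in sightings:
        routes[sensor_id].append((t, receiver))
    # Sort each sensor's sightings by timestamp, keep only the locations.
    return {sid: [r for _, r in sorted(obs)] for sid, obs in routes.items()}

# Hypothetical sightings collected by three cheap receivers around town.
sightings = [
    ("0xA1B2C3", "office_lot", 540),
    ("0x77FF01", "school_gate", 470),
    ("0xA1B2C3", "highway_on_ramp", 520),
    ("0xA1B2C3", "home_street", 480),
]
routes = reconstruct_routes(sightings)
print(routes["0xA1B2C3"])  # ['home_street', 'highway_on_ramp', 'office_lot']
```

A few lines of code and a handful of receivers are enough to turn broadcast tire IDs into daily routines.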

It gets worse: TPMS signals can even be captured from moving vehicles. Some sensors reveal actual tire pressure values (as opposed to merely “Low”), which could, for example, be used to determine if a vehicle is carrying a heavy payload or to distinguish vehicles by type. Pretty soon we’re in Mission: Impossible territory.

As is so often the case with the IoT, safety was the motivation behind the development of tire pressure monitoring systems in the first place. Because privacy was never a consideration, privacy-by-design protections were missing from the start. The result is a familiar IoT pattern: unencrypted signals and wide-open vulnerabilities becoming the rule rather than the exception. When it comes to privacy issues, safety never seems to stay in its lane.

“Our findings show the need for manufacturers and regulators to improve protection in future vehicle sensor systems,” notes researcher Yago Lizarribar. If nothing changes, yet another safety tool will be perverted into an instrument of general population surveillance.

But change does not seem to be an industry priority. As Aaron Pruner of CNET points out, we’ve had sixteen years to address this vulnerability. A study by Rutgers University and the University of South Carolina identified the problem in 2010, a mere three years after TPMS was mandated.
​
Which means that if TPMS sensors were kids, they’d be old enough by now to start driving – and be tracked every mile of the way.


What the Anthropic/OpenAI Story Is Really About

3/8/2026

 
The media reported on the drama of the Pentagon’s AI contracts as a horse race: Anthropic tried to limit what the War Department could do with the company’s Claude AI product. The administration subsequently rescinded all government contracts with the company. OpenAI offered its products as the alternative and won the day.

But beneath this drama lies a deeper and more dangerous reality: In the absence of meaningful guardrails, the AI tech of any company can be used for surveillance and – if combined with data collected under Section 702 of the Foreign Intelligence Surveillance Act (FISA) – could allow government employees across the federal bureaucracy to run searches on Americans’ private communications.

Such AI-powered surveillance could extend far beyond the Department of War’s use cases and even the Justice Department’s FBI investigations. Government AI-enabled mass surveillance of the domestic population would:

  • Not be subject to any oversight authority – constitutional or statutory
 
  • Not be encumbered by recent reforms like the 2024 Reforming Intelligence and Securing America Act (RISAA)
 
  • Be supercharged by the dismantling of long-standing information silos and the removal of safeguards that once limited the sharing of Americans’ private data between agencies – from the Department of Homeland Security to the IRS
 
  • Be conducted without a warrant – without any court supervision of the government’s invasion of your privacy.

The danger of AI surveillance in a government that shares data between agencies should prompt Congress to strengthen Fourth Amendment privacy protections. With such a vast datascape available to the world's most powerful government – where many existing restrictions have already been weakened – we otherwise risk the irrevocable loss of personal privacy and the rise of a permanent surveillance state.

We need to come to terms with the fact that AI tech makes rummaging through our private lives and personal histories easier and faster than anyone could have imagined even a few years ago. Americans’ communications could become permanently accessible to the prying eyes of government agents in almost any agency with a whim (or a political directive) to pursue.

It wasn’t supposed to be this way. AI was supposed to have guardrails. So was Section 702, which Congress enacted to enable the surveillance of foreign threats on foreign soil but which has instead been used by the government to search the private communications of Americans without a warrant.

RISAA was a noble attempt to rein in the misuse of Section 702 as a domestic spy tool. Its reforms included oversight and restrictions on FBI searches involving people inside the United States. It implemented rules for queries involving high-profile groups or individuals. It established training and accountability measures, while enhancing oversight of the two secret courts FISA created.
​
These were important reforms, but they were weakened by last-minute changes to the bill. When Section 702 comes up for renewal next month – this time in the context of an AI juggernaut – it may well be our last chance to protect our freedoms while protecting national security. 


Wisconsin’s Supreme Court Sidesteps the Need for a Warrant for Data in the Cloud

3/5/2026

 
The Wisconsin Supreme Court recently upheld the conviction of Andreas W. Rauch Sharak for possession of child pornography. This crime is contemptible – and we support every lawful means to apprehend and convict the vile people who traffic in such material.

But it needs to be pointed out that in this case, the court and prosecutors sidestepped the need for a probable cause warrant, as required by the Fourth Amendment. In so doing, they inadvertently widened a loophole in the treatment of data held by third parties, from Google to Apple, from servers to the cloud. As a result, the privacy of law-abiding Americans and the security of our most personal and intimate data are now more vulnerable than ever.

The Case

Rauch Sharak’s conviction involves Google, which routinely flags files containing potential child sexual abuse material (CSAM) for the National Center for Missing & Exploited Children. If that non-profit organization deems the files to contain child pornography, they are forwarded to law enforcement. In this case, the files were referred to the Jefferson County Sheriff’s Office in Wisconsin, where a detective viewed them without a warrant.
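
Automated flagging of this kind generally relies on hash matching: the provider compares a fingerprint of each uploaded file against a database of fingerprints of previously identified material, and only matches are escalated for human review. The sketch below is a simplified illustration – production systems use perceptual hashes that survive resizing and re-encoding, whereas the plain SHA-256 here is just a stand-in:

```python
import hashlib

# Hypothetical database of fingerprints of previously identified files.
# (This entry is the SHA-256 of the bytes b"foo", used only for the demo.)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Compute a stable fingerprint of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def should_flag(upload: bytes) -> bool:
    """Flag an upload for review only if its fingerprint matches a known file."""
    return fingerprint(upload) in KNOWN_HASHES

print(should_flag(b"foo"))          # True  - matches the database
print(should_flag(b"vacation.jpg")) # False - unknown content is not flagged
```

The design matters legally: the provider never “looks at” the file in any human sense until a match occurs, which is why courts analyze the subsequent government viewing under the private search doctrine.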

The detective then obtained a warrant to search Rauch Sharak’s home and devices. This resulted in Rauch Sharak being charged with 15 counts of possession of child pornography.

Wisconsin’s Ruling

The state’s highest court upheld a circuit court’s conviction on the grounds that Google had not acted as “an instrument or agent of the government.” This distinction matters, because if Google were deemed a government actor, its searches would necessarily be subject to the Fourth Amendment’s requirement that law enforcement obtain a warrant based on individualized probable cause before conducting a search.

Nor did the court believe that the detective needed to obtain a warrant to view the forwarded files.

“In this case, we determine that law enforcement did not need a warrant before opening and viewing the files in the CyberTip because law enforcement’s search falls under the private search doctrine,” the Wisconsin Supreme Court held. “Under that doctrine, the government does not conduct a ‘search’ under the Fourth Amendment when it repeats a search by a private actor and stays within the scope of the private search.”

The court also stated:

“Seemingly without exception, federal circuit courts and other state supreme courts have held that ESPs [electronic service providers] like Google are private actors when searching for CSAM on their platforms.”

We commend Google for its “zero tolerance” policy for CSAM in its terms of service. But when the government gets involved, so should the Fourth Amendment.

PPSA’s Brief

In our amicus brief before the Wisconsin Supreme Court, PPSA took issue with such “overbroad interpretations of the third-party doctrine.”

The court overlooked a major exception to the private-actor theory – Carpenter v. United States (2018) – in which the U.S. Supreme Court unanimously held that obtaining a suspect’s historical cell-site data constituted a search under the Fourth Amendment.

We told the court that “Carpenter recognized that the Fourth Amendment protects privacy interests that would have been recognized as reasonable at the time of the Founding, notwithstanding advances in technology that make encroachments upon such interests easier.”

As with the postal systems of early America, the Founders would have readily understood that individuals maintain an expectation of privacy when entrusting personal communications or materials to third parties for storage or delivery. Today, however, it is nearly impossible to store private information without relying on third-party providers like Google, Apple, Amazon, and others. For users, the password-protected accounts of Google Photos would have established a subjective expectation of privacy.

The evidence also clearly shows that when Google conducts automated searches, it may function less like a private actor and more like a deputized investigator. At least one court has applied state law holding that a third party in possession of CSAM-detection software may face liability if it fails to deploy it. Google – a heavily regulated company operating under significant legal pressure – thus begins to resemble a government partner, raising serious Fourth Amendment concerns.

In the wake of this ruling, the government’s ability to compel private actors like Google to perform warrantless searches will only grow. Powers used today to catch CSAM crimes could be used tomorrow to open up our emails, texts, personal photos, and online searches to the government for any reason it chooses. According to the Wisconsin Supreme Court’s interpretation of the private search doctrine, if Google viewed your data, then the government can too. That means the Fourth Amendment becomes a dead letter for any data entrusted to a third party, i.e., nearly all data in our digital age.

As lower courts continue to chip away at Carpenter, the Supreme Court has an opportunity in United States v. Chatrie to revisit these issues for the first time since that decision. We hope it draws a clear, bright constitutional line defining when digital searches conducted through private intermediaries become government action – and when Americans’ most personal data must be protected from unreasonable searches and seizures.


Meta’s AI Training Includes Smart Glasses Footage Capturing Users Undressing, Having Sex, Sitting on the Toilet

3/5/2026

 
There is a point early in a marriage when spouses get comfortable and uninhibited around each other in the bedroom and even the bathroom. That’s because there is no third set of eyes in the room… unless one of them just happens to be wearing a pair of smart glasses.

We recently covered the perils and pitfalls of Meta adding facial recognition software to its Ray-Ban smart glasses. Now Victor Tangermann of Futurism has uncovered a genuine horror story about private images captured by these glasses, millions of which are already in circulation.

Meta, in order to refine its AI imaging, sends footage from consumers’ glasses to contractors in Kenya and other countries to label it for training. This tedious process is necessary to enable AI to learn to recognize everyday objects.

At that point, almost anything recorded by Meta glasses is liable to be sent abroad for data annotation.

“I saw a video, where a man puts the glasses on the bedside table and leaves the room,” one data annotator told two newspapers in Sweden. “Shortly afterwards his wife comes in and changes her clothes.”

Another data annotator said: “In some videos you see someone going to the toilet, or getting undressed.”

Tangermann reports that other footage included “imagery of people’s bank cards, users watching porn, or even filming entire ‘sex scenes.’”

Meta customers have no recourse. Data protection lawyer Kleanthi Sardeli told the Swedish press, “Once the material has been fed into the models, the user in practice loses control over how it is used.”
​
Of course, as the Internet of Things weaves together Ring cameras, cloud-based voice-activated AI assistants, baby monitors, and robot vacuums, we are all subject to being surreptitiously recorded at, well, inconvenient moments. But none of them have the reach into personal privacy that happens when one spouse is wearing a pair of smart glasses and the other announces that the toilet paper holder is empty.


Chairmen Jordan and Mast Challenge UK Home Secretary to Come Clean on Order to Break Apple Encryption Around the World

3/2/2026

 
Reps. Jim Jordan and Brian Mast
Rep. Jim Jordan, Chairman of the House Judiciary Committee, and Rep. Brian Mast, Chairman of the House Foreign Affairs Committee, are urging the United Kingdom Home Secretary to reveal details of a secret order to Apple that may kill encryption for Americans and Apple customers around the world.

The secret order involves Apple’s Advanced Data Protection, which offers customers end-to-end encryption so strong that even Apple itself does not have the ability to break it. As a result, journalists and their sources, women and their children hiding from stalkers, dissidents around the world, businesses communicating about proprietary products, and people who simply value their privacy, all rely on Apple’s ADP to protect their communications.

In February 2025, the UK Home Office – roughly equivalent to the U.S. Department of Homeland Security – issued a Technical Capability Notice (TCN) to Apple demanding access to end-to-end encrypted data stored in Apple’s iCloud. In order to be able to continue to serve Britons with other products and services, and to protect customers’ privacy, Apple was forced to comply with the law by disabling ADP for 35 million iPhone users in the UK.

This had the additional unfortunate effect of depriving Americans and people from around the world of the ability to privately communicate with UK Apple customers – including with other Americans inside the UK.

The UK’s Gag Order – an American Company Cannot Talk to Its Government

“However, it remains unclear whether this action satisfies the UK’s demands, particularly as the order reportedly extends to data of users outside the UK, including American citizens,” Jordan and Mast wrote in a letter to Home Secretary Shabana Mahmood.

Such an order violates the Clarifying Lawful Overseas Use of Data (CLOUD) Act, which authorizes the U.S. to enter into data-sharing agreements with the UK and a few other countries but prohibits orders that require providers to decrypt data. Incredibly, the UK government’s TCN imposes a gag order on Apple that makes it a criminal violation for this American company to petition or even discuss the order with the U.S. Department of Justice.

The “Bare Details” of the TCN Are Not Enough

Since then, a tribunal in the UK rejected the idea that “the revelation of the bare details of the case would be damaging to the public interest or prejudicial to national security.”

Late last year, the Investigatory Powers Commissioner, who advises Prime Minister Keir Starmer, agreed with the tribunal’s ruling, saying that disclosure of some details about the TCN is necessary for “a mature and informed public debate.” Yet no such briefing is in the works, which is why the chairmen are now making a direct request to UK Home Secretary Mahmood to provide a briefing that would spell out the terms of the TCN to the committees by March 11.

What’s more, the committees need more than the “bare details” of the TCN to ensure that the actions of the UK government are within the terms of the CLOUD Act. Otherwise, how could Chairmen Jordan and Mast ascertain whether the order weakens “the security, privacy, and constitutional rights of American citizens”? PPSA applauds the chairmen for taking this stand for the rights of Americans.

The U.S. Can Suspend the CLOUD Act Agreement with the UK

Bob Goodlatte, former Chairman of the House Judiciary Committee and PPSA Senior Policy Advisor, who helped lead the passage of the CLOUD Act in 2018, is pointing to a way out if the UK does not respond to Jordan and Mast.

In a letter to Attorney General Pam Bondi on Dec. 12, Goodlatte noted that the CLOUD Act was intended to streamline cross-border cooperation, but “was never intended by Congress to be leveraged by a foreign partner to compel any form of ‘backdoor’ access or other types of decryption assistance.”
​
Goodlatte noted that the CLOUD Act anticipated the danger that a foreign partner would try to exploit the goodwill of the United States. “Accordingly,” Goodlatte wrote to the attorney general, “I urge the Department of Justice to invoke Article 12.3 and suspend the Agreement unless and until the UK withdraws its use of TCNs.”

The letter from Chairmen Jordan and Mast did not invoke the possibility of taking this strong action. But Home Secretary Mahmood would be wise to realize that this is likely a step the Trump administration and Congress will take if the British government continues to remain resistant to American concerns.


What China’s AI Surveillance State Tells Us About the American Future

3/2/2026

 
Why do so many Americans object to the expansion of surveillance networks like Flock technology that can track where we drive, pervasive Ring networks that show where we walk, and government purchases of our personal data that reveal information about us that is more sensitive than a diary? After all, this is for our own good – to protect us. We can trust the government, right?

One reason for alarm among the civil liberties community is that we have seen how these separate surveillance systems can be woven together by AI to create a comprehensive surveillance state. This used to be the stuff of dystopian science fiction. Today, it is a functioning model we can see in real time across the Pacific.

Consider the Fujian Police Academy in China, which at the end of last year released an internal document that shows how AI can detect unrest by weaving together actionable intelligence from sound sensors, cameras, reports from paid community spies called “grid workers,” and other sources.

The China Media Project unearthed and analyzed this document (linked here for Mandarin readers) showing how comprehensive surveillance can further the cause of “social governance.” China Media Project reports that:

  • The so-called grid workers have apps to alert the authorities to anything unusual. The telecom giant Huawei has filed a patent to pinpoint the location of images uploaded by these neighborhood spies – “and can even turn the locations depicted in the photos into a 3D model.”
 
  • Guizhou Normal University showed interest last year in using OpenAI’s GPT model to rate individuals’ “personality traits,” “long-term emotional states,” or “degree of exposure on negative cultural influences.”
 
  • The Southwestern University of Political Science and Law in Chongqing has created a risk-monitoring system to follow people filing petitions seeking redress over a wrong done to them by a local cadre or peer. Targets who have spread “inflammatory” comments on social media are flagged as risks.

China Media Project summarizes:

“Throughout the past year, institutions across China, both private and state-owned, have proposed variations of the same system: taking big data from China’s extensive surveillance system – including input from street cameras and satellites, noise sensors, social media posts, as well as reports from social services – and feeding it into AI models to aid predictive policing.”

Of course, Washington is not Beijing. We are not going to find ourselves having to memorize the platitudes of our Dear Leader and spout them online in order to enjoy internet and travel privileges. But the technological ambition – to fuse disparate surveillance streams into systems for “predictive policing” – is not uniquely Chinese. It was reflected in the post-9/11 attempt by the Pentagon to create “Total Information Awareness” – an ambition finding new life in the many surveillance elements that PPSA reports on daily.

Unlike the “netizens” of China, we can urge our elected leaders to take us off the path that leads to a surveillance state. Congress has an immediate opportunity to do exactly that. One step off this path would be the passage, this April, of measures to end the purchasing of Americans’ most sensitive and personal data by the FBI, the IRS, the Department of Defense, the Department of Homeland Security, and other federal agencies.
​

The lesson from China is not that America is doomed to follow the same path – but that once surveillance systems integrate, pulling them apart becomes exponentially harder. We will keep you posted as the surveillance debate heats up in Congress.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US DEFEND YOUR FOURTH AMENDMENT RIGHTS

Hacker Accidentally Transforms 7,000 Vacuum Cleaners Around the World into Spybots

3/2/2026

 
​The Internet of Things (IoT) remains a glass house when it comes to privacy, as evidenced by this recent headline: “MAN ACCIDENTALLY GAINED CONTROL OF 7,000 ROBOT VACUUMS IN 24 COUNTRIES WHEN HE TRIED TO GET CREATIVE.”

Sammy Azdoufal just wanted to see if he could control his fancy new China-made DJI Romo vacuum cleaner with his PlayStation 5 controller (because, why not?). With the help of some AI coding tools, he not only succeeded, but soon found himself in charge of every currently connected DJI vacuum around the world, with access to camera feeds, microphones, floorplan maps, and more. Because each device’s connection exposed its Internet Protocol address, he could also determine each vacuum’s approximate location.

Now imagine what a burglary syndicate could do with that information. Or, for that matter, Chinese intelligence, which under Chinese law has rights to all the data collected by Chinese companies. That a single hobbyist could vacuum up the personal information of people around the world is a big lesson in consumer privacy. It also illustrates the Wild West that IoT has become, which Live Media News summed up nicely:

“It seems like the smart-home sector is constantly urging us to embrace the ‘trust us’ design principle. Convenience is always the selling point: let the thermostat anticipate your routine, let the doorbell recognize a face, and let the vacuum clean while you’re away. However, in reality, convenience typically translates to ‘cloud.’ Furthermore, cloud frequently implies that someone, somewhere, created a permissions system that must be flawless every day, forever, across all updates, regions, and hurried sprints. Even for businesses that prioritize security, that’s a high standard. Many don’t.”

Which should give us all pause as we consider whether we really need connected refrigerators, doorbells, coffee makers, vacuum cleaners, sex toys, and more. Our personal privacy seems a terrible thing to wager in the name of a little more convenience.
​
Azdoufal just happened to do the right thing by reporting a vulnerability that he didn’t have to publicize (and one that he wasn’t deliberately looking for in the first place). In other words, we got lucky this time.


Will Meta Sneak Facial Recognition Smartglasses Past “Distracted” Privacy Advocates?

2/24/2026

 
​“Great to see you … Bob … How’s … Maggie ... and those three wonderful … dogs of yours.”

You have to admit, it will be a boon to politicians. Adding facial recognition software to smartglasses will enable them and anyone at a cocktail party to dispense with all those tiresome strategies for remembering names and familiar facts about the person in front of them. According to a 2025 internal company memo obtained by The New York Times, Meta plans to quietly equip its line of smartglasses with facial recognition technology dubbed “Name Tag.”

Facial recognition technology is one of the most potent privacy-destroying tools in existence. Meta floated and dropped this idea five years ago for its social media platforms. Now it is back, this time as a wearable in Meta’s Ray-Ban and Oakley smartglasses. The strategy behind this policy reversal is breathtakingly cynical.

The Meta memo held that the new feature’s debut would go largely unnoticed if it were launched “during a dynamic political environment where many civil society groups that we would expect to attack us would have their resources focused on other concerns.”

This presumably is a nod to the looming FISA Section 702 debate in April, as well as a torrent of other privacy-destructive technologies, like the unfolding national network of Flock cameras.

So plan ahead. You might be at lunch several years from now with a bunch of business prospects wearing Ray-Bans or Oakleys, finding them unusually quiet from time to time. That’s because they will be reading up on you in real time with the help of Meta’s AI assistant.

Meta is weighing identification only of people who are on its platforms, not strangers you pass on the street. But we are skeptical. Even without facial recognition tech installed, the company’s smartglasses can be hacked and made to identify strangers. And let’s not forget that Meta smartglasses already offer livestreaming and the ability to post directly to Instagram.
​
But try to think of the bright side: You’ll never have to introduce yourself again.


If Social Media Is a Drug, Can Speech Be Medically Regulated?

2/24/2026

 
Mark Zuckerberg in a suit, possibly attending a meeting or conference, representing his role as a leader in the technology and social media industry, California, U.S, October 09, 2025. PHOTO CREDIT: FotoField
​Anonymity online can be a mask that allows people to say ugly, hateful or untrue things without taking responsibility for them. But it can also be a shield that protects women hiding from abusers, whistleblowers one step ahead of their pursuers, journalists reaching out to confidential sources about wrongdoing, and consumers searching online for answers to questions about their health that they’d rather not have anyone know about.

This is why the current effort by the Immigration and Customs Enforcement (ICE) agency to use emergency subpoenas to force Big Tech companies to reveal the identities of Americans who make critical posts about ICE is so dangerous. If this practice sticks, it will likely migrate to other federal agencies and erode anonymity online.

But the shedding of anonymous speech might come by a different route – not from executive-branch meddling or legislative mistakes, but from lawsuits claiming harms from child internet “addiction.”

Dan Frieth of the digital anti-censorship advocacy group Reclaim The Net listened to five hours of Meta CEO Mark Zuckerberg’s testimony in a Los Angeles civil case and distilled it into a jarring and important warning: the age of anonymity could be coming to an end at the hands of the trial bar.

Zuckerberg testified in one of 1,600 lawsuits over internet addiction. In this case, a woman claimed that Meta’s Instagram addicted her at age nine, plunging her into a hell of anxiety, body dysmorphia, and suicidal thoughts.

Frieth notes that the science of internet addiction is “genuinely disputed.” He writes:

“None of this means the harms alleged are fabricated. It means the word ‘addiction’ is doing heavy rhetorical and legal work, and the policy consequences are far beyond anything a jury in Los Angeles will decide.

“‘Addiction’ is how you get a public health emergency. A public health emergency is how you get emergency powers and make it easier for people to overlook constitutional protections. Emergency powers applied to the internet mean mandatory access controls. And mandatory access controls on the internet mean the end of anonymous and pseudonymous speech.

“When social media is classified as a drug, access to it becomes a medical and regulatory matter” justifying “identity verification, access controls, and a surveillance architecture that follows users across every platform and device.”

Frieth notes that a win for the plaintiff in this case would strip away the legal protections that currently shield platform design decisions. This danger is not theoretical. Frieth reports that Zuckerberg repeatedly suggested that any age verification mandate – and thus identification – be shifted from platforms to the owners of operating systems. Zuckerberg would thus toss his liability hot potato from Instagram to Apple and Google.

“This is more than age verification,” Frieth concludes. “It is a national digital ID layer baked into the two operating systems that run the majority of the world’s smartphones.”

There are a lot of competing interests in this case – the safety of children, the nature of the internet, and the value of free speech. Juries don’t have to balance these equities. They can just side with the plaintiff and inadvertently make policy for U.S. tech – and by extension, the world.

Any new approach to child safety should not require adults to give up speech rights recognized in this country since Alexander Hamilton, James Madison, and John Jay wrote collectively as the pseudonymous “Publius” in The Federalist Papers.


Sens. Mike Lee and Dick Durbin Reintroduce the SAFE Act to Require Warrants Before the Government Can Help Itself to Americans’ Communications

2/24/2026

 
U.S. Senator Mike Lee (R-UT) and U.S. Senator Dick Durbin (D-IL)
Sens. Dick Durbin (D-IL) and Mike Lee (R-UT) have updated and reintroduced the Security and Freedom Enhancement (SAFE) Act – a measure that seeks to restore the constitutional balance between national security and the civil liberties of the American people.
 
“The bill’s full name says it all,” said Bob Goodlatte, former Chairman of the House Judiciary Committee and PPSA Senior Policy Advisor. “Congress can reauthorize FISA Section 702 to protect the American people from foreign threats, while adding provisions that safeguard our most precious constitutional rights here at home.”
​
  • The SAFE Act imposes the Fourth Amendment’s warrant requirement when the government wants to inspect the communications of an American, yet preserves robust intelligence collection against foreign adversaries overseas.

  • The legislation closes the data broker loophole, requiring a warrant before agencies such as the FBI, the IRS, and others can search Americans’ most sensitive digital data purchased from third-party brokers.

  • The bill bolsters the role of amici, or civil liberties experts, who advise the secret FISA courts in cases involving journalists, religious institutions, political activity, and other matters that strike at the heart of Americans’ First Amendment rights.

  • It also closes the “make everyone a spy” loophole, which currently allows the National Security Agency to secretly compel a vast range of businesses – including those offering free Wi-Fi – to produce the communications of their customers.
 
There is growing talk on Capitol Hill about a “clean” reauthorization of Section 702 – one that would reject any reforms and leave intact the FBI’s ability to conduct warrantless searches of Americans’ communications swept up in the NSA’s global data trawl.
​

The SAFE Act leaves intact surveillance targeting foreigners abroad, demonstrating that Congress does not need to choose between security and liberty. The bipartisan leadership of Sens. Lee and Durbin reflects polling that shows large majorities of Republicans and Democrats favor a warrant requirement for Americans’ data.
 
Section 702 must be reauthorized by Congress by April 20, or it will expire. As the reauthorization debate accelerates, the reforms contained in the SAFE Act should not be treated as optional accessories – they should be the starting point of any serious discussion about surveillance reform.


ICE Wants to Spy on Americans’ Political Opinions – Soon Other Agencies Will

2/22/2026

 
​ICE has become enough of a household word that, like NASA, it’s no longer necessary to spell out its acronym. ICE’s aggressive enforcement of immigration law, now the nation’s hottest political flashpoint, is dividing Americans like nothing else in recent memory. Regardless of where you stand on ICE and illegal immigration, we should all agree that ICE’s massive expansion into domestic surveillance is a grave concern for anyone who values the Fourth Amendment and privacy.

When a protester recording video on her phone wants to know why a masked agent is taking down her information and he replies – “Because we have a nice little database and now you’re considered a domestic terrorist!” – Sheera Frenkel of The New York Times rightly suggests that we’ve entered uncharted territory. Political dissent is now being treated as domestic intelligence.

The masked agent was not kidding. The Department of Homeland Security (DHS) is launching a pressure campaign to get Big Tech to identify persons who post content deemed “critical” of ICE. Rather than traditional investigative work, the government appears to be leaning on something akin to an abuse of process, filing hundreds – if not thousands – of subpoenas intended to compel tech giants to cough up user data.

This data grab of lawful speech is unprecedented. It amounts to using an exceptional legal maneuver – an emergency procedure meant for crimes like child trafficking – to collect constitutionally protected political expression. And let’s be clear about the constitutional claim: The contents of our “friends-only” digital posts are modern “papers and effects,” private possessions the Fourth Amendment was designed to shield from generalized searches.

If tech companies cave (and, as highly regulated companies, they likely will), and ICE plugs the data of protesters into its increasingly Orwellian surveillance architecture, then the genie will already be out of the bottle. Once such a capability is developed, it rarely remains confined to a single mission or a single agency. Surveillance tools migrate. Authorities expand. Bureaucracies replicate what works.

These tools – algorithms housed in digital fortresses – will almost certainly be shared with the FBI, IRS, FTC, SEC, and a dozen other agencies eager for their piece of the silicon pie. And they won’t just target Americans who are anti-ICE. Depending on the political winds of the day, databases built to track one form of dissent can just as easily be turned against pro-choicers, pro-lifers, critics of the administration in power, progressives, or MAGA supporters.
​

This looks less like law enforcement and more like the construction of a permanent political-intelligence system – the start of a security-state apparatus on a scale never before seen, primarily and perversely used to surveil and catalog the political beliefs of Americans. Congress should examine this emerging capability and look to install guardrails when it debates surveillance policy in March and April.


PPSA Tells Eleventh Circuit that AI-Powered License Plate Tracking Violates the Fourth Amendment

2/17/2026

 

United States v. Slaybaugh

Artificial intelligence has handed government surveillance a superpower the Founders never envisioned – the ability to quietly track millions of Americans, then rewind their movements later without a warrant.
 
In United States v. Slaybaugh, PPSA is urging the U.S. Court of Appeals for the Eleventh Circuit to draw a constitutional line around the warrantless use of automatic license plate reader (ALPR) databases. At stake is more than one defendant’s conviction. The court must decide whether rapidly evolving surveillance tools will stretch the Fourth Amendment beyond recognition for all Americans.
 
When Public Data Becomes Private Surveillance

Law enforcement offers a simple argument with surface appeal: License plates are visible on public roads, so collecting them invades no one’s privacy. In our brief, PPSA details how that simple argument collapses before the reality of modern surveillance.
 
This case is not about a single camera capturing a passing car. It is about the government’s ability to aggregate billions of scans into a searchable chronicle of a person’s life. ALPR systems collect time-stamped and geolocated images of every passing vehicle and store them indefinitely, allowing officers to reconstruct travel histories “with just the click of a button.”
 
Far from snapping one static image of a license plate, ALPR systems have the power to tail everyone and anyone in a given city or county. That power transforms fleeting public observations into something fundamentally different – a digital dossier revealing where we sleep, worship, seek medical care, protest, or attend political meetings.
 
The U.S. Supreme Court recognized this danger in Carpenter v. United States (2018), holding that long-term location tracking can trigger Fourth Amendment protections even when a person’s movements occur in public. While Carpenter involved the extraction of a suspect’s geolocation history from a cellphone tower, ALPR surveillance raises the same constitutional concerns – but at a vastly higher scale.
 
The Myth of a Numerical “Safe Harbor”

One of the most significant errors PPSA identifies in the lower court’s ruling is the idea that surveillance becomes unconstitutional only after it collects a certain number of data points or weeks of tracking. The federal court treated the retrieval of 72 plate “reads” over three weeks as too limited to reveal the whole of one person’s movements. This take misreads Carpenter.
 
The danger lies not in how many times police officers choose to view images, but in the existence of the massive surveillance database itself.
 
Car “Fingerprints” and “Digital Time Travel”

PPSA told the court:

  • “In 2019 alone, 1 billion license plate scans were collected, with 99.9 percent not actively related to any criminal investigation. The ALPR system that collected Slaybaugh’s information, Flock, works by creating a ‘vehicle fingerprint’ that includes much more than just a license plate number. Each passing vehicle’s ‘fingerprint’ includes its color, make, model, and distinctive features, like a political bumper sticker. Flock then provides advanced search and artificial intelligence functions that can be used to list locations a car has been captured, create lists of cars that have visited specific locations, and even track cars that are seen together.”

With such databases, officers can effectively travel back in time and retrace anyone’s movements long before suspicion arises. That retrospective power, PPSA demonstrates, far exceeds the general warrants and other abuses the Fourth Amendment was designed to restrain. In colonial America, the King’s agents lacked the ability to catalog every citizen’s movements. Modern technology has erased that practical limitation. Without constitutional safeguards, PPSA warns, the government can monitor entire populations’ travel histories and associations – whether political, romantic, or religious.
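The retrospective queries described above can be sketched in a few lines of code. To be clear, the records, field names, and functions below are illustrative assumptions for the sake of explanation – not Flock’s actual schema or software:

```python
# Hypothetical sketch of ALPR-style retrospective queries.
# The data and function names are illustrative, not a real system's API.
from collections import defaultdict

# Each "read" pairs a vehicle's plate with a time and camera location.
reads = [
    {"plate": "ABC123", "camera": "Main & 5th", "ts": "2026-01-03T08:12"},
    {"plate": "ABC123", "camera": "Elm & 2nd", "ts": "2026-01-03T17:40"},
    {"plate": "XYZ999", "camera": "Main & 5th", "ts": "2026-01-03T08:13"},
]

def locations_for_plate(reads, plate):
    """List every place and time a given car has been captured."""
    return [(r["camera"], r["ts"]) for r in reads if r["plate"] == plate]

def plates_at_location(reads, camera):
    """List every car captured at a specific location."""
    return sorted({r["plate"] for r in reads if r["camera"] == camera})

def plates_seen_together(reads):
    """Group cars captured by the same camera, e.g. cars traveling together."""
    by_camera = defaultdict(set)
    for r in reads:
        by_camera[r["camera"]].add(r["plate"])
    return {cam: plates for cam, plates in by_camera.items() if len(plates) > 1}
```

Even this toy version shows why scale matters: once reads are pooled into one database, a single query reconstructs a travel history that no officer ever had to follow in person.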
 
From License Plates to a Surveillance Ecosystem

ALPR systems are only one piece of a rapidly expanding surveillance architecture. PPSA warns that these tools increasingly integrate with other technologies – including AI analytics, neighborhood camera systems, and vast commercial databases of personal information.
 
The concern is not simply about license plates. It is about the emergence of an interconnected surveillance ecosystem capable of mapping people’s lives in unprecedented detail.
 
The Solution Is Already in the Constitution

PPSA’s position is not anti-technology. We acknowledge that modern policing can benefit from advanced tools – so long as they operate within constitutional limits.
 
The solution is straightforward and familiar – requiring law enforcement to obtain a warrant supported by probable cause before querying historical ALPR data. That safeguard preserves investigative power while ensuring judicial oversight of government tracking.
 
The Future of Privacy
​

The Eleventh Circuit’s decision may shape how courts treat digital tracking technologies far beyond license plate readers. As geofenced surveillance, AI drones, and integrated camera networks expand, the dangers of technology will only become more acute, and the constitutional principles at issue in Slaybaugh will only become more urgent.
 
Slaybaugh may well determine whether every time we get in our car, we are freely roaming public streets or becoming caught in a permanent dragnet.


Federal Court Rejects Attorney-Client Privilege for AI Chatbot

2/16/2026

 
The confidentiality of attorney-client conversations may be a cornerstone of American law, but it has some cracks.

One defendant, Bradley Heppner, on trial for securities fraud and other crimes related to his role as the former CEO of Beneficient, learned the hard way that this privilege does not extend to legal questions put to AI chatbots and virtual assistants.

Federal Judge Jed Rakoff of the Southern District of New York on Tuesday ruled that 31 documents that Heppner generated about his case with Anthropic’s Claude – and shared with his defense attorneys – are not protected by attorney-client privilege.

In an analysis by Moish Peltz and Elizabeth E. Schlissel of the Falcon, Rappaport & Berkman law firm, the reasons for Judge Rakoff’s decision include:

  • The privilege extends only to communications between lawyer and client, made in confidence – not to conversations with third parties.
 
  • The terms of service of AI chat tools, including Claude, tell users not to rely on them for legal advice and disclaim any attorney-client privilege. This exposure extends not only to the documents generated for Heppner, but to any prompts he may have posed to Claude.
 
  • Government prosecutors analogized to the court that “if the defendant had instead conducted Google searches or checked out certain books from the library to assist with his legal case, the underlying searches or library records would not be protected from disclosure simply because the defendant later discussed what he learned with his attorney.”

These are persuasive points about this particular case. Still, the ruling underscores a deeper concern: the ready access the FBI and the judicial process have to all of our financial, legal, and highly personal data being held by third parties.

  • Last year, we noted the gobsmacking ruling by a magistrate judge in New York about a copyright case requiring ChatGPT to preserve billions of user queries. The AI chatbot had promised its 800 million active customers that all their questions and the chatbot’s answers – many of them very personal – were confidential. All it took was one judge in one civil case to undo that promise by requiring ChatGPT to permanently store the queries of one-tenth of the human population.

This order even swept in queries that customers believed they had deleted.

As we noted at the time, “virtually anything asked – no matter how personal – is a permanent legal record that lawyers in a nasty divorce or commercial dispute or a government investigation could pry open with the right legal tools.”

Privacy attorney Jay Edelson wrote in The Hill that this is “a mass privacy violation,” asking: “Could Apple preserve every photo taken with an iPhone over one copyright lawsuit? Could Google save a log of every American’s searches over a single business dispute?”

In a similar way, does the Heppner precedent risk exposing the private reasoning of anyone who has ever asked a chatbot a legal question?

These questions point to the urgent need for guardrails on access to third-party data. At a minimum, consumers deserve clearer warnings, tighter limits on data retention, and stronger legal standards before personal queries are swept into criminal trials or litigation.

A more futuristic concern is the likelihood that AI will one day sit at the counsel’s table. Of course, an attorney will be able to consult his AI under the privilege. But as AI agents specializing in the law earn a credible claim to being part of a legal team, will attorney-client privilege evolve to include client conversations with that AI? Or will consultations between the client and the team AI agent remain a discoverable record?

In the meantime, AI and the cloud should come with their own Miranda warning: Anything you type can and will be used against you in a court of law.


Congress Must Demand Answers on the FBI’s “Sensitive Investigative Matters” Before Reauthorizing Section 702

2/16/2026

 
The FBI calls them “assessments.” Americans experience them as investigations. 
 
A new Government Accountability Office (GAO) report suggests that many supposedly preliminary inquiries function as probes in waiting – particularly when they involve politics, journalism, or religion. According to the report, posted by the Cato Institute, more than 1,000 individuals and organizations have been subjected to preliminary assessments for investigation, a scope that should trigger immediate congressional concern. (Hat tip to Cato’s Patrick Eddington.)
 
The most alarming category involves so-called “Sensitive Investigative Matters,” or SIM assessments. These are FBI inquiries into political campaigns and candidates, elected officials, journalists, religious leaders, and other Americans engaged in core First Amendment activities. If any government scrutiny demands transparency and restraint, it is surveillance that touches our rights to freedom of speech, belief, and association.
 
The GAO found that the FBI converted 48 percent of SIM assessments into Preliminary or Full Investigations. Eddington reports that these sensitive cases were 3.5 times more likely to escalate than ordinary assessments – a statistical red flag for anyone told these probes are narrow, cautious, or exceptional.
 
Eddington writes:
 
“That’s especially alarming since, under Preliminary or Full Investigations, the agents running the case can employ wiretaps or other extremely intrusive and clandestine investigative techniques.”
 
Those tools – from electronic surveillance to confidential informants and covert collection – once deployed, are difficult to unwind. That is why Congress must demand answers before, not after, it reauthorizes Section 702 of the Foreign Intelligence Surveillance Act, which expires in April.
 
Before granting renewed surveillance authority, lawmakers should require the FBI to disclose whether SIM assessments have targeted:
​
  • Members of Congress
 
  • Political candidates and parties
 
  • News organizations, think tanks, and NGOs
 
  • News reporters and opinion journalists
 
  • Churches, temples, and mosques.

At a minimum, the Bureau should provide the relevant oversight committees – especially the House Judiciary Committee – with a full accounting of past SIM targets and the total number of assessments elevated into full investigations. Congress should also ask who authorized these escalations. Were investigative decisions influenced by political appointees? 
 
Two questions cut to the core of our concerns about protecting civil liberties. Why are First Amendment-sensitive assessments more likely to escalate than ordinary cases? And was Section 702 data – intended for foreign intelligence collection abroad but routinely used for warrantless “backdoor” searches of Americans – part of the analytical process driving these decisions?
 
The Founders knew the danger of unchecked investigations aimed at political and religious dissent. They rebelled against general warrants that allowed agents of the Crown to search first and justify later. SIM assessments risk reviving that same model – quiet surveillance justified by internal labels rather than public law.
 
Surveillance powers are easy to grant and hard to retract. Congress should not renew them without first understanding how existing authorities have been used against Americans exercising our most basic freedoms.
 
Congress should make it clear: No answers. No reauthorizations.


What the Snooping on the Epstein Files Search Histories Reveals About Congressional Oversight

2/16/2026

 
US Attorney General Pam Bondi
​When photographers zoomed in on Attorney General Bondi during her recent House Judiciary Committee testimony, they captured her holding a printout titled “Jayapal Pramila Search History.” The Department of Justice (DOJ) staff appear to have been tracking what Members of Congress searched for in the unredacted Epstein files, including the searches of Rep. Pramila Jayapal (D-WA).
 
This is just the latest sign that the DOJ needs to embrace congressional oversight, not treat it as a nuisance to be watched closely and managed. House Speaker Mike Johnson responded: “I don’t think it’s appropriate for anybody to be tracking that.” Rep. Nancy Mace (R-SC) called it “creepy.”
 
Here’s the backstory: Earlier in the week, Members of Congress had been given access to unredacted versions of the files in a designated DOJ office building. Once under the roof of the executive branch, Members of Congress were surveilled and their queries recorded. DOJ showed a contempt for the oversight function of Congress that is becoming a hallmark of that department under Democratic as well as Republican administrations.
 
Consider that Biden’s outgoing team at Justice – unhappy with Congress’s 2024 decision to reassert its oversight authority by allowing Members to attend FISA court proceedings – crafted a set of lengthy and onerous restrictions that effectively neutered this law. President Trump’s DOJ has continued these restrictions, despite Bondi mentioning the need for FISA reform in her confirmation testimony. These restrictions allow a DOJ bureaucrat to evict Members from the two FISA courts at will. They also block senators and representatives from discussing what they heard in court with each other, an absurd rule that transforms a form of congressional oversight into a gag order for Congress.
 
Sens. Chuck Grassley (R-IA) and Dick Durbin (D-IL) complained to Attorney General Bondi about these ridiculous restrictions to FISA court access in a protest letter sent last November. And the senators recently gave the former general counsel of the National Security Agency the same message: Section 702 of the Foreign Intelligence Surveillance Act has its place as a national security and intelligence tool, warned Grassley, but “constant congressional oversight and vigilance is also essential to ensure that this authority is exercised responsibly,” lest its use as a domestic spying tool continue unchecked.
 
Similarly, telling Members of Congress that they can search the unredacted Epstein files – as stipulated by law – but then secretly keeping tabs on their queries is not a guardrail, as a DOJ spokesperson tried to claim (“DOJ logs all searches made on its systems to protect against the release of victim information.”) Why, then, did the department provide the Attorney General with talking points to take into her hearing that outlined the search histories of Members of Congress?
 
This is just one more sign that Congress must reassert its authority and make it clear to DOJ that it is the one that oversees, not the one that is overseen.


Is a Police Dog Sniffing at Your Door a Fourth Amendment Violation?

2/12/2026

 

Johnson v. United States

​PPSA filed a brief asking the U.S. Supreme Court to take up the case of Eric Tyrell Johnson, a Maryland man convicted of drug crimes after police brought a drug-sniffing dog to the door of his apartment. When the dog gave a “positive alert” for contraband, the officers obtained a warrant to enter his apartment and found the evidence they suspected was there.
 
Yet at the time of the “sniff-sniff” at Johnson’s door, the police lacked probable cause sufficient for a warrant. The Fourth Circuit Court of Appeals nevertheless upheld Johnson’s conviction.
 
Does this smell right to you? Or was this olfactory intrusion a warrantless search in violation of the Constitution’s Fourth Amendment?
 
This case raises more questions. Would the Framers, who wrote that amendment requiring a probable cause warrant, recognize its application to modern apartment buildings, where dozens or hundreds of occupants share walls and live side by side?
 
And what would it ultimately mean for the wider privacy rights of society if the Court were to allow such searches to remain in place?
 
From Place to Kyllo: Preserving the Fourth Amendment

The logic of the Fourth Circuit was derived from United States v. Place (1983). In that case, the Supreme Court held that the combination of an airport canine sniff of luggage in a public setting that could (supposedly) only detect drugs was “sui generis” (court-speak for “unique”) and not a Fourth Amendment search.
 
In our brief, we remind the Court to keep Place (a well-named case, if ever there was one) in its place as a narrow, context-specific rule. Place does not apply to homes, which the Fourth Amendment has always treated as sacrosanct.
 
A better precedent is Kyllo v. United States (2001), in which heat-imaging technology penetrated the walls of a home to reveal intimate details inside it. The Supreme Court found that the use of such penetrative sense-enhancing technology to expose the occupants and interior of a home constituted a search requiring a probable-cause warrant. Thus, Kyllo better follows the original understanding of the Fourth Amendment, focusing on what is being searched and how, not whether the thing the government is looking for is legal or not.
 
PPSA’s brief underscores a similar point: dogs bred and trained to find drugs function as biological sense-enhancing technology. Although dogs have been man’s best friend for – well, forever – their use as drug-detection devices began only a few decades ago. Just like thermal imaging, a canine sniff detects interior activity – odors – without entering the home. As in Kyllo, the technique accesses private information from the interior of a home without a warrant.
 
PPSA’s brief warns the Court that if police canines can sniff at apartment thresholds without a warrant, it effectively deputizes police to probe deeply into people’s homes using other tools that amplify human senses. That principle erodes privacy at its very core.
 
Where Does Privacy Begin in Shared Spaces?

But the Court doesn’t have to rely on exotic technology to analyze this case. In Florida v. Jardines (2013), the Court held that a dog sniff at a home did constitute a Fourth Amendment search because officers intruded on the home’s “curtilage” – the property around a house in which a resident has a reasonable expectation of privacy.
 
Lower courts have split over what curtilage means when the home is an apartment door in a shared hallway. Some have said that common hallways aren’t curtilage because tenants lack exclusive control over them. The Fourth Circuit took exactly that view in Johnson. But that leaves the vast number of American apartment dwellers as second-class citizens when it comes to Fourth Amendment protection of their homes.
 
Modern Homes and Founding-Era Privacy

We also reminded the Court that the modern apartment poses nothing new to the Fourth Amendment. When the Fourth Amendment was submitted to the states in 1789, cities from Charleston to Boston were already thick with rowhouses and boarding houses. The Framers, well acquainted with multi-unit dwellings, would have understood that a person’s “house” could be part of a larger structure when they drafted the Fourth Amendment. 
 
Thus, there is nothing in the Fourth Amendment’s text or original meaning that suggests that privacy protections disappear simply because a dwelling shares walls or hallways with others.
 
A New Governing Principle for Sense-Enhancing Tech?

Most important of all, if the Fourth Circuit’s rule stands – that drug-sniffing canines can be deployed at an apartment door without a warrant – it will endorse a new governing principle: that law enforcement can use any sense-enhancing device or technique to probe inside homes.
 
Warrantless, deep-privacy intrusions under this rule could become ubiquitous. GPS trackers, olfactory sensors, bioengineered animals, and other emerging tools could become routine means of bypassing the warrant requirement. The principle could allow supposedly narrow-searching devices atop every roving police car or on every street corner to scan all passersby.
 
“Surely,” we told the Court, “the Founders’ expectation of privacy would not allow such a dystopian outcome.” That is why we are asking the Court to reassert the historical and textual guardrails of the Fourth Amendment in this case.


Watching the Watchers: Amazon’s Ring Super Bowl Commercial Demonstrates “Terrifying” Surveillance

2/10/2026

 
Watch Amazon’s Super Bowl ad and tell us what you see: a heartwarming story of a family reunited with a lost dog, or another element in America’s comprehensive surveillance state?

As the ad shows, Amazon’s free “Search Party” function connects cameras across a whole neighborhood to look out for a lost dog. Amazon’s AI, trained on tens of thousands of dog videos, can recognize different breeds, fur patterns, shapes, and sizes to spot the lost puppy. That is not a bad thing at all.

But many viewers found the ad “terrifying,” not heartwarming, according to Kelly Kazek of al.com. One commenter on X wrote:

“Ring just casually outing themselves as literal spyware that can be accessed by anyone on the network. This is insane.”

Another wrote:

“Amazon owns Ring and they want to use all these devices to make a mesh network for Amazon sidewalk … The American consumer just got a Trojan horse packaged as home security.”

As EFF’s Matthew Guariglia reported last year:

“Not only is the company reintroducing new versions of old features which would allow police to request footage directly from Ring users, it is also reintroducing a new feature that would allow police to request live-stream access to people’s home security devices …

“This is a grave threat to civil liberties in the United States. After all, police have used Ring footage to spy on protestors, and obtained footage without a warrant or consent of the user.”

The Search Party AI function greatly amplifies Ring’s surveillance capability. The same feature of Amazon Ring that can identify Fido can also identify you, where you go, and the people you visit.
At the very least, Amazon should announce limits on how this technology can be trained to follow Americans in our daily movements.


How the Puppy Poop Police Threaten Our Privacy

2/8/2026

 
It seems like such a good idea: You lose your dog Ziggy, and you might – but likely won’t – find him by nailing flyers to telephone poles and making social media posts. But with a massive national database of dog photos and an AI-powered image-search function, you can save the day.

Another technology to find individual dogs comes from “snout recognition,” the canine version of facial recognition. This tech has dubious origins – blacklisted Chinese AI giant Megvii, which has been developing such canine facial-recognition technology (for snouts of all shapes) since 2019. 

A more common technology links poop to pups through DNA analysis of dog waste. One innovative company, PooPrints, caters to landlords and HOAs desperate to sniff out dog owners who don’t pick up after their pets. No joke: If you want to live at a swanky condominium along the Hudson in New Jersey, for example, you may be required to have your dog’s DNA swabbed and put on file. (If it can happen in Italy, it can happen here).

But there’s a flip side to these otherwise noble uses of detection/recognition technology – this isn’t really just about our pets. Though well-intentioned, these methodologies can be leveraged as yet another way to bypass our privacy expectations.

At least one published study recounts how canine DNA was used to convict four men of murder. All it took was a crime tip from a caller and some residual dog poop from the scene found on one of the perpetrators’ shoes. All other evidence was inconclusive, but DNA analysis showed the odds that the sample came from a dog other than the one at the crime scene were 1 in 1.16 billion.

We’re all for analyzing DNA and snouts to solve such criminal cases, as the Fourth Amendment clearly permits. What’s concerning is the cavalier way in which something as deeply and uniquely ours as DNA – and now that of our pets – can be gathered, stored indefinitely, and misused without permission or legitimate purpose. Just add human and canine DNA to the thousands of other data points already purchased and warrantlessly accessed by federal agencies and stolen by bad actors. It's just one more knot in the ever-tightening surveillance net that surrounds us.

Remember that the next time you wonder whether to pick up Ziggy’s contribution at the dog park.


Josh Hawley and Adam Schiff Ask Tough Questions on Government Surveillance

2/7/2026

 
Senators Josh Hawley (Left) and Adam Schiff (Right)
​The recent Senate Judiciary Committee hearing on the “review and reform” of the Foreign Intelligence Surveillance Act (FISA) yielded some fireworks and surprises that herald a robust and rowdy debate to come.

One FISA authority, Section 702, is due to sunset in April. As the Section 702 renewal debate heats up, that authority – enacted by Congress to enable spying on foreign targets on foreign soil without the need for a warrant or court order – will come under intense scrutiny for being used by the government in recent years to warrantlessly access millions of Americans’ private communications.

But a host of other surveillance authorities will also be debated. Liza Goitein of the Brennan Center for Justice told the committee:

“Section 702 is part of an ecosystem of often overlapping surveillance authorities, and when one avenue is closed off, the government can often turn to another or exploit gaps in that network to conduct surveillance with no statutory authority at all.”

One of these gaps is the “data broker loophole.” This is the routine practice of multiple federal law enforcement and intelligence agencies – including the FBI, the IRS, and the Department of Homeland Security – purchasing Americans’ private digital data from data brokers. Once purchased, agencies assert a right to examine Americans’ data without a warrant.

Adam Schiff’s Tough Questions About the Data Broker Loophole

In the hearing, Sen. Adam Schiff (D-CA) asked Goitein (see the 1:30 mark) about how “law enforcement and intelligence agencies might circumvent the requirements of the Fourth Amendment by acquiring information from third-party data brokers.”

Sen. Schiff highlighted the disingenuousness of the intelligence community’s workaround for the Electronic Communications Privacy Act, which prohibits telecoms from selling Americans’ personal data directly to government agencies. But telecoms are allowed to sell Americans’ personal information to data brokers for commercial purposes. Federal agencies exploit this loophole by claiming that nothing prevents them from also purchasing Americans’ data from those brokers.

Liza Goitein made it clear that such “gaps” in the surveillance “ecosystem” should be very much a part of the Section 702 debate. “And the gap I am most worried about is this data broker loophole. Federal agencies are buying their way around constitutional and statutory requirements on a routine basis.”

Sen. Josh Hawley took a different tack, focusing on a contradiction in the government’s lenient definition of what qualifies as a search of an American’s communication.

Josh Hawley Schools Surveillance Advocate

“You said that Section 702 cannot be used to target Americans,” Sen. Josh Hawley (R-MO) (see the 1:27 mark) said to Adam Klein, Director of the Strauss Center at the University of Texas at Austin. “But that’s cold comfort, isn’t it,” he said, “to those subject to 278,000 improper searches – United States persons that we were talking about – in 2022 alone?”

“I mean, sure, the statute doesn’t permit them to be targeted, but when they have their personal information directly queried or improperly searched, what’s the difference?”

Klein responded that Americans should take comfort from the fact that Section 702 is meant to target foreigners overseas, not Americans.

Hawley fired back:

“As someone who had his cellphone tapped, improperly, by the United States government, by the way, why would I feel any better if I am told, ‘the U.S. government improperly queried your personal information … but don’t worry, they weren’t going after you, in the first instance. They just happened to have all of your stuff and then they looked into it because there are no effective constraints on them.’ Why is that a good thing?”

Klein pivoted to the issue of surveillance of Members of Congress, whom he said had “a heightened expectation of safeguards in this area.”

Hawley cut him off to ask why this expectation doesn’t also protect journalists or Americans who merely travel overseas or have family overseas. Hawley said the government effectively says, “Oh, don’t worry, you weren’t targeted. I mean you were effectively targeted.”

Sen. Hawley highlighted the contradiction in how the search of an American person’s data is not treated as a separate Fourth Amendment event. On one hand, Hawley said, the government promises not to target Americans. On the other hand, it searches Americans’ data.

“You can’t have it both ways,” Sen. Hawley said, adding, “That looks an awful lot like a search and seizure under the Fourth Amendment.”

When Sens. Hawley and Schiff – at opposite ends of the political spectrum – pose such tough questions, it is clear that the emerging bipartisan surveillance debate in Congress is going to heat up.


Drone Surveillance Now Covers 99.6 Percent of Homes in America

2/6/2026

 
​Look up. There is a good chance a drone is looking back.
 
From government agencies to insurance companies, drones now routinely patrol American neighborhoods, hovering over backyards and rooftops in search of violations, liabilities, and profit. What was once pitched as a tool for emergencies or remote inspections has quietly become a pervasive system of aerial surveillance of American homes without public consent.
 
In Virginia, under current law, surveillance drones may conduct close inspections of private property without a warrant in emergency or “exigent” circumstances. These exceptions include searches for a missing child or an elderly person who has wandered off, or tracking a dangerous suspect on the run.
 
Now a bill introduced in Virginia’s lower chamber by Alfonso Lopez, a Democratic member of the House of Delegates, would expand this list of emergency exceptions in which the Fourth Amendment’s requirement for a probable-cause warrant can be swept aside. If this bill passes, the Commonwealth of Virginia will be able to spy on citizens to make sure they follow environmental rules on sediment control and erosion management, as well as regulations regarding water and wetlands.
 
In short, this bill would allow the Virginia Department of Environmental Quality to deploy surveillance drones not for the usual dire exigent circumstances, but just to make sure that property owners are in compliance with that department’s environmental regulations.

Virginia’s proposal shows how easily “emergency” drone powers can be repurposed for routine regulatory enforcement. But government is not the only actor exploiting the skies. As drone surveillance becomes normalized, private companies have eagerly followed – deploying the same technology not to enforce the law, but to grow profits.

Texas provides one example of how the private sector is using drones to impinge on homeowners’ privacy. KUT News in Austin interviewed dozens of homeowners, industry experts, and insurance watchdogs, and reviewed hundreds of pages of complaints and state filings, to document how insurance companies are using aerial drone technology to spy on their customers.
 
KUT reports that poor-quality images of homes often prompt insurance providers to unfairly raise rates or cancel policies. Customers have been told to replace their roofs when in fact their roofs only need a good cleansing rain. As Audrey McGlinchy of KUT writes: “And with the proverbial click of a button, companies can decide if they want to renew a homeowner’s policy.”
 
How pervasive is commercial surveillance? KUT reports that one aerial-imaging technology firm providing imagery for insurance companies estimates there are “eyes on 99.6 percent of the country’s population.”
 
State laws and courts are not adjusting to this new reality. For example, in 2024 the Michigan Supreme Court punted on the Fourth Amendment implications of a township’s low-flying drone that crossed over a couple’s fence line to search for zoning violations. At the national level, the U.S. Supreme Court has yet to fully define drone-specific privacy rights. 
 
Lawmakers and courts need to catch up to a simple reality – pervasive drone surveillance over homes is no longer hypothetical, rare, or futuristic. It is routine, largely unregulated, and already being used to punish Americans financially and intrude on their privacy. If the Fourth Amendment is to mean anything in this age of mass aerial surveillance, our laws must recognize that what hovers over our roofs and backyards today can be just as invasive as a warrantless step into our homes.


Former NSA General Counsel Tells Senate to Bug Out on Oversight

2/4/2026

 
​Stewart Baker, former general counsel of the National Security Agency, opened his testimony before the Senate Judiciary Committee last week with a startling, if somewhat insolent, proposal.

Baker’s proposal came at the beginning of that hearing on the “review and reform of the Foreign Intelligence Surveillance Act,” which centered around FISA Section 702. This is an authority enacted by Congress to enable spying on foreign targets on foreign soil without the need for a warrant or court order. Yet it has been used in recent years to enable warrantless government access to millions of Americans’ private communications.

Section 702 sunsets in April 2026 after the last reauthorization in April 2024. The reauthorization debate now beginning on Capitol Hill is being used to explore not just Section 702, but many other surveillance authorities associated with it as well.

“It’s time to say – let’s stop putting a sunset on 702,” Baker said. “It is only putting our most valuable security tool up for grabs every couple of years and then praying that there is enough bipartisan spirit in the Congress to do what needs to be done.”

This flew in the face of remarks by Chairman Chuck Grassley (R-IA) and Ranking Member Dick Durbin (D-IL).

Sen. Grassley said that while Section 702 is an “essential national security and intelligence tool,” he believes that “constant Congressional oversight and vigilance is also essential to ensure that this authority is exercised responsibly.”

The chairman also expressed concern about FISA’s “reach” and said there is “still more work to be done.”

To underscore this point, Sen. Grassley reminded the committee that he and Sen. Durbin have complained that an oversight measure passed into law in 2024 is being blocked by the Department of Justice.

That law allows senators and staff members with high levels of security clearance to attend hearings of the Foreign Intelligence Surveillance courts. But an onerous set of restrictions imposed by the Justice Department under the Biden administration and continued by the Trump administration has made it impossible for Members of Congress to attend the hearings with staff – or even to discuss them with anyone, whether cleared staff or other senators.

That is not a guardrail. It is a gag order.

The Justice Department also asserts a right to remove senators and Members of Congress at will. This is peculiar, given that the right to remove people from a courtroom is normally exercised by the presiding judge, not a functionary from the executive branch.

Ranking Member Sen. Durbin echoed the chairman on their “responsibility to conduct oversight” of Section 702. “For years the government has used it as a domestic spying tool to collect millions, maybe billions, of Americans’ private communications.”

Sen. Durbin added that the government has been:

“Reading our text messages and emails, and listening to our phone calls, all without a warrant requirement of the Fourth Amendment … Section 702 has been abused to spy on business and religious leaders, political parties, Members of Congress, campaign donors, journalists, and political protesters of all stripes.”

The intelligence community has long played clever word games with Section 702 to enable such warrantless domestic spying. And when federal agencies are called out on their domestic spying, more often than not they fail to respond to their putative overseers on the Hill or to innumerable Freedom of Information Act (FOIA) requests filed by PPSA and other civil liberties organizations.

Consider the letter of protest Sen. Grassley and Sen. Durbin sent in November to Attorney General Pam Bondi asking her to stop those executive branch restrictions on congressional oversight at the FISA court hearings.

Three months have passed and Attorney General Bondi has yet to respond to the Chairman and Ranking Member of the Senate Judiciary Committee.
Could we have a better example of why senators believe Congress must use sunsetting and other robust measures to try to compel oversight of an intelligence community that refuses to answer even basic questions?


This App Knows Users’ Masturbation Habits – and Now So Can the World

2/2/2026

 
​One of the unintended consequences of living in the digital age is that everything, sooner or later, becomes quantified as a data point. That now includes – insert “Rated R” warning here – an app user’s masturbation frequency. (Exercising great discipline, we will resist the temptation to make tasteless puns throughout this piece, though they practically write themselves. So, use your imagination.)

Back to the story – addictions of many sorts are as old as humanity. If there’s a silver lining to the otherwise debatable benefits of social media, it may be the proliferation of apps now claiming to offer support for those who seek to overcome their habits. That includes the category of sexual addiction to pornography and masturbation.

404 Media, which originally broke the story, says that an app devoted to helping users defeat their porn addiction is inadvertently sharing related data. This includes how often users look at porn, how they respond, and how it makes them feel when they do. 404 says the story is “a good reminder to think twice before giving any app your personal information.”

The data also includes users’ ages. 404 Media’s reporting suggests that many of the affected users described themselves as minors – as many as 100,000 of the 600,000 whose records proved to be accessible. The vulnerabilities were apparently first reported to the app maker by an independent security researcher in September.

To date, however, the company has not resolved the issue. In fact, its founder has dismissed the allegations as “a bit of a joke,” suggesting the potential for a data leak was faked. For privacy reasons, 404 isn’t naming the app. The root cause is a long-understood flaw in Google Firebase, a platform developers use to build apps – one that experts can easily reproduce. In other words, it’s no joke.
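For readers who wonder how such a leak works in practice: reporting on Firebase exposures of this kind typically traces them to over-permissive database security rules. The sketch below is purely hypothetical – it is not the unnamed app’s actual configuration – and shows what properly scoped Firebase Realtime Database rules look like, with the classic misconfiguration noted in a comment.

```json
{
  // Hypothetical Firebase Realtime Database rules file (comments are permitted
  // in Firebase's rules JSON). The classic misconfiguration is a top-level
  //   "rules": { ".read": true, ".write": true }
  // which lets any unauthenticated HTTP request dump the entire database.
  "rules": {
    "users": {
      "$uid": {
        // Scope access so each signed-in user can read and write only their own record.
        ".read": "auth != null && $uid === auth.uid",
        ".write": "auth != null && $uid === auth.uid"
      }
    }
  }
}
```

With open rules, a single unauthenticated GET request to the database’s public `.json` endpoint can return every record; with rules like the sketch above, the same request is refused. That ease of testing is why independent researchers can so readily reproduce leaks of this kind.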

The report indicates that, for reasons unknown, Google itself hasn’t fixed the issue. More curious still is that app makers and even app marketplaces – with whom users entrust their data – haven’t done so, either. All of which means that when it comes to data security, an entry made in confidence can amount to global oversharing.

"The data they can get on what motivates you, what actually makes you take an action – that's so valuable," says technology journalist Elaine Burke. “This is [about] so much more than what your browsing habits are and what you're interested in.” She warns that developers are sold on the notion that humans are “mathematical problems that can be solved with the right metric.”

This story points to a larger issue: the false belief that defeating age-old personal struggles in the 21st century is as simple as finding an app for that. That impulse leads many to unknowingly risk their most personal data with the tap of a digital button. The promise is self-control. But the price might be a loss of privacy.

This demolition of personal privacy by datapoint is made worse by the regular practice of a dozen federal agencies – ranging from the FBI to the IRS – to purchase Americans’ private digital information from data brokers and review it at will. That is all the more reason for Congress to pass a law that imposes a probable-cause warrant requirement before agencies can inspect Americans’ most private information.
In the meantime, practice caveat venditor: seller beware – especially when the product is you.
