Project for Privacy and Surveillance Accountability (PPSA)

 NEWS & UPDATES

YouGov Poll: It’s Okay To Spy On Politicians

6/30/2025

 
Data journalist Jamie Ballard reported on a recent YouGov poll entitled “Privacy and Government Surveillance.” The conceptual divide among respondents appears to be whether someone is regarded as a public or a private figure. Clear majorities condoned monitoring the online activities of the following groups, who are either public by nature or of public concern:

  • Suspected terrorists: 84 percent
  • Citizens of hostile countries: 68 percent
  • Politicians: 66 percent
  • Illegal immigrants: 65 percent
  • Government workers: 57 percent

To be clear, the survey question didn’t ask if government workers and politicians should be surveilled because they are under sufficient suspicion of a crime to justify a probable cause warrant. Nope, they are spy-worthy simply by virtue of being public figures.

Those polled do believe that private citizens should be afforded more protection, with majorities agreeing that an ongoing criminal investigation is required in order to justify monitoring someone’s digital activity.

Still, what made us do a spit take is that so many people deem it acceptable for federal spy agencies to surveil, at will, the online activities of a president, Members of Congress, or governors. This seems at odds with another finding in the same poll: 71 percent of Americans are concerned that surveillance powers could be used by the U.S. government to target political opponents or suppress dissent.

So what’s going on? It may partially reflect widespread disillusionment with leaders in Washington, D.C. and many states. But that doesn’t come close to explaining the reasons behind this response.

PPSA and others not only oppose warrantless surveillance of politicians, we advocate for enhanced guardrails when it comes to legal surveillance of political candidates and elected officials. We believe those protections should be extended to journalists as well.

This is not because politicians and journalists are special people with special rights, by any means. The reason is more profound than that. When a politician or a journalist is targeted, that act necessarily involves the political and speech rights of the many Americans who voted for that officeholder or who follow that journalist. Monitoring of the online activity of politicians and journalists is an attack on a free political system itself.

Such were the grievous wrongs when the FBI investigated Donald Trump in 2016 on allegations the Bureau itself knew were disproven, and when the executive branch secretly pulled communications of Members of Congress and aides of both parties in 2017. Republicans and Democrats both had reason to be alarmed. Our intelligence agencies have a history of secretly overseeing their overseers.

Perhaps this one result in the YouGov poll is just an outlier. But it merits our attention. Americans need to appreciate that underhanded surveillance of politicians is actually an attack on them.

Civil libertarians clearly have a lot of work to do in the realm of public education.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

How To Build A Surveillance State Without Really Trying: Naïve Magistrate Declares “Privacy In Our Time”

6/30/2025

 
If you wanted to build a mass surveillance program capable of monitoring 800 million people, where would you start? Ars Technica’s Ashley Belanger found the answer: You order OpenAI to retain all of its regular customers’ ChatGPT logs indefinitely, upending the company’s solemn promise of confidentiality for customers’ prompts and chats.

Which is what Ona Wang, U.S. Magistrate Judge for the Southern District of New York, did on May 13. From that date forward, OpenAI has had to keep everything – even users’ deleted chats – stored “just in case” it’s needed someday.

We asked ChatGPT about this, and it told us:

  • Yes, your current chat questions (and past ones you may have deleted or used in “temporary mode”) are being retained in a secure, segregated legal-hold system.

So our lives – health, financial, and professional secrets – are now being stored in AI chats that Judge Wang thinks should be kept on file for any warrant or subpoena, not to mention any Russian or Chinese hacker.

Not included in the judge’s order are ChatGPT Enterprise (used by businesses) and Edu data (used by universities). Problem: Many businesses and students use regular ChatGPT without being Enterprise subscribers, including entrepreneur Jason Bramble. He asked the judge to consider the impact of her ruling on – well, you name it – his company’s proprietary workflows, confidential information, trade secrets, competitive strategies, intellectual property, client data, patent applications, trademark requests, source code, and more.

  • Perhaps the greatest irony of the judge’s order is that it decimates the privacy-focused “Temporary Chats” feature OpenAI recently debuted. They are “temporary” no longer. Originally, those chats were designed to vanish once you closed them and were never part of the user’s account history or memory. They were meant to be secret, one-off conversations with no record. Now, they are digitally accessible memories.

As for the underlying case giving rise to all of this overreach, it involves a copyright infringement lawsuit between OpenAI and the New York Times. It’s a big case, to be sure, but no one saw this coming except for Jason Bramble and one other ChatGPT user, Aidan Hunt.

Hunt had learned about the judge’s order from a Reddit forum and decided it was worth fighting on principle. In his motion, he asked the court to vacate the order or at least modify it to exclude highly personal/private content. He politely suggested that Judge Wang was overstepping her bounds because the case “involves important, novel constitutional questions about the privacy rights incident to artificial intelligence usage – a rapidly developing area of law – and the ability of a magistrate to institute a nationwide mass surveillance program by means of a discovery order in a civil case.”

Judge Wang’s response was petulant.

She noted that Hunt mistakenly used incident when he meant incidental. And then she casually torpedoed two hundred years of judicial review by denying his request with this line: “The judiciary is not a law enforcement agency.” Because, after all, when have judicial decisions ever had executive branch consequences?

Judge Wang had denied business owner Jason Bramble’s earlier request on the grounds that he hadn’t hired a lawyer to draft the filing. The magistrate is swatting at flies while asking ChatGPT users to swallow the herd of camels she’s unleashed. Even if a properly narrowed legal hold to preserve evidence relevant to The New York Times’ copyright infringement claim would be appropriate, the judge massively overstepped in ordering ChatGPT to preserve global chat histories. 

The complaints of Bramble and Hunt, as well as similar pleadings from OpenAI, aim true: The court’s uninformed, over-reaching perspectives ignore the pressing realities of pervasive surveillance of those who accepted the promise that their conversations with ChatGPT were truly private.

Judge Wang wondered Hamlet-like whether the data could be anonymized in order to protect users’ privacy. As we’ve written before, and is now commonly understood, government and hackers have the power to deanonymize anonymized data. As MSN points out, the more personal a conversation is, the easier it becomes to identify the user behind it.
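The kind of deanonymization we're describing can be illustrated with a toy sketch. Every name, ZIP code, and age below is invented for illustration; the point is only that once usernames are stripped, a couple of leftover quasi-identifiers are often enough to join an "anonymized" record back to a named public record:

```python
# Minimal sketch of a "linkage attack": re-identifying "anonymized"
# chat records by joining them with public records on quasi-identifiers
# (here, ZIP code and age). All data is made up for illustration.

anonymized_chats = [
    {"user_id": "u_8f3a", "zip": "70116", "age": 34, "topic": "rare autoimmune condition"},
    {"user_id": "u_19bc", "zip": "10027", "age": 61, "topic": "retirement planning"},
]

public_records = [
    {"name": "Alice Example", "zip": "70116", "age": 34},
    {"name": "Bob Example", "zip": "10027", "age": 61},
]

def reidentify(chats, records):
    """Join name-stripped chat rows to named records on (zip, age)."""
    index = {(r["zip"], r["age"]): r["name"] for r in records}
    return {
        c["user_id"]: index.get((c["zip"], c["age"]), "unknown")
        for c in chats
    }

# Each pseudonymous ID resolves to a real name via the join.
matches = reidentify(anonymized_chats, public_records)
```

The more distinctive the retained fields (and a deeply personal chat topic is about as distinctive as data gets), the fewer auxiliary records an attacker needs for the join to be unique.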

In declaring that her order is merely about preservation rather than disclosure, Judge Wang is naively declaring “privacy in our time.” As in 1938, we stand at the edge of a gathering storm – this time, not a storm of steel, but of data.

What can you do? At the least, you can start minding your Ps and Qs – your prompts and questions. And take to heart that “delete” doesn’t mean what it used to, either.

Here's a chronology of Ashley Belanger’s detailed reporting on this story for Ars Technica:

  • June 4
  • June 6
  • June 23
  • June 25


The FBI and MLK: What We Need To Know Isn’t in The Files, It’s How the Files Came to Be

6/27/2025

 
J. Edgar Hoover’s FBI was famously obsessed with Dr. Martin Luther King, Jr., convinced King was a communist pawn (largely due to his association with left-wing civil rights activist Stanley Levison). The irony, of course, is that if Moscow actually was pinning its hopes of dividing America on support for Dr. King, then any backdoor communist support for King’s cause ultimately amounted to one of the greatest own goals in history.

Dr. King’s approach that married Christian love to hardnosed political tactics might well have prevented a race war. That approach certainly helped to transform the heart of America.

Then, as now, law enforcement overreach was premised on “national security.” But the motivation behind the FBI’s surveillance of King soon revealed itself to be a character assassination campaign centered on his sex life – salacious, personal, harassing, and utterly invasive.

Hoover made sure that King’s personal foibles were sent to Lyndon Johnson’s White House, Members of Congress, the AP, UPI, and Coretta Scott King herself. Hoover even held a press conference denouncing King as “the most notorious liar in the country.” And yet few deemed the information newsworthy. Americans instinctively realized that one’s private life is exactly that, undeserving of the indignity of unauthorized surveillance and the terror of state-sanctioned moral harassment.

Now those files, sequestered for nearly half a century, are under review to be released as part of Trump’s Executive Order 14176 on the 1960s trinity of assassinations, those of the Kennedys and King.

So here’s the truth about releasing the contents of King’s classified files: Nobody wins. What is of real value is not what the FBI learned, but exactly how and why the FBI invaded King’s privacy.

Consider what we already know: Hoover’s appalling and thoroughly discredited COINTELPRO program included policing morality. To quote the Church Committee’s 1976 report, the program aimed not only to protect national security, but to maintain the “existing social and political order.” Nowhere in the U.S. Code will you find Congress tasking the FBI with upholding its idea of what society should look like.

“No holds were barred,” lamented COINTELPRO chief William Sullivan in his posthumous memoir. He recalled that Hoover’s team saw King as a demagogue and “the most dangerous Negro of the future in this nation” after his history-making speech on the National Mall.

  • U.S. District Judge Richard Leon should keep these facts in mind when he soon begins reviewing arguments for and against unsealing King’s FBI files. Under a judicial order issued in 1977, they have been held at the National Archives and Records Administration, with instructions to remain under wraps until January 2027 unless that term is shortened or extended by a federal court.
 
  • “It is unquestionable that my father was a private citizen, not an elected official, who enjoyed the right to privacy that should be afforded to all private citizens of this country,” King’s youngest daughter, Bernice, told Judge Leon’s court in a filing. “To not only be unjustifiably surveilled, but to have the purported surveillance files made public would be a travesty of justice.”

We don’t need to know what Dr. King did in his private life. We need to know what the FBI did, under what legal guidance it acted (assuming there was any), why it happened, and what could have prevented it.

The FBI must never again engage in this kind of politically motivated violation of an American’s privacy. In an era in which surveilling Americans is easier than ever, the FBI’s misbehavior is the only worthwhile part of the King story that remains to be told.


The CLOUD Act Raises Bipartisan Hackles

6/18/2025

 

Hearing Evokes Unprompted, Strong Endorsement of a Warrant Requirement for Section 702

The CLOUD Act of 2018 created a framework for U.S. tech companies to share digital data with other governments. The law, and the international agreements built on it, was a reasonable concession to allow these companies to do business around the world. But the arrangement has gone off the rails because of the United Kingdom’s astonishing attempt to force Apple to break end-to-end encryption so British authorities can access the data of all Apple users stored in the cloud.

Rather than violate the privacy of its users, Apple has stood by its customers and withdrawn encrypted iCloud storage from the UK altogether.

The House Judiciary’s Subcommittee on Crime and Federal Government Surveillance was already skeptical about that agreement, and it was appalled when the British government used it to secretly order Apple to provide unfettered, backdoor access to all the cloud content uploaded by every Apple user on the planet. It was an unprecedented request, and an unexpected one from a fellow democracy.

  • In the two years the agreement has been in effect, the UK issued more than 20,000 requests to U.S. service providers. The bulk of those requests sought wiretap surveillance.
 
  • In comparison, the United States issued a mere 63 requests to British providers, mostly for stored data.
 
  • Compare the UK’s 20,000 requests to the 4,507 wiretap orders of U.S. federal and state law enforcement agencies in criminal cases in two years. The United States has five times the population of the U.K., but only issues about one-fourth the number of such orders.

In April, members of the House Judiciary Committee asked Attorney General Pam Bondi to terminate the U.K. agreement. As extreme as that sounds, PPSA supports that proposal as the best way to persuade Britain to back off an unreasonable position. In the worst-case scenario, no agreement would be better than comprehensive violation of Americans’ privacy.

Undeterred, the subcommittee convened a recent hearing entitled “Foreign Influence On Americans’ Data Through The CLOUD Act.” Greg Nojeim from the Center for Democracy & Technology was an invited witness. If one had to name a single theme to his powerful testimony, it would come down to one word: “dangerous.”

Subcommittee Chairman Andy Biggs used the same word, declaring the secretive British demand of Apple “sets a dangerous precedent and if not stopped now could lead to future orders by other countries.” Ranking Judiciary Committee Member Jamie Raskin struck a similar chord: “Forcing companies to circumvent their own encrypted services in the name of security is the beginning of a dangerous, slippery slope.”

In short, the hearing demonstrated that the CLOUD Act has been abused by a foreign government that does not respect privacy and civil liberties or anything remotely like the Fourth Amendment to our Constitution. It needs serious new guardrails, beginning with new rules to address its failure to protect encryption. Expert witness Susan Landau of Tufts University warned the subcommittee that the U.K. appeared to be undermining encryption as a concept. A U.S.-led coalition of international intelligence agencies, she observed, recently called for maximizing the use of encryption to the point of making it a foundational feature of cybersecurity. Yet Britain conspicuously demurred.

  • Rep. Biggs said: “Efforts to weaken, or even breaking, encryption makes us all less secure. The U.S.-U.K. relationship must be built on trust. If the U.K. is trying to undermine this foundation of cybersecurity, it is breaching that trust.” Once pried open, he cautioned, “It's impossible to limit a back door [around encryption] to just the good guys.”
 
  • Rep. Raskin warned that issues with the CLOUD Act itself are emblematic of larger privacy issues. “None of these issues exists in a vacuum. All government surveillance curtails all citizens’ liberties.” To which witness Richard Salgado added, “If there's still a real debate about whether security should yield to government surveillance, it doesn't belong behind closed doors in a foreign country … the debate belongs in public before the United States Congress.”

That debate will likely become intense between now and next spring when Congress takes up the reauthorization of Section 702 of FISA, the Foreign Intelligence Surveillance Act. Judiciary Chairman Jim Jordan indicated as much when he used his opening remarks to tout the “good work” the Committee has ahead of it in preparing to evaluate and reform Section 702.

Later in the hearing, Chairman Jordan returned to the looming importance of the Section 702 debate, asking each of the witnesses in turn a version of the question, “Should the United States government have to get a warrant before they search the 702 database on an American?”

All agreed without hesitation.

“Wow!” declared Rep. Jordan in response. “This is amazing! We all think we should follow the Constitution and require a warrant if you're going to go search Americans’ data.”

Rep. Raskin nodded along. And that’s as bipartisan as it gets.


Watching the Watchers: Sen. Paul on Open Skies

6/10/2025

 
Sen. Rand Paul (R-KY) celebrated the termination of the “Quiet Skies” surveillance program in which U.S. Marshals posed as airline passengers to shadow targets.

This $200 million a year program did not, according to the Department of Homeland Security, stop a single terrorist attack. But, in the words of Sen. Paul in The American Conservative, it “was an unconstitutional dystopian nightmare.” Sen. Paul writes:

“According to Department of Homeland Security documents I obtained, former Congresswoman and now Director of National Intelligence Tulsi Gabbard was surveilled under the program while flying domestically in 2024. Federal Air Marshals were assigned to monitor Gabbard and report back on their observations including her appearance, whether she used electronics, and whether she seemed ‘abnormally aware’ of her surroundings. She wasn’t suspected of terrorism. She wasn’t flagged by law enforcement. Her only crime was being a vocal critic of the administration. What an insanely invasive program – the gall of Big Brother actually spying on a former congresswoman. It’s an outrageous abuse of power … 

“And perhaps the most absurd of all, the wife of a Federal Air Marshal was labeled a ‘domestic terrorist’ after attending a political rally. She had a documented disability and no criminal record. Still, she was placed under Special Mission Coverage and tracked on commercial flights – even when accompanied by her husband, who is himself a trained federal law enforcement officer. She remained on the watchlist for more than three years. To make matters worse, this case resulted in the diversion of an Air Marshal from a high-risk international mission ...

“Liberty and security are not mutually exclusive. When government hides behind secrecy to justify surveillance of its own people, it has gone too far.”


Newly Released FBI Documents Reveal the National Extent of the Bureau’s Targeting of “Radical Traditionalist Catholics”

6/9/2025

 
In the intelligence business, “tradecraft” is the professional use of techniques, methods, and technologies to evaluate a purported threat. When an official finding is made that a threat assessment memo lacks tradecraft standards, that is a hard knock on the substance of the memo and the agent who wrote it.
 
Thanks to the efforts of Sen. Chuck Grassley (R-IA) and the forthcoming response from FBI Director Kash Patel, we now know that the infamous memo from the Richmond, Virginia, field office targeting “radical traditional Catholics” was riddled with conceptual errors and sloppy assumptions. In the FBI’s own judgment, it showed poor tradecraft. Worse, this assessment of traditional Catholics was rooted in smears from the Southern Poverty Law Center (SPLC), which Sen. Grassley correctly calls “thoroughly discredited and biased.”
 
Contrary to dismissive statements from former FBI Director Christopher Wray, this memo wasn’t the product of one field office. In preparing it, the Richmond, Virginia, field office consulted with Bureau offices in Louisville, Portland, and Milwaukee to paint Catholics who adhere to “conservative family values/roles” as being as dangerous as Islamist jihadists. The documents Sen. Grassley released also show that there were similar efforts in recent years in Los Angeles and Indianapolis.
 
This memo was not a mere thought experiment. It was a predicate for surveillance. Among the activities we know about that resulted from this memo were attempts to develop a priest and a choir director into FBI informants on parishioners.
 
Sen. Grassley also produced a memo from Tonya Ugoretz, FBI Assistant Director, Directorate of Intelligence, acknowledging that the SPLC’s list of hate groups – and its lack of explanation for the threshold at which it slaps that label on organizations and people – went unexamined in the Richmond memo. Yet the original memo from the Richmond field office treated the SPLC as a trustworthy source in asserting that there would be a “likely increase” in threats from “radical traditional Catholics” in combination with “racially and ethnically-motivated violent extremism.”
 
Another memo produced by Sen. Grassley reveals the conclusion of the FBI’s Directorate of Intelligence: “The SPLC has a history of having to issue apologies and retract groups and individuals they have identified as being extremist or hate groups.” Now Sen. Grassley and Sen. James Lankford (R-OK) are appealing to the FBI to direct field offices not to rely on the characterizations of the SPLC.
 
This whole episode should serve as a reminder that merely opening an investigation of a religious group for its First Amendment-protected speech is a punishment in itself, at best violating practitioners’ privacy; at worst, incurring huge legal costs and anxiety.
 
Sen. Grassley deserves the gratitude of the surveillance-reform community for bringing to light the extent to which the FBI allowed America’s culture wars to become a predicate for suspicion of law-abiding Americans.


Big Brother Has A New Name: Executive Order 14243

6/5/2025

 
HBO’s hit series Westworld wasn’t actually about replicating the old West, but a cautionary tale about the new frontier of artificial intelligence.

It didn’t end well. For the humans, that is. The third season’s big reveal was a sinister-looking AI sphere the size of a building, called Rehoboam. It was shaped like a globe for a very good reason – it determined the destinies of every person in the world. It predicted and manipulated human behavior and life paths by analyzing massive amounts of personal data – effectively controlling society by assigning roles, careers, and even relationships to people, all in the name of preserving order.

The American government – yes, you read that correctly – America, not China, is plotting to build its own version of Rehoboam. Its brain trust will be Palantir, the AI power player recently called out in the Daily Beast with the headline, “The Most Terrifying Company in America Is Probably One You’ve Never Heard Of.”

In March of this year, President Trump issued Executive Order 14243: “Stopping Waste, Fraud, and Abuse by Eliminating Information Silos.” The outcome will be a single database containing complete electronic profiles of every soul in the United States. And all of it is likely to be powered by Palantir’s impenetrable, proprietary AI algorithms.

Reason got to the heart of what’s at stake: an AI database on such a massive scale is only nominally about current issues such as tracking illegal immigrants. It’s really about the government’s ability to profile anyone, anytime, for any purpose.

With a billion dollars in current federal contracts across multiple agencies, Palantir is currently in talks with Social Security and the IRS. Add that to existing agreements with the Departments of Defense, Health and Human Services, Homeland Security, and others. Add to that the Biden administration’s previous contract with Palantir to assist the CDC with vaccine distribution during the pandemic.

While the primary arguments in favor of such an Orwellian construct are commendable-sounding goals like a one-stop shop for efficiency, PPSA and our pro-privacy allies find such thinking – at best – appallingly naïve.

And at worst? There’s an applicable aphorism here: “This is a bad idea because it’s obviously a bad idea.” Let’s not kid ourselves – this is the desire for control laid bare, and its results will not be efficiency, but surveillance and manipulation. It makes sense for Treasury to know your tax status or State to know your citizenship status. But a governmentwide database, accessible without a warrant by innumerable government agents, is potentially the death knell for privacy and the antithesis of freedom.

Think of all the government already knows about you, your family, and friends across multiple federal databases. All this data is about to be mobilized into one single, easily searchable database, containing everything from disability status and Social Security payments to personal bank account numbers and student debt records to health history and tax filings – plus other innumerable and deeply personal datapoints ad infinitum.

Simply put, this database will put together enough information to assemble personal dossiers on every American.

It is bad enough to think any U.S. government employee in any agency will have access to all of your data in one central platform. But at least those individuals would, in theory, be authorized for such access. Not so the Russian and Chinese cyberhackers who’ve already demonstrated the ability to lift U.S. databases in toto.

If that ever happens with this database, it will truly become a matter of one-stop shopping.


PPSA Files Only Amicus in William Case v. State of Montana

6/3/2025

 

How Police “Emergency” Entries into Homes Will Lead to “Emergency” Entry into Phones

The U.S. Supreme Court this week granted a petition for review in what will be the first case that the Court has agreed to hear addressing the scope of the Fourth Amendment’s warrant requirement since 2021. The case seeks clarity on whether the so-called “emergency-aid” exception to the Fourth Amendment requires police to have probable cause that an emergency is ongoing.

After police officers learned that William Case of Montana had threatened suicide, they entered his home without a warrant and seized evidence later used to convict him of a felony. Because the officers “were going in to assist him,” they felt unrestrained by the Fourth Amendment’s warrant requirement even though they did not actually believe that he was in any immediate danger since he was attempting to commit suicide at the hands of the police.

The Court had not reaffirmed the sanctity of the home since Caniglia v. Strom (2021), which found that allowing warrantless entry into the home for community caretaking – duties beyond law enforcement or keeping the peace – would be completely at odds with the privacy expectations and demands of the Framers.

PPSA, which filed the only amicus brief in William Case v. State of Montana, informed the Court that if now upheld, such warrantless intrusion would inevitably lead to warrantless inspection of the very personal information on Americans’ smartphones and other digital devices.

In our brief, PPSA warned the Court of the “diluting effect such a low bar for emergency aid searches” would cause in other contexts – especially regarding digital devices. PPSA told the Court:

“Such devices hold vast amounts of personal information that, historically, would only have been found in the home. Lowering the burden of proof required to justify the warrantless search of the place the Constitution protects most robustly would lead law enforcement and the courts to dilute protections for other, less historically safeguarded areas, such as electronic devices, which would be devastating to the privacy of Americans …

“If the government may enter the home without a warrant based only on a reasonable belief, far short of probable cause, that an emergency exists, the government may treat electronic sources of information the same way, posing an even greater threat to privacy and the ultimate integrity of the Fourth Amendment. The insidious branding almost writes itself: ‘Big Brother’ may be ‘watching you,’ but it’s for your own good!”

PPSA’s brief also made clear the long history of elevated protection of the home in both American law and English common law. By the 17th century, it was established law that agents of the Crown were permitted to intrude on the home only in a narrow set of extreme circumstances, and only when supported by strong evidence of an emergency corresponding to at least probable cause. PPSA wrote that if the new emergency standard is allowed:

“Seemingly benevolent searches would then become an engine for criminal prosecutions even though no warrant was ever obtained, and no probable cause ever existed. The emergency-aid exception would thus become a license for the government to discover criminal activity that – in all other circumstances – would only have been discoverable through a warrant supported by probable cause.”

In Caniglia, the Court unanimously restricted the community-caretaking exception to the Fourth Amendment. PPSA will report back when the Court holds oral arguments on the emergency-aid exception in Case v. Montana.


Big Brother in the Big Easy

5/26/2025

 
If we were writing a techno-thriller set in modern-day New Orleans, we’d use the catchy title above and include these basic plot points – all of them real:
  • A private nonprofit is selling AI-powered facial recognition technology capable of analyzing faces in real time. It uses powerful hardware and software made by a Chinese company called Dahua, banned by the Federal Communications Commission. More than 200 AI-powered cameras are spread around areas of the city considered high crime. The nonprofit is the brainchild of a former NOPD officer who says he built the database using 30,000 faces from mugshots and other publicly available records. But with no transparency or audits, the true nature of the database and its algorithms remain opaque.

  • The cameras are owned by individuals and businesses in addition to the nonprofit, which subsidizes the cost. As a private network, it operates outside the realm of public accountability. The nonprofit operates under the innocuous title “Project NOLA.” It’s funded by donations and other private sources.

  • Perhaps sensing an opportunity to bypass legal requirements for reporting and oversight, the New Orleans PD engages Project NOLA. No city contract. No fees. No legal reviews. In theory at least, Project NOLA does all the work, and the police are simply informed (although they can request footage and ask Project NOLA to look for someone).

  • The system is fast and sophisticated, even capable of handling low-light conditions and poor camera angles (at up to 700 feet away from a target). It is effectively a real-time general surveillance tool, scanning faces on streets for any matches in its database. If it finds one, officers immediately receive alerts via an app. If someone’s face isn’t already in the database, Project NOLA can upload an image and recorded feeds can be searched for the past 30 days, retracing one’s movements.

  • The project runs for two years before the Washington Post exposes the operation through records requests (aided by the fact that Project NOLA’s owner would sometimes post on Facebook). Police make dozens of arrests in that time, but because Project NOLA is a private operation, there is no way to know what other steps (if any) were taken in pursuit of due process, nor is there any data on potential misidentifications.

  • The entire arrangement appears to run deeply afoul of a New Orleans city ordinance limiting use of facial recognition software to cases involving violent crime. It also completely bypassed the required use of the state’s crime investigation “fusion” center (so named because various law enforcement agencies can collaborate), where experts have to agree that an image matches a potential suspect.

The central crisis of our thriller will surely involve innocent citizens caught up in a dragnet of unbridled police authority, the thwarting of civilian oversight, and a complete disregard for constitutional rights.
 
And the dénouement? We hope it involves NOPD Superintendent Anne Kirkpatrick stepping up and doing what she told the Washington Post: “We’re going to do what the ordinance says and the policies say, and if we find that we’re outside of those things, we’re going to stop it, correct it and get within the boundaries of the ordinance.”
 
Meanwhile, next time you’re on Bourbon Street, wear a Star Wars style cloak that covers your face. And be careful what “establishments” you frequent.


The FBI's Concerning Move on Surveillance Accountability

5/25/2025

 
The FBI’s recent shuttering of its Office of Internal Auditing – a unit formed to oversee compliance with surveillance protocols under Section 702 of the Foreign Intelligence Surveillance Act – should raise alarms in Congress. This office was created in 2020 in response to significant and well-documented abuses of surveillance authority, including improper queries of Americans’ data without warrants. Now, amidst broader structural reorganization, its dissolution risks dismantling a key internal check just as the program it was meant to monitor is up for reauthorization.
 
It might just be a bureaucratic reshuffling. Yet the unit’s functions are being absorbed into the inspection division – a body also tasked with policing agent misconduct and shootings – without clear evidence that the rigorous, daily compliance activities once prioritized will be maintained. Let us hope this move preserves oversight rather than gutting it.
 
Congress should take this as a wake-up call. Internal guardrails inside the intelligence agencies, no matter how earnestly established, are an inherently unreliable substitute for oversight. Agencies like the FBI require external accountability to ensure their immense powers are not misused. The very creation of the auditing office was a tacit admission that prior oversight had failed. That failure was documented in audits revealing widespread misuse of Section 702 queries against Americans, including members of Congress and political protestors.
 
As we wrote in a different context, the Department of Justice recently demonstrated how easily internal policy can be sidestepped. In 2023, the FBI raided the home of journalist Tim Burke, seizing his devices and potentially sensitive journalistic material. This raid occurred despite the DOJ’s then-year-old News Media Policy, which forbade such seizures unless under extreme and clearly justified circumstances.
 
Congress must recognize that internal oversight mechanisms are not enough. What’s needed now is sustained, bipartisan legislative oversight that ensures intelligence agencies operate within the bounds set by law. When compliance offices can be erased with the stroke of a pen and transparency rules are brushed aside without consequence, the only reliable safeguard is direct accountability to the public through its elected representatives.
 
The shuttering of the auditing office, like the mishandling of DOJ media guidelines, highlights an urgent need for reform – not just more promises of internal reform, but structural changes that restore public trust and protect individual rights.


“Incredibly Juicy Targets” – Sen. Wyden Reveals Surveillance of Senate Phone Lines

5/24/2025

 
Sen. Ron Wyden (D-OR) informed his Senate colleagues Wednesday that “until recently, Senators have been kept in the dark about executive branch surveillance of Senate phones.”
 
AT&T, Verizon, and T-Mobile failed to meet contractual obligations to disclose such surveillance to the Senate Sergeant at Arms. Sen. Wyden wrote in a letter to his colleagues that their campaign and personal phones, on which official business can be conducted under Senate rules, are not covered by this provision. He called these phones “incredibly juicy targets.”
 
Sen. Wyden recommended that his colleagues switch their campaign and personal phones to providers willing to make such disclosures.
 
The purpose of such surveillance might be to protect senators from cyber threats and foreign intelligence, but this is far from clear.
 
For example, Sen. Wyden outlined two breaches that occurred last year, one foreign and one domestic. In the Salt Typhoon hack, Chinese intelligence intercepted the communications of specific senators and their senior staff. The other breach came from the U.S. Department of Justice, which conducted a leak investigation by collecting phone records of Senate staff, including national security advisors to leadership, as well as staff from the Intelligence and Judiciary Committees. Democrats and Republicans were targeted in equal numbers. Sen. Wyden wrote:
 
“Together, these incidents highlight the vulnerability of Senate communications to foreign adversaries, but also to surveillance by federal, state, and local law enforcement. Executive branch surveillance poses a significant threat to the Senate’s independence and the foundational principle of separation of powers … This kind of unchecked surveillance can chill critical oversight activities, undermine confidential communications essential for legislative deliberations, and ultimately erode the legislative branch’s co-equal status.”
 
Perhaps we have, as Elvis sang, suspicious minds. But we find it odd that three major telecoms would all fail to meet their disclosure obligations in a contract with the U.S. Senate unless they were encouraged to do so.


Watching the Watchers: Keeping Your Thoughts Private in the Age of Pervasive Surveillance

5/22/2025

 
Writer Alex Klaushofer reports on a perfectly ordinary development in surveillance – the installation of cameras in the UK’s Sainsbury’s grocery chain to ensure that every customer checks every item.
 
This prompted Klaushofer to think back to her experience in Albania, which is still dealing with the psychological toll of its communist past when one in three people in the capital worked for the secret police. She writes in the British Spectator:
 
“The poverty and under-development of Albania thirty years after the collapse of the regime were obvious to me. But I was puzzled by the behavior of some of the Albanians I got to know; there was a guardedness and often an indirect way of talking. Then Ana Stakaj, women’s program manager for the Mary Ward Loreto Foundation, explained the psychological effects of surveillance and it started to make sense.
 
“‘Fear, and poverty and isolation closed the mind, causing it to go in a circle and malfunction,’ she told me. ‘In communism, people were forced even to spy on their brother, and the wife on their husband. So they learned to keep things private and secret, especially thoughts: your thoughts are always secret.’
 
“I wonder whether we’ve learnt the lessons offered by the authoritarian regimes of the last century: or the living lesson provided by China’s tech-authoritarianism. Do we really understand where using all this new technology so freely is taking us?”


Is Your AI Therapist a Mole for the Surveillance State?

5/16/2025

 

“It’s Delusional Not to be Paranoid”

With few exceptions, conversations with mental health professionals are protected as privileged (and therefore private) communication.
 
Unless your therapist is a chatbot. In that case, conversations are no more sacrosanct than a web search or any other AI chat log; with a warrant, law enforcement can access them for specific investigations. And of course, agencies like the NSA don’t even feel compelled to bother with the warrant part.
 
And if you think you’re protected by encryption, think again, says Adi Robertson in The Verge. Chatting with friends using encrypted apps is one thing. Chatting with an AI on a major platform doesn’t protect you from algorithms that are designed to alert the company to sensitive topics.
 
In the current age of endless fascination with AI, asks Robertson, what would prevent any government agency from redefining what constitutes “sensitive” based on politics alone? Broach the wrong topics with your chatbot therapist and you might discover that someone has leaked your conversation to social media for public shaming. Or perhaps a 4 a.m. knock on the door with a battering ram by the FBI.
 
Chatbots aren’t truly private any more than email is. Recall the conventional wisdom from the 1990s that advised people to think of electronic communication as the equivalent of a postcard. If you wouldn’t want to write something on a postcard for fear of it being discovered, then it shouldn’t go in an email – or in this case, a chat. We would all do well to heed Adi Robertson’s admonition that when it comes to privacy, we have an alarming level of learned helplessness.
 
“The private and personal nature of chatbots makes them a massive, emerging privacy threat … At a certain point, it’s delusional not to be paranoid.”
 
But there’s another key difference between AI therapists and carbon-based ones: AI therapists aren’t real. They are merely a way for profit-driven companies to learn more about us. Yes, Virginia, they’re in it for the money. To quote Zuckerberg himself, “As the personalization loop kicks in and the AI starts to get to know you better and better, that will just be really compelling.” And anyone who thinks compelling isn’t code for profitable in that sentence should consider getting a therapist.
 
A real one.


Watching the Watchers: Why Rep. Luna Wants to Repeal the Patriot Act

5/13/2025

 
U.S. Congresswoman-elect Anna Paulina Luna speaking with attendees at the 2022 AmericaFest at the Phoenix Convention Center in Phoenix, Arizona. Photo credit: Greg Skidmore
Rep. Anna Paulina Luna (R-FL) recently introduced the American Privacy Restoration Act, which would fully repeal the USA Patriot Act, the surveillance law hurriedly passed in 2001 shortly after the 9/11 attacks. Rep. Luna declared:
 
“For over two decades, rogue actors within our U.S. intelligence agencies have used the Patriot Act to create the most sophisticated, unaccountable surveillance apparatus in the Western world. My legislation will strip the deep state of these tools and protect every American’s Fourth Amendment right against unreasonable searches and seizures. It’s past time to rein in our intelligence agencies and restore the right to privacy. Anyone trying to convince you otherwise is using ‘security’ as an excuse to erode your freedom.”
 
What is so wrong about the Patriot Act? Judge Andrew Napolitano spells it out in a recent piece in The Washington Times. Judge Napolitano writes:
 
“Among the lesser-known holes in the Constitution cut by the Patriot Act in 2001 was the destruction of the ‘wall’ between federal law enforcement and federal spies. The wall was erected in the Foreign Intelligence Surveillance Act of 1978, which statutorily limited all federal domestic spying to that which the Foreign Intelligence Surveillance Court authorized.

“The wall was intended to prevent law enforcement from accessing and using data gathered by America’s domestic spying agencies …
 
“In the last year of the Biden Administration, the FBI admitted that during the first Trump Administration, it intentionally used the CIA and the National Security Agency to spy on Americans about whom the FBI was interested but as to whom it had neither probable cause of crime nor even articulable suspicion of criminal behavior …”
 
Even if Rep. Luna’s bill to repeal the Patriot Act does not pass, reform is still possible. Judge Napolitano writes:
 
“With a phone call, President Trump, who was personally victimized by this domestic spying 10 years ago, can stop all domestic spying without search warrants. He can re-erect the wall between spying and law enforcement.”


Meta’s AI Chatbot a New Step Toward a Surveillance Society

5/13/2025

 
We’re not surprised – and we are sure you are not either – to learn that new tech rollouts from Meta and other Big Tech companies voraciously consume our personal data. This is especially true with new services that rely on artificial intelligence. Unlike traditional software programs, AI requires data – lots and lots of our personal data – to continuously learn and improve.
 
If the use of your data bothers you – and it should – then it’s time to wise up and opt out to the extent possible. Of course, opting out is becoming increasingly difficult to do now that Meta has launched its own AI chatbot to accompany its third-generation smart glasses. Based on reporting from Gizmodo and the Washington Post, here’s what we know so far:

  • Users no longer have the ability to keep voice recordings from being stored on Meta’s servers, where they “may be used to improve AI.”
  • If you don’t want something stored and used by Meta, you have to manually delete it.
  • Undeleted recordings are kept by Meta for one year before expiring.
  • The smart glasses’ camera is always on unless you manually disable the “Hey Meta” feature.
  • If you somehow manage to save photos and videos captured by your smart glasses only on your phone’s camera roll, then those won’t be uploaded and used for training.
  • By default, Meta’s AI app remembers and stores everything you say in a “Memory” file, so that it can learn more about you (and feed the AI algorithms). Theoretically, the file can be located and deleted. No wonder Meta’s AI Terms of Service says, “Do not share information that you don’t want the AIs to use and retain such as account identifiers, passwords, financial information, or other sensitive information.”
  • Bonus tip: if you happen to know that someone is an Illinois or Texas resident, by using Meta’s products you’ve already implicitly agreed not to upload their image (unless you’re legally authorized to do so).

None of the tech giants is guiltless when it comes to data privacy, but Meta is increasingly the pioneer of privacy compromise. Culture and technology writer John Mac Ghlionn is concerned that Zuckerberg’s new products and policies presage a world of automatic and thoroughgoing surveillance, where we will be constantly spied on by being surrounded by people wearing VR glasses with cameras.
 
Mac Ghlionn writes:
“These glasses are not just watching the world. They are interpreting, filtering and rewriting it with the full force of Meta’s algorithms behind the lens. And if you think you’re safe just because you’re not wearing a pair, think again, because the people who wear them will inevitably point them in your direction.
“You will be captured, analyzed and logged, whether you like it or not.”
 
But in the end, unlike illicit government surveillance, most commercial sector incursions on our personal privacy are voluntary by nature. VR glasses have the potential to upend that equation.
 
Online, we can still to some degree reduce our privacy exposure in what we agree to, even if it means parsing those long, hard to understand Terms of Service. It is still your choice what to click on. So, as the Grail Knight told Indiana Jones in The Last Crusade, “Choose wisely.”
 
You should also learn to recognize Meta’s Ray-Bans and their spy eyes.


It’s Time to Enforce the TikTok Ban

5/12/2025

 
Ireland’s Data Protection Commission, acting in its official capacity as an EU privacy guardian, recently fined TikTok $600 million (€530 million) for breaching its data privacy rules. This punishment was meted out after the conclusion of a four-year investigation, so it’s a decision that was not made lightly.
 
None of this surprises us. We have previously reported on the surveillance issues related to TikTok as well as other Chinese-owned concerns. It’s naïve to think that any software of Chinese provenance isn’t being used as a data collection scheme, and equally naïve to believe that said data isn’t being shared with the Chinese government.
 
A year ago, Congress passed a law mandating that ByteDance, the Chinese parent of TikTok, divest its ownership or be banned in the United States. ByteDance could be rich beyond the dreams of avarice if it chose to sell. That it hasn’t done so simply reinforces everyone’s suspicions that the service’s real owner is primarily interested in something other than profits.
 
The bill that President Biden signed had passed the House 360-58 and the Senate 79-18. TikTok sued but the Supreme Court upheld the law in a unanimous ruling in January. It’s an astonishingly bipartisan issue in a deeply divided time. Yet in a mystifying turn of events, the current administration has twice extended the original divestment deadline (now set for June 19). “Perhaps I shouldn’t say this,” President Trump told NBC’s Kristen Welker, “but I have a little warm spot in my heart for TikTok.” Quite the switch for someone who rightly attempted to ban the service during his first term.
 
After this latest show of bad faith by TikTok, revealed by Irish regulators, President Trump should now enforce the sale – after all, it is a law, not a suggestion – and protect our citizens. It is the president’s constitutional duty to carry out the laws the American people pass through the voice of their representatives. A show of seriousness about enforcing this law would probably allow TikTok to survive in some form. Moreover, it would protect tens of millions of Americans from Chinese government surveillance.


Gov. Youngkin Adds Guardrails to Roadside Cameras

5/10/2025

 
Torn between the wishes of pro-surveillance law enforcement on one side and Fourth Amendment privacy defenders on the other, Virginia Gov. Glenn Youngkin (R) finally leaned toward the latter. Last week he signed legislation regulating and curbing the expansion of one of his state’s fastest growing niche industries – automated license plate readers.
 
This technology doesn’t just scan license plate numbers. It captures vehicle make, type, and color as well as features like stickers, bike racks, even noticeable dents. It can be used to track where we go and who we meet with, potentially compromising privacy, as well as our associational rights in politics and religion.
 
While not as robust a piece of legislation as might have been possible, it’s more than a step in the right direction. Here’s what the law does:

  • 671 cameras are enough: That’s how many Flock Safety brand cameras there are in 16 Virginia jurisdictions, with 172 in Norfolk alone. Critics of a watered-down version of the bill feared it would have allowed the addition of thousands more cameras. In 2026, Virginia’s General Assembly will have to debate the expansion issue all over again. But for now, the expansion has been stopped.

  • No more sharing: Without a warrant, subpoena, or court order, Flock Safety data gathered in Virginia now has to stay in Virginia. It can no longer be easily shared with out-of-state agencies.

  • Going public: Police agencies are now mandated to compile and publicly report key details on how the camera images and associated data are used. Not that we expect to see a brutally honest category called “Civil Rights Violations,” but remember that any public report is automatically subject to auditing (official or otherwise), so the watchdog value of such requirements is powerful.
  • Purge faster: Currently, surveillance data is stored for 30 days before being deleted (unless it’s being used as part of an active investigation). The new law slashes that by 30 percent, down to 21 days.

  • No off-label use: Collected data is now limited to specific criminal investigations as well as human trafficking, stolen vehicles, and missing persons cases. No more dragnet searches through citizens’ data without cause. This provision is a major step forward.

It should be noted that Gov. Youngkin tried to strike a compromise between the opposing camps, but none emerged. That’s a good thing for privacy rights. Even so, the law still has weaknesses. Chief among them is that the locations of Flock Safety cameras still do not have to be disclosed. (Fortunately, social and traditional media help in that regard.) And while 21 days of storage is certainly better than the original 30, we’d like to see that number come down to seven or fewer.
 
As for next year’s rematch of the “expand or not” battle, 2025 is the third year in a row that the Virginia Assembly has stymied, at least somewhat, Flock Safety’s and law enforcement’s desire to pursue mass surveillance unchecked.
 
Here’s hoping for four years of pushback in a row. We even have a slogan: “Four for the Fourth.” Okay, we don’t love it either. Feel free to send us your suggestions. Better yet, if you’re a Virginia resident, send it to your state delegate and senator.


How Police Can Use Your Car to Spy on You

5/5/2025

 
We reported in February that Texas Attorney General Ken Paxton is suing General Motors over a long-running, for-profit consumer data collection scheme it hatched together with insurance companies. Now Wired’s Dell Cameron reveals that automakers may be doing even more with your data, perhaps sharing it with law enforcement (sometimes with a proper warrant, often without one).
 
So you may be getting way more than you bargained for when you subscribe to your new vehicle’s optional services. In effect, your vehicle is spying on you by reporting your location to cell towers. The more subscription services you sign up for, the more data they collect. And in some cases, reports Wired, cars are still connecting with cell towers even after buyers decline subscriptions.
 
All of that data can easily be passed to law enforcement. There are no set standards as to who gives what to whom and when. When authorities ask companies to share pinged driver data, the answers range from “Sure! Would you like fries with that?” to “Come back with a subpoena,” to “Get a warrant.” For its part, GM now requires a court order before police can access customers’ location data. But the buck can also be passed to the cell service providers, where the protocols are equally opaque. When Wired’s Cameron asked the various parties involved what their policies were, he was frequently met with the sound of crickets.
 
Author John Mac Ghlionn sums up the state of automotive privacy: “Your car, once a symbol of independence, could soon be ratting you out to the authorities and even your insurance company.”
 
It’s probably time to update “could soon be” to “is.”
 
This technology gives police the ability to cast a wide dragnet to scoop up massive amounts of personal data, with little interference from pesky constitutional checks like the Fourth Amendment. Law enforcement agencies of all stripes claim their own compelling rights to collect and search through such data dumps to find the one or two criminals they’re looking for, needle-like, in that haystack of innocent peoples’ information. Since your driving data can be sold to data brokers, it is also likely being purchased by the FBI, IRS, and a host of other federal agencies that buy and warrantlessly inspect consumer data.
 
Just over a year ago, Sens. Ed Markey (D-MA) and Ron Wyden (D-OR) fired off a letter to the chair of the FTC to demand more clarity about this dragnet approach. Caught with their hand in the cookie jar thanks to the resulting inquiry, GM agreed to a five-year hiatus on selling driver data to consumer reporting agencies. Where that leaves us with the police, as the Wired article reports, often remains an open question.
 
In the meantime, consider adjusting your car’s privacy settings and opt outs. The more drivers who take these actions, the more clearly automakers, service providers, and law enforcement agencies will start to get the message.


A New Concern: Privately Funded License-Plate Readers in LA

5/4/2025

 
We’ve covered automated license plate reader (ALPR) software nearly 20 times in the last few years. That we are doing so again is a reminder that this invasive technology continues to proliferate.
 
In the latest twist, an affluent LA community bought its own license-plate readers, gifted them to the Police Foundation, and, with approval from the City Council and the Police Commission, handed them to the LAPD. There was a proviso – that they only be used in said well-off LA community.
 
Turns out the LAPD didn’t appreciate being told where to use ALPR tech and which brand to use. The head of the department’s Information Technology Bureau told the media that law enforcement agencies should be able to use plate reader technology as they see fit and should own and control the data collected. This seems more about turf than principle, given that the LAPD already has thousands of plate-reading cameras in use.
 
This case brings a new question to an already intense debate. Should the well-connected be able to contract with local police to indiscriminately spy on masses of drivers, looking for those “who aren’t from around here”?
 
It is concerning enough that the LAPD has already built up one of the nation’s largest ALPR networks. This is an example of how for-profit startups like Flock Safety are trying to corner the market for this technology nationwide, doing so through opaque agreements with law enforcement agencies that are impermeable to public scrutiny and oversight.
 
As with most surveillance tech, there are cases that justify its use. But these legitimate instances tend to be relatively few in number and should be executed with transparency in mind and oversight engaged. That’s a far cry from the “dragnet surveillance” approach currently in place, where the movements of millions of citizens who have done nothing wrong are tracked and stored in public and private databases for years at a time, all without a warrant or individual consent.


Biden Administration Kept “Disinformation” Dossiers on Americans

5/3/2025

 
The Biden administration’s State Department kept dossiers on Americans accused of acting as “vectors of disinformation.”

This was a side activity of the now-defunct State Department Global Engagement Center (GEC). It secretly funded a London-based NGO that pressured advertisers to adhere to a blacklist of conservative publications, including The American Spectator, Newsmax, the Federalist, the American Conservative, One America News, the Blaze, Daily Wire, RealClearPolitics, Reason, and The New York Post.

Now we know that the blacklisting went beyond publications to include prominent individuals. At least one of them, Secretary Rubio said, was a Trump official in the Cabinet room when the secretary made this announcement.
“The Department of State of the United States had set up an office to monitor the social media posts and commentary of American citizens, to identify them as ‘vectors of disinformation,’” Rubio said on Wednesday. “When we know that the best way to combat disinformation is freedom of speech and transparency.”


Jordan and Biggs Are Right – Protect Americans’ Privacy by Terminating the US-UK CLOUD Act Agreement

5/2/2025

 
Rep. Jim Jordan (R-Ohio) and Rep. Andy Biggs (R-Arizona)
It looks like the CLOUD Act might soon evaporate.

A bilateral agreement under the Clarifying Lawful Overseas Use of Data (CLOUD) Act went into effect in 2022 to facilitate the sharing of data for law enforcement purposes. In February, the news leaked that the UK’s Home Office had secretly ordered Apple to provide a backdoor to the content of all of its users, Americans included. The order would effectively break Apple’s Advanced Data Protection service, which uses end-to-end encryption to ensure that only the account user can access stored data.

In response, Rep. Jim Jordan, Chairman of the House Judiciary Committee, and Rep. Andy Biggs, Chairman of the Subcommittee on Crime and Federal Government Surveillance, have fired off a letter to Attorney General Pam Bondi asking her to terminate the agreement with the UK under the CLOUD Act.

They understand the UK order would be a privacy catastrophe for Apple users around the world. Encryption protects dissidents, women and children hiding from abusive relationships, not to mention the proprietary secrets of innumerable businesses and people who simply value their privacy.

Under the terms of the agreement, the two parties can renew it every five years. Just after the 2024 election, however, then-Attorney General Merrick Garland preemptively renewed the agreement to discourage the incoming Trump Administration from canceling or changing it.

These two leading House Republicans told Bondi that the UK order “exposes all Apple users, including American citizens, to unnecessary surveillance and could enable foreign adversaries and nefarious actors to infiltrate such a backdoor.”

Or, as Jordan and Biggs noted, President Trump told UK Prime Minister Keir Starmer that the order was like “something that you hear about with China.”

Perhaps fearing a consumer backlash in the United Kingdom, the British government made a bid to keep Apple’s appeal of the order in a secret court session, claiming that even discussing the “bare bones” of the case would harm national security. The Investigatory Powers Tribunal rejected the government’s stance, guaranteeing at least some openness in the court’s deliberations.

But we cannot count on the British government to get it right for Americans. For that reason, Chairmen Jordan and Biggs began heaving rhetorical chests of tea into the harbor. They wrote:

“Accordingly, because the UK’s order could expose U.S. citizens to surveillance and enable foreign adversaries and nefarious actors to gain access to encrypted data, we respectfully urge you to terminate the Agreement and renegotiate it to adequately protect American citizens from foreign government surveillance.”

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

How Facial Recognition Technology Criminalizes Peaceful Protest

4/29/2025

 
​Today, Hungary is ostensibly free, a democratic state in a union of democratic states. But something is rotten in Budapest. Prime Minister Viktor Orbán has been steadily fashioning a monoculture since his return to power 15 years ago, running afoul of European Union policies and democratic norms along the way. The most recent infraction is multifaceted, and it involves the use of facial recognition to target peaceful protesters for criminal prosecution.
 
In March, Orbán’s subservient parliament railroaded the opposition and banned public gatherings of gay rights activists. With the stroke of a pen, Pride gatherings and related pro-gay rights protests were suddenly illegal. A month later, these crackdowns were enshrined in the country’s constitution (showing why America’s founders were wise to make the U.S. Constitution so notoriously difficult to amend).
 
As in Putin’s Russia, the justification for this crackdown is that it’s necessary to protect children from “sexual propaganda” – even though we are talking about peaceful protests conducted by adults in city centers. However you feel about Pride parades, most Hungary watchers believe the prime minister is whipping up a cultural scapegoat to rally his base in advance of next year’s elections.
 
Hungary represents a turning point in the rise of the modern surveillance state in a developed country. Beyond the infringement of basic rights, it includes a chilling new embrace of facial recognition technology – specifically, to identify Pride participants (now officially designated as criminals) or likewise pick out faces from among the tens of thousands who are sure to illegally protest these new measures. At the moment, the punishment for such unconstitutional behavior is a fine of up to €500. Organizers, however, can be imprisoned for up to a year. But can even more draconian punishments be far behind?
 
If you’re wondering how Hungary’s democratic partners in the European Union are reacting to all of this, the answer is not well. And it’s also raising important questions about the efficacy of the EU’s AI regulations in general (a debate about loopholes and guardrails that merits a separate discussion).
 
For now, though, Americans should heed the cautionary warning in Hungary’s use of facial recognition software. Future uses of the technology here could target the leaders of a MAGA or a Black Lives Matter protest. Facial recognition scans can pinpoint individuals, spotting a single face in a crowd. That gives regimes the ability to come back later to arrest and persecute on a scale only Orwell could have conceived. All of this is enhanced by the unholy combination of data analytics, advanced algorithms, unprecedented computing power, and now generative AI.
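At its core, the matching step is simple: each face image is reduced to a numeric embedding, and a face in the crowd is flagged when its embedding sits close enough to one on a watchlist. A minimal sketch (the three-number embeddings, names, and threshold are invented for illustration; real systems use high-dimensional vectors produced by a trained neural network):

```python
from math import sqrt

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings; a real system derives these from face images.
watchlist = {"protest_organizer": [0.90, 0.10, 0.30]}
crowd_photos = {
    "camera_face_a": [0.10, 0.90, 0.20],
    "camera_face_b": [0.88, 0.12, 0.31],
}

THRESHOLD = 0.99  # similarity above this counts as a match
for face_id, emb in crowd_photos.items():
    for name, target in watchlist.items():
        if cosine(emb, target) > THRESHOLD:
            print(f"{face_id} matches {name}")
```

Run against an archive of protest footage, the same loop turns a one-time scan into the retroactive dragnet described above.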
 
The uncomfortable truth of the modern era is inescapable: The development and deployment of modern surveillance has gone hand in hand with modern authoritarianism, from Russia to China and Iran. Just imagine what might have happened if J. Edgar Hoover had access to facial recognition tech and AI. We imagine it would have looked like Orbán’s dystopian democracy.
 
Budapest Pride is not backing down, celebrating its 30th anniversary in a public demonstration in June. The world will be watching to see how this technology is used.


AI and Data Consolidation Is Supercharging Surveillance

4/28/2025

 
​In Star Wars lore, it was the democratic, peace-loving Republic that built the first fleet of Star Destroyers. But the fleet was quickly repurposed for evil after the Republic fell. What was once a defensive force for good became a heavy-handed tool of occupation and terror.
 
In a galaxy closer to home, imagine the development of a fully integrated civilian computer system designed to help a technological democracy of 345 million people operate smoothly. In the early 21st century, successive governments on both the right and left embraced the idea that “data is the new oil” and began the process of digitizing records and computerizing analog processes. Generative artificial intelligence, vast increases in computing power, and the rise of unregulated data brokers made it possible to assemble a single database, readily available to federal agencies, containing the personal information and history of every citizen.
 
At first, the system worked as advertised and made life easier for everyone – streamlining tax filing, improving public service access, facilitating healthcare management, etc. But sufficient guardrails were never established, allowing the repurposing of the system into a powerful surveillance tool and mechanism of control.
 
This scenario is now on the brink of becoming historical fact rather than cinematic fiction.
 
“Data collected under the banner of care could be mined for evidence to justify placing someone under surveillance,” warns Indiana University’s Nicole Bennett in a recent editorial for The Conversation. And if you like your social critiques with a side of irony, the Justice Department agreed with her in its December 2024 Artificial Intelligence and Criminal Justice report. It concluded that the AI revolution represents a two-edged sword. While potentially a driver of valuable new tools, its use must be carefully governed.
 
The Justice Department said that AI data management must be “grounded in enduring values. Indeed, AI governance in this space must account for civil rights and civil liberties just as much as technical considerations such as data quality and data security.”
 
Yet the government is proceeding at breakneck speed to consolidate disparate databases and supercharge federal agencies with new and largely opaque AI tools, often acquired through proprietary corporate partnerships that currently operate outside the bounds of public scrutiny.
 
Anthony Kimery of Biometric Update has described the shift as a new “arms race” and fears that it augurs “more than a technological transformation. It is a structural reconfiguration of power, where surveillance becomes ambient, discretion becomes algorithmic, and accountability becomes elusive.”
 
The Galactic Republic had the Force to help it eventually set things right. We have the Fourth – the Fourth Amendment, that is – and the rest of the Bill of Rights. But whether these analog bulwarks will hold in the digital age remains to be seen. To quote Kimery again, we are “a society on the brink of digital authoritarianism,” where “democratic values risk being redefined by the logic of surveillance.”


What the Leaking of 21 Million Employee Screenshots Tells Us About the Threat of Worker Surveillance Apps

4/28/2025

 
​In the late 19th century, American business embraced the management philosophy of Frederick Winslow Taylor, author of The Principles of Scientific Management. He wrote: “In the past the man has been first; in the future the system must be first.”

So managers put their factory systems first by standardizing processes and performing time and motion studies with a stopwatch to measure the efficiency of workers’ every action. Nineteenth century workers, who were never first, became last.

Now intrusive surveillance technology is bringing this management philosophy to the knowledge economy. This entails not just the application of reductionism to information work, but the gross violation of employee privacy.

This was brought home when Paulina Okunyté of Cybernews reported on Thursday that WorkComposer, an employee surveillance app that measures productivity by logging workers’ activity and capturing regular screenshots of their screens, left more than 21 million images exposed in an unsecured Amazon S3 bucket.

WorkComposer also logs keystrokes and how much time an employee spends on an app. As a result, usernames and passwords that are visible in screenshots might enable the hijacking of accounts and breaches of businesses around the world.

“Emails, documents, and projects meant for internal eyes only are now fair game for anyone with an internet connection,” Okunyté writes.

With 21 million images to work with, cyberthieves and phishing scammers have plenty of material with which to victimize the people who work for companies that use WorkComposer software.

This incident exposes the blinkered philosophy behind employee surveillance. As we have reported, there are measurable psychological costs – and likely productivity costs – when people know that they are being constantly watched. Vanessa Taylor of Gizmodo reports that according to a 2023 study by the American Psychological Association, 56 percent of digitally surveilled workers feel tense or stressed at work compared to 40 percent of those who are not.

We also question the usefulness of such pervasive tracking and surveillance. Efficiency is a commendable goal. Surely there are broader and less intrusive ways to measure employee productivity. Such close monitoring runs the risk of focusing workers on meeting the metrics instead of bringing creativity or bursts of productivity to their jobs. Allowing people to take a break every hour to listen to a song on earbuds might, in the long run, make for better results and greater efficiency. 
​
Just don’t make a funny face or sing along; the whole world might see you.


Warrants and the “Wild West” of Digital Surveillance

4/21/2025

 

Rep. Knott: “It’s Amazing to Me That There’s So Much Resistance to the Warrant Requirement”

​Perhaps you had other things to do during last week’s House Judiciary hearing, “A Continued Pattern of Government Surveillance of U.S. Citizens.” So here’s a summary: The Judiciary’s Subcommittee on Crime and Federal Government Surveillance brought together witnesses from across the political spectrum (including PPSA’s own Gene Schaerr) to identify potential solutions to the ongoing (and growing) problem of Fourth Amendment abuse by government entities.
 
At the heart of the discussion was the need to apply probable cause warrants – the key requirement of the Constitution’s Fourth Amendment – to the practice of federal agencies freely accessing our international communications, as well as our personal, digital data.
 
Witnesses effectively rebutted the fearmongering campaign by the intelligence community to convince us that a warrant requirement for federal surveillance of American citizens is too onerous, and too dangerous to entertain. But the most effective remarks came from a Member of the committee.
 
Rep. Brad Knott (R-NC), a former U.S. Attorney for the Eastern District of North Carolina, addressed the issue of warrant requirements with the assurance of a former federal prosecutor. He spoke of what it took for him to get permission to “flip the switch” on some of the most “intrusive” forms of wiretapping American citizens.
 
“So you have to demonstrate necessity,” Rep. Knott said. “You have to demonstrate why other techniques are futile … the rigor we had to exercise was very important … it kept the internal investigators accountable.”
 
Rep. Knott said the warrant process made sure investigations were “open and honest.” Investigators knew “that their actions were going to be subject to pen and paper. They were going to be subject to judicial review … and opposing counsel.”
 
Given the clarity and accountability added by warrants, Rep. Knott added:
 
“It’s amazing to me that there’s so much resistance to the warrant requirement alone.”
 
Throughout the 90-minute hearing, Members and witnesses stressed one thing:
 
The countdown clock is ticking on what may be our last, best chance at meaningful reform – including the adoption of a warrant requirement for U.S. citizens when Section 702 of the Foreign Intelligence Surveillance Act (FISA) comes up for renewal next year (it’s due to sunset in April 2026).
 
Section 702 is the legal authority that allows federal intelligence agencies to spy on foreign targets on foreign soil. But it also “incidentally” picks up the international communications of Americans, which can then be warrantlessly inspected by the FBI and other agencies.
 
Section 702 got a lot of airtime at the hearing and was frequently linked with the words “loophole” and “backdoor.” The Reforming Intelligence and Securing America Act (RISAA) of 2024 attempted to fix Section 702 – and did add some useful reforms – but it also left a loophole through which the FBI and other agencies attempt to justify warrantless backdoor searches of Americans’ private communications.
 
For the FBI in particular, this has become the go-to means to warrantlessly develop domestic leads.
 
“Three million times they did [backdoor searches] in 2021,” lamented Judiciary Chairman Jim Jordan (R-OH). Or, as James Czerniawski of Americans for Prosperity put it: “Time and time again we have caught the intelligence community with their hand in the constitutional cookie jar.”
 
Members and witnesses alike also addressed a privacy crisis even greater than Section 702 – the routine purchases made by federal agencies of Americans’ private digital information from data brokers.
 
ACLU’s Kia Hamadanchy reminded the subcommittee that the kind of data that can be bought and sold would be, in the words of a former CIA deputy director, “top secret” sensitive if gathered by traditional intelligence means. It would have to be kept “in a safe,” not in a database.
 
The hearing also got at what many consider the underlying issue driving the new era of surveillance. Namely, the acknowledgment that we increasingly live not in one world, but two – our physical reality and its digital twin. But unlike in the physical world, the rules for how the Fourth Amendment applies in the digital context are largely unwritten. In other words, said Rep. Andy Biggs (R-AZ), it’s the “Wild West.”
 
And Ranking Member Rep. Jamie Raskin (D-MD) added, “New technologies make it a lot harder to rein in government intrusion in the lives of the people.” The unwitting result? “We live in a modern, albeit consensual, surveillance state,” declared Phil Kiko, principal at Williams & Jensen and former Judiciary counsel.
 
With any luck, things might be different a year from now when FISA is up for renewal, thanks to a U.S. District Court ruling in January.
 
“To countenance this practice,” of warrantless surveillance, wrote the court, “would convert Section 702 into … a tool for law enforcement to run ‘backdoor searches’ that circumvent the Fourth Amendment.”
 
That legal precedent didn’t exist when the last Congress debated FISA reforms. Emboldened by this landmark decision, Reps. Jordan and Raskin are pledging to once again work together in a bipartisan spirit to win this fight. Their continuing partnership captures the spirit of the subcommittee’s hearing and should give reformers a renewed sense of hope.


