Project for Privacy and Surveillance Accountability (PPSA)
  • Issues
  • Solutions
  • SCORECARD
    • Congressional Scorecard Rubric
  • News
  • About
  • TAKE ACTION
    • Section 702 Reform
    • PRESS Act
    • DONATE

 NEWS & UPDATES

Is Your AI Therapist a Mole for the Surveillance State?

5/16/2025

 

“It’s Delusional Not to be Paranoid”

With few exceptions, conversations with mental health professionals are protected as privileged (and therefore private) communication.
 
Unless your therapist is a chatbot. In that case, conversations are no more sacrosanct than a web search or any other AI chat log; with a warrant, law enforcement can access them for specific investigations. And of course, agencies like the NSA don’t even feel compelled to bother with the warrant part.
 
And if you think you’re protected by encryption, think again, says Adi Robertson in The Verge. Chatting with friends using encrypted apps is one thing; chatting with an AI on a major platform doesn’t protect you from algorithms designed to alert the company to sensitive topics.
 
In the current age of endless fascination with AI, asks Robertson, what would prevent any government agency from redefining what constitutes “sensitive” based on politics alone? Broach the wrong topics with your chatbot therapist and you might discover that someone has leaked your conversation to social media for public shaming. Or perhaps a 4 a.m. knock on the door with a battering ram by the FBI.
 
Chatbots aren’t truly private any more than email is. Recall the conventional wisdom from the 1990s that advised people to think of electronic communication as the equivalent of a postcard. If you wouldn’t want to write something on a postcard for fear of it being discovered, then it shouldn’t go in an email – or in this case, a chat. We would all do well to heed Adi Robertson’s admonition that when it comes to privacy, we have an alarming level of learned helplessness.
 
“The private and personal nature of chatbots makes them a massive, emerging privacy threat … At a certain point, it’s delusional not to be paranoid.”
 
But there’s another key difference between AI therapists and carbon-based ones: AI therapists aren’t real. They are merely a way for profit-driven companies to learn more about us. Yes, Virginia, they’re in it for the money. To quote Zuckerberg himself, “As the personalization loop kicks in and the AI starts to get to know you better and better, that will just be really compelling.” And anyone who thinks compelling isn’t code for profitable in that sentence should consider getting a therapist.
 
A real one.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

Montana Leads the Way with Two New Data Privacy Bills

5/15/2025

 
If you want to see what leadership looks like when it comes to protecting data privacy, head to Big Sky Country. Montana Gov. Greg Gianforte just signed a bill limiting the state’s use of personal electronic data. That makes Montana the first state to pass a version of the federal bill known as the Fourth Amendment Is Not for Sale Act.
 
The chief provisions of the new Montana law include:
  • Government entities are prohibited from purchasing personal data without a warrant or subpoena issued by a court.

  • Authorities may not access data from personal electronic devices unless the owner consents, a court agrees that there is probable cause, or the situation is a legitimate emergency.

  • Courts must hold as inadmissible improperly obtained personal data.

  • Service providers cannot be forced to disclose their customers’ personal data unless a court has granted permission.

There must be something in Montana’s clean, libertarian air these days, because the governor is expected to sign another pro-privacy bill soon. That bill bolsters the state’s existing consumer data privacy act, the Montana Consumer Data Privacy Act (MTCDPA), in several ways:

  • Consumers must have obvious, straightforward ways to choose whether their personal data is sold or used for targeted advertising.

  • The number of organizations subject to the MTCDPA is greatly increased.

  • The state’s Attorney General can now respond quickly to privacy act violators, with no more 60-day waiting period.

  • The new law makes transparency more transparent. For example, privacy notices must be clearly hyperlinked from websites or within apps.

We hear Montana is beautiful this time of year. If you go, take a moment to appreciate that your data is safer there than anywhere else in the country. Let’s hope that what happened in Montana last week will inspire federal lawmakers to follow suit and pass the Fourth Amendment Is Not for Sale Act.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

Ipsos Poll: Americans Agree that Government Collects Too Much of Our Data

5/15/2025

 
Privacy is a big theme around here (it’s even in our name), so we are alert to reputable polling on the subject. The latest such poll from Ipsos confirms that privacy is a concern for Americans of all political stripes.
 
Among the survey’s key findings? When shown the statement, “The government collects too much data about me,” majorities in every group agreed:
 
  • Republicans: 61%
  • Democrats: 70%
  • Independents: 68%
 
That question covers collecting our data. When Ipsos asked if it is acceptable for government agencies to share that data with private companies, landslide majorities declared it was not okay:
 
  • Republicans: 85%
  • Democrats: 89%
  • Independents: 94%
 
For good measure, the pollsters flipped that question and asked whether it is acceptable for private companies to share our data with government agencies. Ipsos found nearly identical levels of scorn. This is not a theoretical concern – a dozen agencies, including the IRS, the FBI, the Department of Homeland Security, and the Pentagon, routinely purchase and access Americans’ personal data from third-party data brokers. Americans are coming to appreciate this, which is why a YouGov poll showed that four out of five Americans favor a warrant requirement before federal agencies can inspect our private information.
                                                                                            
But the real gem of the Ipsos survey is found in a follow-up question. Having established that Americans care about their data, Ipsos asked respondents to name the types of data they consider most personal. Out of 12 possible categories, the top four of greatest concern were:
 
  • Financial: 60%
  • Health: 37%
  • Credit card usage: 32%
  • Biometric identifiers: 32%
 
If there’s a surprise to be found anywhere in the results, it may be in this crosstab nugget: Americans ages 18-34 ranked biometrics and location-based data even higher than health information. It seems the digital generation is keenly aware of how they’re being tracked by their faces, irises, and fingerprints. It may be the case that older Americans tend to think of data and privacy in terms of records, whereas the youngest know it’s about much more than that.
 
The Ipsos findings are probably no surprise to the private companies and government agencies that covet our personal data. But we hope Members of Congress will pay attention to these results and respond to the passionate concerns of their constituents.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

Watching the Watchers: Why Rep. Luna Wants to Repeal the Patriot Act

5/13/2025

 
U.S. Congresswoman-elect Anna Paulina Luna speaking with attendees at the 2022 AmericaFest at the Phoenix Convention Center in Phoenix, Arizona. Photo credit: Greg Skidmore
Rep. Anna Paulina Luna (R-FL) recently introduced the American Privacy Restoration Act, which would fully repeal the USA Patriot Act, the surveillance law hurriedly passed in 2001 shortly after the 9/11 attacks. Rep. Luna declared:
 
“For over two decades, rogue actors within our U.S. intelligence agencies have used the Patriot Act to create the most sophisticated, unaccountable surveillance apparatus in the Western world. My legislation will strip the deep state of these tools and protect every American’s Fourth Amendment right against unreasonable searches and seizures. It’s past time to rein in our intelligence agencies and restore the right to privacy. Anyone trying to convince you otherwise is using ‘security’ as an excuse to erode your freedom.”
 
What is so wrong about the Patriot Act? Judge Andrew Napolitano spells it out in a recent piece in The Washington Times. Judge Napolitano writes:
 
“Among the lesser-known holes in the Constitution cut by the Patriot Act in 2001 was the destruction of the ‘wall’ between federal law enforcement and federal spies. The wall was erected in the Foreign Intelligence Surveillance Act of 1978, which statutorily limited all federal domestic spying to that which the Foreign Intelligence Surveillance Court authorized.

“The wall was intended to prevent law enforcement from accessing and using data gathered by America’s domestic spying agencies …
 
“In the last year of the Biden Administration, the FBI admitted that during the first Trump Administration, it intentionally used the CIA and the National Security Agency to spy on Americans about whom the FBI was interested but as to whom it had neither probable cause of crime nor even articulable suspicion of criminal behavior …”
 
Even if Rep. Luna’s bill to repeal the Patriot Act does not pass, reform is still possible. Judge Napolitano writes:
 
“With a phone call, President Trump, who was personally victimized by this domestic spying 10 years ago, can stop all domestic spying without search warrants. He can re-erect the wall between spying and law enforcement.”

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

Meta’s AI Chatbot a New Step Toward a Surveillance Society

5/13/2025

 
We’re not surprised – and we are sure you are not either – to learn that new tech rollouts from Meta and other Big Tech companies voraciously consume our personal data. This is especially true with new services that rely on artificial intelligence. Unlike traditional software programs, AI requires data – lots and lots of our personal data – to continuously learn and improve.
 
If the use of your data bothers you – and it should – then it’s time to wise up and opt out to the extent possible. Of course, opting out is becoming increasingly difficult to do now that Meta has launched its own AI chatbot to accompany its third-generation smart glasses. Based on reporting from Gizmodo and the Washington Post, here’s what we know so far:

  • Users no longer have the ability to keep voice recordings from being stored on Meta’s servers, where they “may be used to improve AI.”
  • If you don’t want something stored and used by Meta, you have to manually delete it.
  • Undeleted recordings are kept by Meta for one year before expiring.
  • The smart glasses camera is always on unless you manually disable the “Hey Meta” feature.
  • If you manage to save photos and videos captured by your smart glasses only to your phone’s camera roll, those won’t be uploaded and used for training.
  • By default, Meta’s AI app remembers and stores everything you say in a “Memory” file, so that it can learn more about you (and feed the AI algorithms). Theoretically, the file can be located and deleted. No wonder Meta’s AI Terms of Service says, “Do not share information that you don’t want the AIs to use and retain such as account identifiers, passwords, financial information, or other sensitive information.”
  • Bonus tip: by using Meta’s products, you’ve already implicitly agreed not to upload the image of anyone you know to be an Illinois or Texas resident (unless you’re legally authorized to do so).

None of the tech giants is guiltless when it comes to data privacy, but Meta is increasingly the pioneer of privacy compromise. Culture and technology writer John Mac Ghlionn is concerned that Zuckerberg’s new products and policies presage a world of automatic and thoroughgoing surveillance, where we will be constantly spied on by being surrounded by people wearing VR glasses with cameras.
 
Mac Ghlionn writes:
“These glasses are not just watching the world. They are interpreting, filtering and rewriting it with the full force of Meta’s algorithms behind the lens. And if you think you’re safe just because you’re not wearing a pair, think again, because the people who wear them will inevitably point them in your direction.
“You will be captured, analyzed and logged, whether you like it or not.”
 
But in the end, unlike illicit government surveillance, most commercial sector incursions on our personal privacy are voluntary by nature. VR glasses have the potential to upend that equation.
 
Online, we can still to some degree reduce our privacy exposure in what we agree to, even if it means parsing those long, hard to understand Terms of Service. It is still your choice what to click on. So, as the Grail Knight told Indiana Jones in The Last Crusade, “Choose wisely.”
 
You should also learn to recognize Meta’s Ray-Bans and their spy eyes.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

It’s Time to Enforce the TikTok Ban

5/12/2025

 
Ireland’s Data Protection Commission, acting in its official capacity as an EU privacy guardian, recently fined TikTok $600 million (€530 million) for breaching its data privacy rules. This punishment was meted out after the conclusion of a four-year investigation, so it’s a decision that was not made lightly.
 
None of this surprises us. We have previously reported on the surveillance issues related to TikTok as well as other Chinese-owned concerns. It’s naïve to think that any software of Chinese provenance isn’t being used as a data collection scheme, and equally naïve to believe that said data isn’t being shared with the Chinese government.
 
A year ago, Congress passed a law mandating that ByteDance, the Chinese parent of TikTok, divest its ownership else be banned in the United States. ByteDance could be rich beyond all the dreams of avarice if it chose to sell. That it hasn’t done so simply reinforces everyone’s suspicions that the service’s real owner is primarily interested in something other than profits.
 
The bill that President Biden signed had passed the House 360-58 and the Senate 79-18. TikTok sued but the Supreme Court upheld the law in a unanimous ruling in January. It’s an astonishingly bipartisan issue in a deeply divided time. Yet in a mystifying turn of events, the current administration has twice extended the original divestment deadline (now set for June 19). “Perhaps I shouldn’t say this,” President Trump told NBC’s Kristen Welker, “but I have a little warm spot in my heart for TikTok.” Quite the switch for someone who rightly attempted to ban the service during his first term.
 
After the latest show of bad faith by TikTok revealed by Irish regulators, President Trump should now enforce the divestiture – after all, it is a law, not a suggestion – and protect our citizens. It is the president’s constitutional duty to carry out the laws the American people pass through the voice of their representatives. A show of seriousness about enforcing this law would probably allow TikTok to survive in some form. Moreover, it would protect tens of millions of Americans from Chinese government surveillance.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

Gov. Youngkin Adds Guardrails to Roadside Cameras

5/10/2025

 
Torn between the wishes of pro-surveillance law enforcement on one side and Fourth Amendment privacy defenders on the other, Virginia Gov. Glenn Youngkin (R) finally leaned toward the latter. Last week he signed legislation regulating and curbing the expansion of one of his state’s fastest growing niche industries – automated license plate readers.
 
This technology doesn’t just scan license plate numbers. It captures vehicle make, type, and color as well as features like stickers, bike racks, even noticeable dents. It can be used to track where we go and who we meet with, potentially compromising privacy, as well as our associational rights in politics and religion.
 
While not as robust a piece of legislation as might have been possible, it’s more than a step in the right direction. Here’s what the law does:

  • 671 cameras are enough: That’s how many Flock Safety brand cameras there are in 16 Virginia jurisdictions, with 172 in Norfolk alone. Critics of a watered-down version of the bill feared it would have allowed the addition of thousands more cameras. In 2026, Virginia’s General Assembly will have to debate the expansion issue all over again. But for now, the expansion has been stopped.

  • No more sharing: Without a warrant, subpoena, or court order, Flock Safety data gathered in Virginia now has to stay in Virginia. It can no longer be easily shared with out-of-state agencies.

  • Going public: Police agencies are now mandated to compile and publicly report key details on how the camera images and associated data are used. Not that we expect to see a brutally honest category called “Civil Rights Violations,” but remember that any public report is automatically subject to auditing (official or otherwise), so the watchdog value of such requirements is powerful.

  • Purge faster: Currently, surveillance data is stored for 30 days before being deleted (unless it’s being used as part of an active investigation). The new law slashes that by 30 percent, down to 21 days.

  • No off-label use: Collected data is now limited to specific criminal investigations as well as human trafficking, stolen vehicles, and missing persons cases. No more dragnet searches through citizens’ data without cause. This provision is a major step forward.

It should be noted that Gov. Youngkin tried to strike a compromise between the opposing camps, but none emerged. That’s a good thing for privacy rights. Even so, the law still has weaknesses. Chief among them is that the locations of Flock Safety cameras still do not have to be disclosed.  (Fortunately, social and traditional media help in that regard). And while 21 days of storage is certainly better than the original 30, we’d like to see that number come down to seven or less.
 
As for next year’s rematch of the “expand or not” battle, 2025 is the third year in a row that the Virginia Assembly has stymied, at least somewhat, Flock Safety’s and law enforcement’s desire to pursue mass surveillance unchecked.
 
Here’s hoping for four years of pushback in a row. We even have a slogan: “Four for the Fourth.” Okay, we don’t love it either. Feel free to send us your suggestions. Better yet, if you’re a Virginia resident, send it to your state delegate and senator.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

PPSA Brief to SCOTUS: Clarify What Third-Party Disclosure Means in the Modern Era

5/8/2025

 
In 2024, the U.S. Court of Appeals for the First Circuit held that the IRS did not violate the Fourth Amendment when it scooped up the financial records of one James Harper through a broad dragnet of the Coinbase cryptocurrency exchange. The court based this finding on a sweeping interpretation of the “third-party doctrine,” which “stems from the notion that an individual has a reduced expectation of privacy in information knowingly shared with another.”
 
Given the terabytes of personal data that technology forces us to hand over to third-party companies, including our most intimate data – personal communications, online searches, health issues, and yes, financial holdings – does this mean that, as the First Circuit and other lower courts have ruled, there is essentially “no legitimate expectation of privacy” in that data?
 
Consider that the U.S. Supreme Court has repeatedly held that the Fourth Amendment protects “that degree of privacy against government that existed when [it] was adopted.” Times change and technology evolves. Any inquiry into reasonableness should require a periodic recontextualizing of what the Founders intended. That’s not anti-originalism; it’s just a common-sense application of original intent with new technology and capabilities.
 
The Supreme Court did just that in Carpenter v. United States, holding that the warrantless seizure of cell phone records constitutes a Fourth Amendment violation. In this case, at least, the high Court held that a reasonable expectation of privacy exists even when information is held by a third party.
 
As the Court wrote, “when an individual ‘seeks to preserve something as private,’ and his expectation of privacy is ‘one that society is prepared to recognize as reasonable,’ official intrusion into that sphere generally qualifies as a search and requires a warrant supported by probable cause.”
 
That goes not only for cell phone records but for any data that is supposed to be private.
 
In the brief PPSA filed with the Court, we explain that:
 
“Despite Carpenter’s clear warning against allowing the third-party doctrine to degrade privacy via a ‘mechanical interpretation of the Fourth Amendment’ … lower courts have generally failed to heed that warning. Rather, they mechanically first ask if the information was disclosed to a third party and then treat this disclosure as a complete carveout from Fourth Amendment protections unless the circumstances closely or identically match Carpenter’s narrow facts.”
 
In this era of breakneck technological change and cloud computing, much of our personal information is disclosed to third parties – even information of the most sensitive kind. An interpretation that third-party disclosure automatically nullifies your right to privacy is a flawed approach in the 21st century. 
 
As we demonstrated in our brief, the Supreme Court must act to “prevent a contrary understanding of Carpenter from continuing to erode Americans’ privacy as third-party storage becomes ubiquitous and artificial intelligence becomes powerful enough to piece together intimate information from seemingly innocuous details about a target’s life.”  
 
Technology is evolving too robustly and too rapidly for the third-party doctrine to remain stuck in the era of paper bills. The First Circuit’s extreme interpretation of the third-party doctrine is a quaint vestige of a prior age, no longer equal to technologies that the Supreme Court ruled contain all “the privacies of life,” and it would make the Fourth Amendment a mere piece of ink on parchment rather than a true safeguard of Founding-era levels of privacy.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

How Police Can Use Your Car to Spy on You

5/5/2025

 
We reported in February that Texas Attorney General Ken Paxton is suing General Motors over a long-running, for-profit consumer data-collection scheme it hatched with insurance companies. Now Wired’s Dell Cameron reveals that automakers may be doing even more with your data, sharing it with law enforcement – often without a proper warrant.
 
So you may be getting way more than you bargained for when you subscribe to your new vehicle’s optional services. In effect, your vehicle is spying on you by reporting your location to cell towers. The more subscription services you sign up for, the more data they collect. And in some cases, reports Wired, cars are still connecting with cell towers even after buyers decline subscriptions.
 
All of that data can easily be passed to law enforcement. There are no set standards as to who gives what to whom and when. When authorities ask companies to share pinged driver data, the answers range from “Sure! Would you like fries with that?” to “Come back with a subpoena,” to “Get a warrant.” For its part, GM now requires a court order before police can access customers’ location data. But the buck can also be passed to the cell service providers, where the protocols are equally opaque. When Wired’s Cameron asked the various parties involved what their policies were, he was frequently met with the sound of crickets.
 
Author John Mac Ghlionn sums up the state of automotive privacy: “Your car, once a symbol of independence, could soon be ratting you out to the authorities and even your insurance company.”
 
It’s probably time to update “could soon be” to “is.”
 
This technology gives police the ability to cast a wide dragnet to scoop up massive amounts of personal data, with little interference from pesky constitutional checks like the Fourth Amendment. Law enforcement agencies of all stripes claim their own compelling rights to collect and search through such data dumps to find the one or two criminals they’re looking for, needle-like, in that haystack of innocent people’s information. Since your driving data can be sold to data brokers, it is also likely being purchased by the FBI, IRS, and a host of other federal agencies that buy and warrantlessly inspect consumer data.
 
Just over a year ago, Sens. Ed Markey (D-MA) and Ron Wyden (D-OR) fired off a letter to the chair of the FTC to demand more clarity about this dragnet approach. Caught with their hand in the cookie jar thanks to the resulting inquiry, GM agreed to a five-year hiatus on selling driver data to consumer reporting agencies. Where that leaves us with the police, as the Wired article reports, often remains an open question.
 
In the meantime, consider adjusting your car’s privacy settings and opt-outs. The more drivers who take these actions, the more clearly automakers, service providers, and law enforcement agencies will get the message.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

A New Concern: Privately Funded License-Plate Readers in LA

5/4/2025

 
We’ve covered automated license plate reader (ALPR) software nearly 20 times in the last few years. That we are doing so again is a reminder that this invasive technology continues to proliferate.
 
In the latest twist, an affluent LA community bought its own license-plate readers, gifted them to the Police Foundation, and, with approval from the City Council and the Police Commission, handed them to the LAPD. There was a proviso: the cameras were to be used only in said well-off LA community.
 
Turns out the LAPD didn’t appreciate being told where to use ALPR tech and which brand to use. The head of the department’s Information Technology Bureau told the media that law enforcement agencies should be able to use plate reader technology as they see fit and should own and control the data collected. This seems more about turf than principle, given that the LAPD already has thousands of plate-reading cameras in use.
 
This case brings a new question to an already intense debate. Should the well-connected be able to contract with local police to indiscriminately spy on masses of drivers, looking for those “who aren’t from around here”?
 
It is concerning enough that the LAPD has already built up one of the nation’s largest ALPR networks. The episode is also an example of how for-profit startups like Flock Safety are trying to corner the market for this technology nationwide, doing so through opaque agreements with law enforcement agencies that are impervious to public scrutiny and oversight.
 
As with most surveillance tech, there are cases that justify its use. But these legitimate instances tend to be relatively few in number and should be carried out with transparency in mind and oversight engaged. That’s a far cry from the “dragnet surveillance” approach currently in place, in which the movements of millions of citizens who have done nothing wrong are tracked and stored in public and private databases for years at a time, all without a warrant or individual consent.

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

Biden Administration Kept “Disinformation” Dossiers on Americans

5/3/2025

 
The Biden administration’s State Department kept dossiers on Americans accused of acting as “vectors of disinformation.”

This was a side activity of the now-defunct State Department Global Engagement Center (GEC). It secretly funded a London-based NGO that pressured advertisers to adhere to a blacklist of conservative publications, including The American Spectator, Newsmax, the Federalist, the American Conservative, One America News, the Blaze, Daily Wire, RealClearPolitics, Reason, and The New York Post.

Now we know that the blacklisting went beyond publications to include prominent individuals. At least one of them, Secretary Rubio said, was a Trump official in the Cabinet room when the secretary made this announcement.
“The Department of State of the United States had set up an office to monitor the social media posts and commentary of American citizens, to identify them as ‘vectors of disinformation,’” Rubio said on Wednesday. “When we know that the best way to combat disinformation is freedom of speech and transparency.”

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

Jordan and Biggs Are Right – Protect Americans’ Privacy by Terminating the US-UK CLOUD Act Agreement

5/2/2025

 
Rep. Jim Jordan (R-Ohio) and Rep. Andy Biggs (R-Arizona)
It looks like the CLOUD Act might soon evaporate.

A bilateral agreement under the Clarifying Lawful Overseas Use of Data Act went into effect in 2022 to facilitate the sharing of data for law enforcement purposes. In February, the news leaked that the UK’s Home Office had secretly ordered Apple to provide a backdoor to the content of all of its users, Americans included. The order would effectively break the Apple iPhone’s Advanced Data Protection service, which uses end-to-end encryption to ensure that only the account user can access stored data.

In response, Rep. Jim Jordan, Chairman of the House Judiciary Committee, and Rep. Andy Biggs, Chairman of the Subcommittee on Crime and Federal Government Surveillance, have fired off a letter to Attorney General Pam Bondi asking her to terminate the agreement with the UK under the CLOUD Act.

They understand the UK order would be a privacy catastrophe for Apple users around the world. Encryption protects dissidents, women and children hiding from abusive relationships, not to mention the proprietary secrets of innumerable businesses and people who simply value their privacy.

Under the terms of the agreement, the two parties can renew it every five years. Just after the 2024 election, however, then-Attorney General Merrick Garland preemptively renewed the agreement to discourage the incoming Trump Administration from canceling or changing it.

These two leading House Republicans told Bondi that the UK order “exposes all Apple users, including American citizens, to unnecessary surveillance and could enable foreign adversaries and nefarious actors to infiltrate such a backdoor.”

Or, as Jordan and Biggs noted, President Trump told UK Prime Minister Keir Starmer that the order was like “something that you hear about with China.”

Perhaps fearing a consumer backlash in the United Kingdom, the British government made a bid to keep Apple’s appeal of the order in a secret court session, claiming that even discussing the “bare bones” of the case would harm national security. The Investigatory Powers Tribunal rejected the government’s stance, guaranteeing at least some openness in the court’s deliberations.

But we cannot count on the British government to get it right for Americans. For that reason, Chairmen Jordan and Biggs began heaving rhetorical chests of tea into the harbor. They wrote:

“Accordingly, because the UK’s order could expose U.S. citizens to surveillance and enable foreign adversaries and nefarious actors to gain access to encrypted data, we respectfully urge you to terminate the Agreement and renegotiate it to adequately protect American citizens from foreign government surveillance.”

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

That’s No Hydrangea, It’s a Camera!

5/1/2025

 

Time to Wise Up to High Tech Burglary​

​It might be time – and we can’t believe we’re typing this – to check your potted plants and hedges. If you don’t recognize that oddly shaped topiary in between the rhododendron and the geranium, it could be, well, a plant (as in a device placed there to spy on you).
 
As we reported before, a new trend is blooming in larceny: burglars hiding cameras on properties in order to learn the habits of residents. Take a look at this recent report from KABC in Los Angeles.
 
Similar instances have been linked to visitors from South America and hence are referred to as “burglary tourism.” But in reality, it’s just as much a home-grown problem. (No more gardening puns, we promise.) In the end, the source of the violation is irrelevant. What matters is that we’re dealing with some relatively sophisticated criminals.
 
And what matters more is how to protect yourself. Here’s some advice:
 
  1. A rose by any other name: Leaves, grass, rocks, flowers – all of these have been used as disguises for hidden spy cams. Fortunately, on close inspection they generally reveal themselves as the clumsy fakes they are. The intent behind them is to blend into your peripheral vision, not to fool a botanist. So, plan a morning coffee date this weekend. Just you and the shrubs and a little fresh air.
 
  2. A little night music: Switch to decaf (or pour your nightcap into a shatterproof tumbler), turn out the lights, and have a look around your property. In the KABC segment above, intended victim George Nguyen noticed a flashing light while watering in the evening. We told you these aren’t necessarily sophisticated schemes. While not all hidden cameras come with victim-friendly giveaways like lights, a significant number do. By the way, if you do a night-time walk-around, please notify your neighbors ahead of time so they don’t call the cops. Bonus tips: Avoid dark clothing and ski masks.
 
  3. Like a good neighbor: Speaking of neighbors, don’t assume the camera is on your property. The best views of your place may be across the street or courtyard. Of course, be sure to explain yourself before looking in the hedges next door, especially if you live in Texas. The more you and your neighbors can work together on this, the better.
 
  4. Signed, sealed, and re-routed: If you’re out of town a lot, and your dropped-off packages linger, re-think your delivery strategy. Why make it obvious that no one’s home? Re-route deliveries to locker/pickup locations or to a trusted neighbor who’s always home.
 
CNET offers some additional guidance on how to thwart this high-tech thievery, including installing a video doorbell or a camera with audio that lets you see (and ask annoying questions) in real time.
 
Finally, if you do discover a hidden camera spying on you or your neighbors, call the police.


How Facial Recognition Technology Criminalizes Peaceful Protest

4/29/2025

 
​Today, Hungary is ostensibly free, a democratic state in a union of democratic states. But something is rotten in Budapest. Prime Minister Viktor Orbán has been steadily fashioning a monoculture since his return to power 15 years ago, running afoul of European Union policies and democratic norms along the way. The most recent infraction is multifaceted, and it involves the use of facial recognition to target peaceful protesters for criminal prosecution.
 
In March, Orbán’s subservient parliament railroaded the opposition and banned public gatherings of gay rights activists. With the stroke of a pen, Pride gatherings and related pro-gay rights protests were suddenly illegal. A month later, these crackdowns were enshrined in the country’s constitution (showing why America’s founders were wise to foresee the necessity of making the U.S. Constitution so notoriously difficult to amend).
 
As in Putin’s Russia, the justification for this crackdown is that it’s necessary to protect children from “sexual propaganda” – even though we are talking about peaceful protests conducted by adults in city centers. However you feel about Pride parades, most Hungary watchers believe the prime minister needs to whip up a cultural scapegoat to rally his base in advance of next year’s elections.
 
Hungary represents a turning point in the rise of the modern surveillance state in a developed country. Beyond the infringement of basic rights, it includes a chilling new embrace of facial recognition technology – specifically, to identify Pride participants (now officially designated as criminals) or likewise pick out faces from among the tens of thousands who are sure to illegally protest these new measures. At the moment, the punishment for such unconstitutional behavior is a fine of up to €500. Organizers, however, can be imprisoned for up to a year. But can even more draconian punishments be far behind?
 
If you’re wondering how Hungary’s democratic partners in the European Union are reacting to all of this, the answer is not well. And it’s also raising important questions about the efficacy of the EU’s AI regulations in general (a debate about loopholes and guardrails that merits a separate discussion).
 
For now, though, Americans should heed the cautionary warning of Hungary’s use of facial recognition software. Future uses of the technology here could target leaders of a MAGA or a Black Lives Matter protest. Facial recognition scans can pinpoint individuals, spotting a face in a crowd. They give regimes the ability to come back later to arrest and persecute on a scale only Orwell could have conceived. All of this is enhanced by the unholy combination of data analytics, advanced algorithms, unprecedented computing power, and now generative AI.
 
The uncomfortable truth of the modern era is inescapable: The development and deployment of modern surveillance has gone hand in hand with modern authoritarianism, from Russia to China and Iran. Just imagine what might have happened if J. Edgar Hoover had access to facial recognition tech and AI. We imagine it would have looked like Orbán’s dystopian democracy.
 
Budapest Pride is not backing down, celebrating its 30th anniversary in a public demonstration in June. The world will be watching to see how this technology is used.


AI and Data Consolidation Is Supercharging Surveillance

4/28/2025

 
​In Star Wars lore, it was the democratic, peace-loving Republic that built the first fleet of Star Destroyers. But the fleet was quickly repurposed for evil after the Republic fell. What was once a defensive force for good became a heavy-handed tool of occupation and terror.
 
In a galaxy closer to home, imagine the development of a fully integrated civilian computer system designed to help a technological democracy of 345 million people operate smoothly. In the early 21st century, successive governments on both the right and left embraced the idea that “data is the new oil” and began the process of digitizing records and computerizing analog processes. Generative artificial intelligence, vast increases in computing power, and the rise of unregulated data brokers made it possible to create a single database, readily available to federal agencies, containing the personal information and history of every citizen.
 
At first, the system worked as advertised and made life easier for everyone – streamlining tax filing, improving public service access, facilitating healthcare management, etc. But sufficient guardrails were never established, allowing the repurposing of the system into a powerful surveillance tool and mechanism of control.
 
This scenario is now on the brink of becoming historical fact rather than cinematic fiction.
 
“Data collected under the banner of care could be mined for evidence to justify placing someone under surveillance,” warns Indiana University’s Nicole Bennett in a recent editorial for The Conversation. And if you like your social critiques with a side of irony, the Justice Department agreed with her in its December 2024 Artificial Intelligence and Criminal Justice report. It concluded that the AI revolution represents a two-edged sword. While potentially a driver of valuable new tools, its use must be carefully governed.
 
The Justice Department said that AI data management must be “grounded in enduring values. Indeed, AI governance in this space must account for civil rights and civil liberties just as much as technical considerations such as data quality and data security.”
 
Yet the government is proceeding at breakneck speed to consolidate disparate databases and supercharge federal agencies with new and largely opaque AI tools, often acquired through proprietary corporate partnerships that currently operate outside the bounds of public scrutiny.
 
Anthony Kimery of Biometric Update has described the shift as a new “arms race” and fears that it augurs “more than a technological transformation. It is a structural reconfiguration of power, where surveillance becomes ambient, discretion becomes algorithmic, and accountability becomes elusive.”
 
The Galactic Republic had the Force to help it eventually set things right. We have the Fourth – the Fourth Amendment, that is – and the rest of the Bill of Rights. But whether these analog bulwarks will hold in the digital age remains to be seen. To quote Kimery again, we are “a society on the brink of digital authoritarianism,” where “democratic values risk being redefined by the logic of surveillance.”


What the Leaking of 21 Million Employee Screenshots Tells Us About the Threat of Worker Surveillance Apps

4/28/2025

 
​In the late 19th century, American business embraced the management philosophy of Frederick Winslow Taylor, author of The Principles of Scientific Management. He wrote: “In the past the man has been first; in the future the system must be first.”

So managers put their factory systems first by standardizing processes and performing time-and-motion studies with a stopwatch to measure the efficiency of workers’ every action. Nineteenth-century workers, who were never first, became last.

Now intrusive surveillance technology is bringing this management philosophy to the knowledge economy. This entails not just the application of reductionism to information work, but the gross violation of employee privacy.

This was brought home when Paulina Okunyté of Cybernews reported on Thursday that WorkComposer, an employee surveillance app that measures productivity by logging activity and taking regular screenshots of employees’ screens, left over 21 million images exposed in an unsecured bucket in Amazon’s cloud service.

WorkComposer also logs keystrokes and how much time an employee spends on an app. As a result, usernames and passwords that are visible in screenshots might enable the hijacking of accounts and breaches of businesses around the world.

“Emails, documents, and projects meant for internal eyes only are now fair game for anyone with an internet connection,” Okunyté writes.

With 21 million images to work with, there is plenty of material for cyberthieves and phishing scammers to victimize the people who work for companies that use WorkComposer software.

This incident exposes the blinkered philosophy behind employee surveillance. As we have reported, there are measurable psychological costs – and likely productivity costs – when people know that they are being constantly watched. Vanessa Taylor of Gizmodo reports that according to a 2023 study by the American Psychological Association, 56 percent of digitally surveilled workers feel tense or stressed at work compared to 40 percent of those who are not.

We also question the usefulness of such pervasive tracking and surveillance. Efficiency is a commendable goal. Surely there are broader and less intrusive ways to measure employee productivity. Such close monitoring runs the risk of focusing workers on meeting the metrics instead of bringing creativity or bursts of productivity to their jobs. Allowing people to take a break every hour to listen to a song on earbuds might, in the long run, make for better results and greater efficiency. 
​
Just don’t make a funny face or sing along, the whole world might see you.


Department of Justice Criminal Division Claims It Has No Policies Governing the Unmasking of Members of Congress

4/27/2025

 
​Batman isn’t the only one who needs to worry about “unmasking.” This is the term of art for when federal officials ask that an American’s personal, international communications be deanonymized. “Upstreaming” is the National Security Agency practice of working with companies like Verizon or AT&T to create backdoors into the internet backbone to use targeted keywords to collect the content of Americans’ communications.
 
The practice of unmasking rose from 198 instances in 2013 to 5,000 in 2020. As this increase occurred, the intent of these searches began to look more and more political.
 
In 2017, National Security Advisor Susan Rice issued unmasking orders for identities of transition team members for Donald Trump’s first administration. More troubling, U.N. Ambassador Samantha Power or someone in her office made hundreds of unmasking requests. Nearly 270 of these requests came days or even hours before Power’s service in government ended.
 
Some of these unmasking orders were not supported by any legitimate national security justification under Section 702. Many were not subjected to “minimization” procedures to ensure that the collection of private information was performed in as limited a way as possible.
 
PPSA has long sought to learn how unmasking and upstreaming might be used against Members of Congress with oversight responsibility over the intelligence community. So we filed FOIA requests with DOJ to seek answers, including records on potential spying on 48 current and recent Members of Congress, ranging from former Vice President Kamala Harris to now-Secretary of State Marco Rubio, Rep. Jim Jordan to Sen. Ron Wyden.
 
We’ve yet to receive a full answer to our Freedom of Information Act (FOIA) requests seeking records reflecting policies governing the unmasking of Members of Congress. But the Criminal Division of the Department of Justice has now informed us in writing that “no responsive records subject to the FOIA were located.”
 
“In other words, the Criminal Division claims to have no policies on how it might warrantlessly tap into the identity of Members of Congress in international communications, and potentially the content of their communications,” said Gene Schaerr, general counsel of PPSA. “When agencies spy on the very people who are charged with their oversight, you might think that at least some policies would be in place.
 
“And you might also think – given that spying on Congress necessarily involves the civil rights of us all – that there would be some internal guardrails or training material,” Schaerr said. “But you would be wrong.”
 
PPSA will report any further revelations in our ongoing efforts to dig out more information on how the intelligence community might be spying on Congress.


Watching the Watchers: Pope Francis Warned Us of the “Rise of a Surveillance Society”

4/22/2025

 
​With the passing of Pope Francis, it seems appropriate to reflect on his statements regarding surveillance, privacy, and human rights. In his 2024 World Day of Peace message, the pontiff declared:

  • “Artificial intelligence, then, ought to be understood as a galaxy of different realities. We cannot presume a priori that its development will make a beneficial contribution to the future of humanity and to peace among peoples.”
 
  • “Privacy, data ownership and intellectual property are other areas where these technologies engender grave risks. To which we can add other negative consequences of the misuse of these technologies, such as discrimination, interference in elections, the rise of a surveillance society, digital exclusion and the exacerbation of an individualism increasingly disconnected from society. All these factors risk fueling conflicts and hindering peace.”
 
  • “Reliance on automatic processes that categorize individuals, for instance, by the pervasive use of surveillance or the adoption of social credit systems, could likewise have profound repercussions on the social fabric by establishing a ranking among citizens.”
 
The whole essay is worthy of our attention. It contains frank criticisms of the breakneck development of AI, as well as an important acknowledgement of China’s insidious “social credit” system, whereby its citizens are monitored and their behaviors graded.
 
Pope Francis himself had sufficient reason to be wary of surveillance states. Just a few weeks ago, the Vatican revealed that several of the pontiff’s senior aides discovered that foreign spy agencies had infected their smartphones with Pegasus spyware.


Warrants and the “Wild West” of Digital Surveillance

4/21/2025

 

Rep. Knott: “It’s Amazing to Me That There’s So Much Resistance to the Warrant Requirement”

​Perhaps you had other things to do during last week’s House Judiciary hearing, “A Continued Pattern of Government Surveillance of U.S. Citizens.” So here’s a summary: The Judiciary’s Subcommittee on Crime and Federal Government Surveillance brought together witnesses from across the political spectrum (including PPSA’s own Gene Schaerr) to identify potential solutions to the ongoing (and growing) problem of Fourth Amendment abuse by government entities.
 
At the heart of the discussion was the need to import probable cause warrants – the key requirement of the Constitution’s Fourth Amendment – to the practice of federal agencies freely accessing our international communications, as well as our personal, digital data.
 
Witnesses effectively rebutted the fearmongering campaign by the intelligence community to convince us that a warrant requirement for federal surveillance of American citizens is too onerous, and too dangerous to entertain. But the most effective remarks came from a Member of the committee.
 
Rep. Brad Knott (R-NC), a former U.S. Attorney for the Eastern District of North Carolina, addressed the issue of warrant requirements with the assurance of a former federal prosecutor. He spoke of what it took for him to get permission to “flip the switch” on some of the most “intrusive” forms of wiretapping American citizens.
 
“So you have to demonstrate necessity,” Rep. Knott said. “You have to demonstrate why other techniques are futile … the rigor we had to exercise was very important … it kept the internal investigators accountable.”
 
Rep. Knott said the warrant process made sure investigations were “open and honest.” Investigators knew “that their actions were going to be subject to pen and paper. They were going to be subject to judicial review … and opposing counsel.”
 
Given the clarity and accountability added by warrants, Rep. Knott added:
 
“It’s amazing to me that there’s so much resistance to the warrant requirement alone.”
 
Throughout the 90-minute hearing, Members and witnesses stressed one thing:
 
The countdown clock is ticking on what may be our last, best chance at meaningful reform – including the adoption of a warrant requirement for U.S. citizens when Section 702 of the Foreign Intelligence Surveillance Act (FISA) comes up for renewal next year (it’s due to sunset in April 2026).
 
Section 702 is the legal authority that allows federal intelligence agencies to spy on foreign targets on foreign soil. But it also “incidentally” picks up the international communications of Americans, which can then be warrantlessly inspected by the FBI and other agencies.
 
Section 702 got a lot of airtime at the hearing and was frequently linked with the words “loophole” and “backdoor.” The Reforming Intelligence and Securing America Act (RISAA) of 2024 attempted to fix Section 702 – and did add some useful reforms – but it also left a loophole through which the FBI and others attempt to justify warrantless backdoor searches of Americans’ private communications.
 
For the FBI in particular, this has become the go-to means to warrantlessly develop domestic leads.
 
“Three million times they did [backdoor searches] in 2021,” lamented Judiciary Chairman Jim Jordan (R-OH). Or, as James Czerniawski of Americans for Prosperity put it: “Time and time again we have caught the intelligence community with their hand in the constitutional cookie jar.”
 
Members and witnesses alike also addressed a privacy crisis even greater than Section 702 – the routine purchases made by federal agencies of Americans’ private digital information from data brokers.
 
ACLU’s Kia Hamadanchy reminded the subcommittee that the kind of data that can be bought and sold would be, in the words of a former CIA deputy director, “top secret” sensitive if gathered by traditional intelligence means. It would have to be kept “in a safe,” not in a database.
 
The hearing also got at what many consider the underlying issue driving the new era of surveillance. Namely, the acknowledgment that we increasingly live not in one world, but two – our physical reality and its digital twin. But unlike in the physical world, the laws governing how the Fourth Amendment applies in the digital context are largely unwritten. In other words, said Rep. Andy Biggs (R-AZ), it’s the “Wild West.”
 
And Ranking Member Rep. Jamie Raskin (D-MD) added, “New technologies make it a lot harder to rein in government intrusion in the lives of the people.” The unwitting result? “We live in a modern, albeit consensual, surveillance state,” declared Phil Kiko, principal at Williams & Jensen and former Judiciary counsel.
 
With any luck, things might be different a year from now when FISA is up for renewal, thanks to a U.S. District Court ruling in January.
 
“To countenance this practice” of warrantless surveillance, wrote the court, “would convert Section 702 into … a tool for law enforcement to run ‘backdoor searches’ that circumvent the Fourth Amendment.”
 
That legal precedent didn’t exist when the last Congress debated FISA reforms. Emboldened by this landmark decision, Reps. Jordan and Raskin are pledging to once again work together in a bipartisan spirit to win this fight. Their continuing partnership captures the spirit of the subcommittee’s hearing and should give reformers a renewed sense of hope.


How AI Can Leak Your Data to the World

4/21/2025

 
​As we labor to protect our personal and business information from governments and private actors, it helps to think of our data as running through pipes the way water does. Just like water, data rushes from place to place, but is prone to leak along the way. Now, as the AI revolution churns on, workplaces are getting complete overhauls of their data’s plumbing. Some information leaks are thus almost inevitable. So, just as you would do under a sink with a wrench, you should be careful where you poke around.
 
A major new source of leakage is conversational AI tools, which are built on language in all its forms – words and sentences, but also financial information, transcripts, personal records, documents, reports, memos, manuals, books, articles, you name it. When an organization builds a conversational AI tool, many of these source items are proprietary, confidential, or sensitive in some way. Same with any new information you give the tool or ask it to analyze. It absorbs everything into its big, electronic, language-filled brain. (Technically, these are called “large language models,” or LLMs, but we still prefer “big, electronic, language-filled brains.”)
 
So be careful where you poke around.
 
As Help Net’s Mirko Zorz reminds us, companies should give employees clear guidelines about safely using generative AI tools. Here is our topline advice for using AI at work.

  • If it’s a public tool like ChatGPT, absolutely no confidential personal or business information should be entered that isn’t already publicly available. Just ask Samsung about their misadventure.
 
  • Even internal, private company tools carry risk. Make sure you’re authorized to access the confidential information your system contains. And don’t add any additional sensitive information either (documents, computer code, legal contracts, etc.) unless you’re cleared to do so.
 
  • Like people, LLMs can be “tricked” into disclosing all manner of sensitive information, so don’t give your credentials to anyone who does not have the same authorization as you. Those new employees from Sector 7G? Sure, they seem nice and perfectly harmless, but they could be corporate spies (or, more likely, just untrained). Don’t trust them until they’re vetted.
 
  • Any company that isn’t educating its employees on how to use AI tools acceptably is asking for trouble. If your company isn’t training you or at least providing basic guidelines, demand both. Vigilant employees are the last line of defense in any organization that doesn't bring its “A” game to AI. And “A” really is the operative letter here (we’re not just being cute). Authorization and Authentication are the bywords of any IT organization worth its salt in the AI space.
 
  • Just because an approved software program you’ve been using at work for years has suddenly added an AI feature does NOT mean it’s safe to use. Consult with your IT team before trying the new feature. And until they give you the all-clear, be sure to avoid inputting any sensitive or otherwise restricted information.
​
Finally, leave everything work-related at work (wherever work is). When elsewhere, don’t use your work email to sign into any of the tens of thousands of publicly available AI applications. And never upload or provide any personal or private information that you don’t want absorbed into all those big, electronic, language-filled brains out there.
 
Because leaks are nearly inevitable.
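 
For organizations drafting such guidelines, even a crude pre-filter can reinforce the habit of scrubbing sensitive tokens before text is pasted into a public AI tool. Here is a minimal sketch; the patterns are illustrative only, and a real data-loss-prevention tool covers far more categories:

```python
import re

# Illustrative patterns only -- real DLP tools also catch names,
# account numbers, API keys, and much more.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace obviously sensitive tokens before text leaves your machine."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact jane.doe@example.com, SSN 123-45-6789, about Q3."))
# Contact [EMAIL], SSN [SSN], about Q3.
```

The point of the sketch is the workflow, not the regexes: sensitive material should be stripped or generalized before it is absorbed into anyone’s big, electronic, language-filled brain.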


How to Guard Against Smishing Scams from China

4/21/2025

 
​Like millions of other Americans, we are receiving text messages telling us that someone at a company’s HR department has noticed our very impressive resume and would like to discuss a job offer, call before the job’s filled! – or, we have an unpaid highway toll and must pay quickly to avoid a fine! – or, our package delivery has hit a snafu and we need to deal with it post haste, or it might get lost forever!
 
The FBI advises us to delete such texts and to never – as in NEVER!!! – click through them. Such messages aim to add your money to the hundreds of millions of dollars Americans lose every year to text scams run by sophisticated gangs in China. As Americans grow wary of these smishing scams (a portmanteau of “SMS,” for short message service, and “phishing”), criminals are adapting, often impersonating a credible brand or agency to convince you that you must provide your credentials, account numbers, or Social Security number, or make a payment in order to avoid a severe penalty.
 
And if you do click through, you may also expose your phone to a malware infection that will endanger you long after the text is forgotten.
 
One telltale sign of a smishing scam is that the link points to an unfamiliar foreign domain. Common giveaways are look-alike fragments such as “com-track” and “com-toll,” designed to mimic a real “.com” address. But China’s smishing gangs are getting good at embedding links in actual “.com” addresses for real brands and agencies. So always assume it is a scam.
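 
Since the visible text of a link can lie, one quick defensive habit is to check the hostname a URL actually points to. A minimal sketch in Python (the link and domain below are made-up examples, not real scam addresses):

```python
from urllib.parse import urlparse

def true_hostname(url: str) -> str:
    """Return the hostname a link actually points to, regardless of
    how the visible text of the message dresses it up."""
    return urlparse(url).hostname or ""

# A made-up example in the style of toll-scam links:
link = "https://ezpass.com-toll.top/pay"
host = true_hostname(link)
print(host)                    # ezpass.com-toll.top
print(host.endswith(".com"))   # False -- the ".com" is a decoy mid-domain
```

Note that everything after the last dot is the real top-level domain; a “.com” buried in the middle of the hostname is pure camouflage.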
 
What should you do if you receive such a suspicious text?
 
The FBI advises: “STOP! Take a moment to breathe deeply in and out.”
 
Again, NEVER!!! click any link in the text.
 
Write down the issue on paper and delete the text.
 
And if you still have a tingle of doubt, go online and look up the main website and customer service number of the bank, delivery company, toll authority, or whatever, and ask them.
 
But you do have an impressive resume, by the way. Click here to learn more.


250 Years Later: A History Lesson from the American Revolution

4/19/2025

 

Are We Guarding Their Sacred Trust?

​April 19, 2025, is the 250th anniversary of the start of the American Revolution. It’s a story most Americans know pretty well, but here at PPSA we’d like to highlight one of the more obscure chapters of that history – one of ultimate importance to the all-but-impossible dream the Revolution would eventually make real: the Bill of Rights.
 
The subject of today’s lesson? General warrants. If that doesn’t ring a bell, then be glad, because that means the Bill of Rights largely did its job. General warrants were one of the primary tools of tyranny King George used to oppress, even terrorize, the Colonists. Armed with general warrants, the Crown’s agents could search anywhere they wanted, for anything they wanted, and for any reason — or, even worse, for no reason at all.
 
General search warrants don’t name a specific person or place and don’t state what the authorities are looking for – making it possible to target people without reason or cause, and almost without limits. As you can imagine, such writs were widely abused. To quote the Declaration of Independence, the King “sent hither swarms of Officers to harass our people and eat out their substance.” Barging into homes, destroying property, searching belongings, and seizing whatever they wanted. And not just homes – shops, ships, banks, churches. Americans had had it.
 
And on April 19, 1775, they said enough was enough. And they meant it. Sixteen years and eight months would pass between that day and the day the Fourth Amendment was ratified. The Fourth Amendment exists because it was, and is, the best answer to the outrageous indignity of general warrants. That’s what historians call a “direct line.”
 
It's appropriate on this occasion to also recall a recent historical reminder from Rep. Jamie Raskin, a Democrat who happens to represent a district from Maryland, one of the thirteen original colonies – “The Old Line State” – a moniker earned in blood defending Washington’s army on multiple occasions. 
 
Speaking recently at a House Judiciary Subcommittee hearing on government surveillance, Raskin quoted James Madison: “The essence of government is power; and power, lodged as it must be in human hands, will ever be liable to abuse.” It’s no accident of history that Madison drafted what would become the Fourth Amendment. Two and a half centuries later, patriots of all stripes are called once again to hold the line against a modern, invasive, and warrantless surveillance state.
 
That we are still battling unlawful searches and seizures suggests, in a sense, that some things never change. But it also proves the timeless wisdom of those original ideas – that some things should endure. In the Fourth Amendment and Bill of Rights, the Founders left us a sacred trust. “The right to be let alone,” wrote Justice Louis Brandeis, is “the most comprehensive of rights and the right most valued.”

    STAY UP TO DATE

Subscribe to Newsletter
DONATE & HELP US PROTECT YOUR PRIVACY RIGHTS

More Experts Weigh in on Warrantless Searches at Border Zones

4/15/2025

 

It’s Beyond Ridiculous that We Have to Worry About This

​With the summer travel season imminent, the already hot (and recently explored) topic of warrantless searches at U.S. borders and ports of entry keeps getting hotter by the day. The latest twist comes from ZDNET, where David Berlind asks the age-old question: Biometric vs. Passcode?
 
What, you were expecting “Plastic vs. Paper?”
 
Seriously, it’s come to this: How do American citizens best thwart their own government’s attempts to violate our constitutional rights? Specifically, how do citizens prepare for warrantless searches of their personal devices at border crossings, which Customs and Border Protection agents seem increasingly determined to carry out?
 
The CliffsNotes version of ZDNET’s advice: The spoken word still matters (for now) relative to the Constitution, as in, “No person … shall be compelled in any criminal case to be a witness against himself.” Speech existed when the Constitution was written; biometric tech (fingerprint scanning, facial recognition, etc.) did not.
 
Put another way, being pressured to verbally recite your passcode could be construed as self-incrimination. So it is easier to refuse a request to speak it than to stand still and have your face open your device. But this much is sure: biometrics aren’t spoken, so that line to the Fifth Amendment is dotted at best. The same goes for Miranda. “The right to remain silent” is predicated on you actually remaining silent.
 
As for the Fourth Amendment itself, the Supreme Court has yet to meaningfully clarify its 1985 declaration that the Fourth’s “balance of reasonableness is qualitatively different at the international border than in the interior.” In practice, this means warrantless searches of your devices coming through customs are allowed. Among the many unanswered questions: what constitutes a “routine” search?
 
Is the biometric vs. passcode distinction a completely absurd technicality straight out of Monty Python? You bet your sweet privacy it is. But it’s also a gray area of unsettled law, so technicalities are currently one of our last defenses against this particular strain of government intrusion.


Frankenstein Needs a New Pair of Shoes

4/14/2025

 

And He May Steal Part of Your Identity to Buy Them

​There’s a relatively new twist in identity theft – synthetic identity theft, meaning the individual elements of the fake identity are either stolen from multiple victims or fabricated. Because none of the pieces are from the same victim, it’s like building a new person out of the spare parts of others – hence Frankenstein.
 
What’s the appeal? From the fraudster’s perspective, the Frankenstein approach offers numerous advantages over traditional identity theft (where a single, real person’s whole identity is stolen). The two biggest advantages are:

  1. It’s much harder to detect: Because the identity doesn’t belong to a single real person, there’s no one to notice suspicious activity right away. Automated alerting tools may miss these synthetic identities too, especially if some of the data used is associated with someone whose credit file is inactive (like a child, an elderly person, even the homeless – all people who are highly unlikely to check the status of their credit).
    ​
  2. Frankensteins build their own credit: Fraudsters using these accounts may apply for small lines of credit and pay on time, slowly raising the profile’s creditworthiness before eventually going after the bigger prizes. Along the way, no one complains to the bank or credit bureaus, because the theft is so fragmented that no one notices. In the end, it’s about patience and playing the long game. Scammers taking that approach are far more likely to succeed.
 
CNET’s Neal O’Farrell says the way to watch out for this kind of identity theft is to keep an eye on your Social Security Number. Phone numbers and addresses can change; SSNs are static. So if Frankenstein’s SSN happens to be yours, well, you get the picture. O’Farrell specifically recommends these steps:

  1. Freeze your credit reports. It’s both free and easy to do. And if you need to temporarily unfreeze your credit reports for a legitimate reason, that’s even easier.

  2. Monitor your SSN. And the best way to do that is to create a “my Social Security” account. The service launched in 2012 and was recently beefed up from a credentials standpoint. Set a reminder on your calendar and check in once a month. It’s not proactive, so it won’t alert you, but checking it regularly allows you to see whether the activity associated with your SSN looks normal. And creating an account yourself prevents anyone else from doing it, so there’s that.
    ​
  3. Check your credit reports regularly. Weekly would be nice. Note that we’re not talking about checking your credit SCORE, we’re talking about checking your credit REPORTS. There are multiple ways to go about this. CNET explains the options.
 
Finally, consider the Federal Reserve’s toolkit devoted to this subject: the Synthetic Identity Fraud Mitigation Toolkit. Aimed primarily at businesses and the payments industry, it contains plenty of information of value to any audience, including individuals and families. We asked them to rename it the “Frankenstein Identity Fraud Mitigation Toolkit,” but you can imagine how that went.
 
File all of the above under the folder named “Reality, New.” We agree that it’s something of a pain, but ultimately it’s just about forming a few new habits.


Traveling Abroad This Summer? Think Twice about Bringing Devices

4/14/2025

 

The ACLU’s Updated Travel Advice with Privacy in Mind

​Traveling with electronic devices this summer? Of course you are.
 
Would you like those devices searched by federal agents? Of course not.
 
Think the Fourth Amendment protects you from such searches? Think again, says the ACLU.
 
As we’ve written previously, U.S. ports of entry are twilight zones where the Fourth Amendment is more of a suggestion than a right. Having monitored this issue for years, the ACLU recently updated their advice for travelers. Here’s a summary version from the ACLU:

  • Limit devices and data: Don’t take them if you don’t need them, or consider travel-only devices that don’t contain sensitive information.
  • Encrypt & ship: Instead of packing them, ship them, but realize that Customs and Border Protection (CBP) reserves the right to search international packages – so encrypt them. A forensic search could get around some encryption, but it’s still good to play defense.
  • Encryption is the new reality: While we’re on the subject, adjust your thinking and embrace encryption as something you need to do – not just something for the techie-crypto crowd. Here’s a guide recommended by the ACLU. There are many others, of course.
  • Is this thing on? Active devices are suspicious devices. If you insist on traveling with them (see above), turn them off when crossing. If they must be on (not recommended), use airplane mode only.
  • Leave it in the cloud: Get out of the habit of storing sensitive or private information on your devices. Use end-to-end encrypted cloud-storage accounts instead. Then disable those apps for traveling and delete the caches. CBP claims it is against policy for border agents to search cloud-stored data on electronic devices. Fingers crossed.
  • About those photos: It’s cloud time again. Digital cameras don’t offer encrypted storage, so upload and delete if you’re worried. And, yes, digital cameras are considered electronic devices.
  • “Privileged” travelers? We mean actual privilege, as in the attorney-client kind. Do not volunteer this information, but if agents announce they’re going to search, first let them know the device contains privileged material. In such cases, CBP is supposed to follow special procedures that set a higher bar. Again, fingers crossed.
​
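For readers wondering what “encrypt before you ship or upload” looks like in practice, here is one minimal sketch using the widely installed OpenSSL command-line tool. This is our illustration, not a tool the ACLU specifically endorses; any reputable encryption utility (or full-disk encryption like FileVault, BitLocker, or LUKS) serves the same goal.

```shell
# Stand-in for a photo or document you'd rather not carry in plaintext.
echo "itinerary and contacts" > trip.txt

# Encrypt with AES-256, deriving the key from a passphrase via PBKDF2.
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in trip.txt -out trip.txt.enc \
  -pass pass:'use a long, unique passphrase'

# Remove the plaintext; ship or upload only trip.txt.enc.
rm trip.txt

# At your destination, decrypt with the same passphrase.
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in trip.txt.enc -out trip.txt \
  -pass pass:'use a long, unique passphrase'
```

The passphrase is the weak link: keep it in a password manager, not on the device being shipped or searched.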
CBP agents can’t force you to do anything (surrender a password, for example), but if you lock horns, you’d better be prepared to stay at the airport awhile, or at least to say goodbye to your electronic devices for weeks or even months.
 
This is all a pain. But the better strategy is to plan ahead.

