Imagine this scenario: It’s early evening, and you and your special someone are on the couch preparing to binge-watch your favorite streaming show.
Ding-dong. You answer the door and, as you hoped, it is the dinner delivery person. He hands you your prepaid, pre-tipped meal and you start to shut the door when the delivery worker puts his foot down, blocking you. He snaps a picture over your shoulder and asks: “Why is the wall over your couch bare? It should have a picture of the Dear Leader. I now have no choice but to report you.” This fantastical scenario of a police state enlisting food delivery workers as auxiliary police is taking place, for real, in the People’s Republic of China, according to disturbing reports from Radio Free Asia. Beijing recently posted a directive: “We will hire a group of online delivery personnel with a strong sense of responsibility to serve as part-time social supervisors and encourage them to take part in grassroots governance through snapshots and snap reports …” Radio Free Asia reports that this program is being expanded in China’s annexed territory of Tibet, where food delivery workers are being recruited to perform “voluntary patrol and prevention work.” In addition, Chinese police are requiring Tibetans to revise their personal passwords on their social media accounts, link them to their personal cellphones and identity cards, and make it all accessible to the government. Police are also stopping Tibetans in Lhasa to check their cellphones for virtual private networks, or VPNs, that allow users to get around the “Great Firewall of China,” the government’s restrictive controls on the internet. We can shake our heads and laugh. But the fundamental principle of co-opting private-sector industries for internal surveillance is one that is gaining purchase in our own country. The federal government isn’t so crude as to turn the Domino’s pizza delivery guy into a spy. But federal agencies can extract Americans’ personal data from FISA Section 702, even though this program was enacted by Congress not to spy on Americans, but to surveil foreign threats on foreign soil. Prosecutors in the United States can extract information about witnesses and criminal defendants from telecoms and service providers of emails, cloud computing, and online searches, then gag those same companies with a non-disclosure order, which keeps them from ever informing their customers they were surveilled. The good news is that more and more Members of Congress are awakening to the threat of a home-grown American surveillance state. The recent reauthorization of Section 702 sets up a debate over the reach of this program in early 2026. The House passed a measure called the NDO Fairness Act, which would limit non-disclosure orders, putting the onus on the Senate to follow suit. The field of surveillance is one area in which public-private partnerships can go very wrong. Unlike China, however, America is still a democracy with a Congress that can counter expansive government threats to our privacy. The U.S. Supreme Court will almost certainly take up and resolve two opposing – some would say extreme – rulings by the Fourth and Fifth Circuit Courts of Appeals on the Fourth Amendment implications of geofence searches.
The Fourth Circuit ruled that geofence warrants – which search the mobile devices of many people in designated areas – have no Fourth Amendment implications. The Fifth Circuit ruled that geofence warrants are inherently unconstitutional. This is the Grand Canyon of circuit splits. At stake are not just geofence warrants, but conceivably almost every kind of automated digital search conducted by the government. At stake, too, is the very meaning and viability of the Fourth Amendment in the 21st century. We had previously reported on the gobsmacking ruling of the Fourth Circuit in July that held that a geofence warrant to identify a bank robber within a 17.5-acre area – including thousands of innocent people living in apartments, at a nursing home, eating in restaurants, and passing by – did not implicate the privacy rights of all who were searched. In United States v. Chatrie, the court held in a split opinion that this mass geofence warrant had no Fourth Amendment implications whatsoever. In doing so, the Fourth Circuit reversed a well-reasoned opinion by federal Judge Mary Hannah Lauck, who wrote that citizens are almost all unaware that Google logs their location 240 times a day. Judge Lauck wrote: “It is difficult to overstate the breadth of this warrant.” The same overbreadth can be seen, in a very different context, in the Fourth Circuit’s jettisoning of the Fourth Amendment in its reversal. Now the Fifth Circuit Court of Appeals has weighed in on a similar case, United States v. Jamarr Smith. The Fifth came to the opposite conclusion – that geofence warrants cannot be reconciled with the Fourth Amendment. Orin Kerr of the UC Berkeley School of Law argues that the Fifth’s ruling conflicts with Supreme Court precedent, including Carpenter v. United States, in which the Court held that the government needs a warrant to extract cellphone location data. Kerr also asserts that warrants in which a suspect’s identity is not known at the beginning of a search (indeed, that is the very reason for these kinds of searches) reflect a well-established practice recognized by the Supreme Court, despite their seeming lack of particularity. Jennifer Granick and Brett Max Kaufman of the American Civil Liberties Union push back against Kerr, arguing that the digital inspection of the data of large numbers of people to identify a needle-in-a-haystack suspect is, indeed, a “general warrant” forbidden by the Constitution. They write: “Considering the analog equivalents of this kind of dragnet helps explain why: For example, police might know that some bank customers store stolen jewelry in safe deposit boxes. If they have probable cause, police can get a warrant to look in a particular suspect’s box. But they cannot get a warrant to look in all the boxes – that would be a grossly overbroad search, implicating the privacy rights of many people as to whom there is no probable cause.” The implications of this circuit split are staggering. If the Fourth Circuit ruling prevails, anything goes in digital search. If the Fifth Circuit’s ruling prevails, almost any kind of digital search will require a probable cause warrant that has the particularity the Constitution clearly requires. There will be no way for the U.S. Supreme Court to reconcile these opposite takes on digital warrants. It will be up to the Court to set a governing doctrine, one that examines at its root what constitutes a “search” in the context of 21st-century digital technology. Let us hope that when it does so, the Supreme Court will lean toward privacy and the Fourth Amendment.
“Quiet Skies” is a federal aviation security program that includes singling out flyers for close inspection by giving them an “SSSS” or “Secondary Security Screening Selection” designation on their boarding pass. In the case of Tulsi Gabbard, it is alleged that she was also put on a “terror threat list” that subjects her to intense surveillance.
You probably know Gabbard as an outspoken and iconoclastic former U.S. Representative from Hawaii who ran for president. During a slew of domestic flights after returning from a recent trip to Rome, Gabbard and husband Abraham Williams were allegedly designated as security threats requiring enhanced observation. An Iraq War veteran who enlisted after 9/11, Gabbard told Matt Taibbi of The Racket that she and her husband are getting third-degree inspections every time they go to the airport. Every inch of her clothes is squeezed. The lining of her roller-board suitcase is patted down. Gabbard has to take out every personal electronic device and turn on each one, including her military-issue phone and computer. This process can take up to 45 minutes. What may be happening in the air is far more worrisome. Sonya LaBosco, executive director of the advocacy group Air Marshals National Council, is the source who told Taibbi that Gabbard is on the TSA’s domestic terror watch list. Every time someone on that list travels, LaBosco said, that passenger gets assigned two Explosive Canine Teams, one Transportation Security Specialist in explosives, and one plainclothes TSA Supervisor. Such passengers are assigned three Federal Air Marshals to travel with them on every flight. LaBosco says that Gabbard’s recent three-flight tour would have required no fewer than nine Air Marshals to tail her and her husband. Taibbi writes that an Inspector General’s report in 2019 revealed that one-half of the Air Marshals’ budget is wasted, and that much of the $394 million in funds for air security is put to questionable use. In our personal experience, the “SSSS” designation can be randomly assigned. Judging from publicly available sources, that designation can also be algorithmically triggered by a host of activities deemed suspicious, such as flying out of Turkey, paying cash for plane tickets, and buying one-way tickets, as sketched below. (We can only imagine what would happen to the brave or foolhardy person who bought a one-way ticket out of Istanbul with cash.) To be fair, many complaints about the TSA that seem absurd have a basis in hard experience. That experience goes back to 1986, when an extra-close inspection by El Al security officers of a pregnant Irish woman flying to Israel revealed that her Jordanian fiancé had betrayed her by secreting a bomb in her bag. TSA has to contend with the fact that anyone – a decorated war hero, a handicapped grandmother, a toddler – could be the unknowing carrier of a threat. But the treatment of Gabbard raises the unavoidable question of whether this outspoken political figure was put on the SSSS list out of political pique. Gabbard has certainly irritated a lot of powerful people and agencies. In Congress, she advocated for dropping charges against Edward Snowden. As vice chair of the Democratic National Committee in 2016, she publicly criticized the party’s reliance on superdelegates and endorsed Bernie Sanders over Hillary Clinton. She later left the Democratic Party and was recently on the list of Donald Trump’s possible vice-presidential candidates. She has been a consistent critic of “elites” who want “nation-building wars.” Gabbard found herself on the threat list just after she left Rome, where she had called Vice President Kamala Harris “the new figurehead for the deep state.” You might find Gabbard insightful or a flittering gadfly, but no one should be targeted for surveillance for merely expressing controversial views.
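To make the idea of such opaque triggers concrete, here is a minimal, purely hypothetical sketch of rule-based screening flags. The factors and threshold below are illustrative assumptions drawn from the publicly reported examples above; they do not describe the actual Quiet Skies or SSSS criteria, which are not public.

```python
# Hypothetical illustration only: a toy rule-based flagging scheme.
# The factors and threshold are assumptions, not the real TSA criteria.
from dataclasses import dataclass

@dataclass
class Booking:
    origin_country: str   # country the itinerary departs from
    one_way: bool         # one-way ticket?
    paid_cash: bool       # ticket purchased with cash?

def flag_for_secondary_screening(b: Booking) -> bool:
    """Return True if enough 'suspicious' factors stack up."""
    score = 0
    if b.origin_country == "Turkey":  # example departure point deemed suspicious
        score += 1
    if b.paid_cash:                   # cash purchase
        score += 1
    if b.one_way:                     # one-way itinerary
        score += 1
    return score >= 2                 # arbitrary threshold for an "SSSS" flag

# A traveler who merely pays cash for a one-way ticket out of Istanbul
# would trip this toy algorithm, with no individualized suspicion required.
print(flag_for_secondary_screening(Booking("Turkey", one_way=True, paid_cash=True)))  # True
```

The point of the sketch is not the specific rules but the structure: a handful of innocuous facts, scored mechanically, can mark a traveler for intrusive screening with no human judgment in the loop.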
And if Gabbard did somehow inadvertently trigger a threat algorithm, one has to wonder whether anyone in charge has the authority to apply common sense – if, in fact, such vast resources are being deployed to follow her. If that is true, even the most benign explanation reveals a diversion of manpower (and dogpower) that could be used to deter real threats. A Congressional investigation – perhaps by the Weaponization of the Federal Government subcommittee – is warranted to discover whether the facts reported by Taibbi are correct and, more importantly, whether Gabbard has been targeted for enhanced surveillance and harassment for her speech. After all, crazier things have happened, like Matt Taibbi finding himself targeted with a rare home visit from the IRS on the same day the journalist testified before Congress about federal meddling in social media curation. Police have access to more than 71,000 surveillance cameras in New York City, and to more than 40,000 cameras in Los Angeles.
This technology is rapidly becoming ubiquitous from coast to coast. As it does, civil libertarians are shifting from outright opposition to public surveillance cameras – which increasingly seems futile – to advocating for policy guardrails that protect privacy. That American cities are going the way of London, where police cameras are on every street corner, is undeniable. The Harvard Crimson reports that Cambridge, Massachusetts, is one of the latest cities to debate whether to allow police to deploy a network of surveillance cameras. The Cambridge Police Department was on the verge of installing highly visible cameras that would surveil the city’s major parks and even Harvard Yard when the city council suspended a vote after hearing from a prominent civil rights attorney. Even then, Emiliano Falcon-Morano of the Technology for Liberty Program at the Massachusetts ACLU seemed to bow to the inevitability of cameras. He recommended that this technology not be installed until the “police department addresses several important questions and concerns to ensure that it is deployed in a manner that conforms with civil rights and civil liberties.” In Philadelphia Dana Bazelon, a former criminal defense attorney and frequent critic of police intrusions into privacy, is now advocating the expansion of surveillance cameras. As an advisor to the Philadelphia district attorney, Bazelon sees police cameras as the only way to stem gun violence. This turnabout prompted Reason’s J.D. Tuccille to accuse Bazelon of discarding “concerns about government abuses to endorse a wide-reaching surveillance state.” Tuccille notes how much easier surveillance cameras may make the job of policing. He archly quotes Charlton Heston’s character in Touch of Evil, “A policeman’s job is only easy in a police state.” The argument in favor of public surveillance cameras is that when we step into the public square, we can expect to lose a degree of privacy. After all, no law keeps an officer on patrol from glancing our way. What’s so bad about being seen by that same officer through a lens? The answer, simply, is that camera networks do more than see. They record and transform faces into data. That data, combined with facial recognition software, with cellsite simulators that record our movements by tracking our cellphone location histories, with social media posts that log our political views, religious beliefs, and personal lives, brings us to within spitting distance of a police state. It is out of this concern that the Electronic Frontier Foundation has helpfully provided Americans with the ability to see the surveillance mechanisms unfolding in their communities through its Street Level Surveillance website. Yet, whether we like it or not (and we don’t like it), ubiquitous camera surveillance by the police in almost every city is coming. It is coming because public surveillance is useful in solving so many crimes. As city leaders temporarily shelved the police surveillance proposal in Cambridge, a man in New York City was freed after serving 16 years in prison, exonerated by evidence from old surveillance footage. Arvel Marshall was railroaded in 2016 by a Brooklyn prosecutor who sat on the exonerating tape, which clearly showed someone else committing the murder for which Marshall was convicted. There is no denying that, when the images are clear, surveillance footage can provide irrefutable identification of a criminal (or not, as in Marshall’s case). 
But the flip side is that the same technology, once it becomes networked and seamlessly integrated by AI, will give the powerful the means to track Americans with little more than a snap of the fingers or a click of the mouse – not just criminals, but protestors, political groups, journalists, and candidates. As this new reality unfolds, questions emerge. How will police surveillance data be stored? How secure will it be from hackers? How long will it be kept? Will it be networked with other forms of tracking, such as our purchased digital data, and combined by AI into total personal surveillance? Will this data be used to follow not just potential terrorists but Americans with criminal records in a predictive effort at “precrime”? Should technology be deployed that anonymizes the faces of everyone on a tape, with deanonymization or unmasking only at the hands of an authorized person? Should a warrant be issued to watch a given crime or to unmask a face? The terms of this new debate are changing as technology evolves at fast forward. But it is not too early to ask these questions and debate new policies, city by city, as well as in Congress. U.S. intelligence agencies justify tens of thousands of warrantless backdoor searches of Americans’ communications by claiming an exception to the Fourth Amendment for “defensive” purposes.
In testimony to Congress, FBI Director Christopher Wray has said that such defensive searches are absolutely necessary to protect Americans in real time who may be potential victims of foreign intelligence agents or cyberattacks. On this basis, the FBI and other agencies every year conduct tens of thousands of warrantless “backdoor” searches of Americans’ communications with data extracted from programs authorized by FISA Section 702 – even though this program was enacted by Congress not to spy on Americans, but to authorize U.S. agencies to surveil foreign spies and terrorists located abroad. Noah Chauvin, Assistant Professor of Law at Widener University School of Law, in a 53-page paper neatly removes every leg of the government’s argument. He begins with the simple observation that there is no “defensive” exception in the Fourth Amendment. Indeed, an analogous claimed exception for “community caretaking” was rejected by the U.S. Supreme Court in the 2021 decision on Caniglia v. Strom, holding that the government could not enter a home without a warrant based on the simple, non-exigent claim that the police needed to check on the homeowner’s well-being. Whether for community caretaking or for surveillance, the “we are doing this for your own good” excuse does not override the Fourth Amendment. In surveillance, the lack of constitutional validity makes the government’s position “a political argument, not a legal one.” Chauvin adds: “It would be perverse to strip crime victims of the Fourth Amendment’s privacy protections – a person should not lose rights because they have been violated.” It is apparently on the basis of such a “defensive search,” for example, that the FBI violated the Fourth Amendment rights of Rep. Darin LaHood (R-Ill). In that case, the FBI was concerned that Rep. LaHood was being unknowingly targeted by a foreign power. If the FBI can secretly violate the rights of a prominent and respected Member of Congress, imagine how blithely it violates your rights. While making these sweeping claims of violating the Fourth Amendment to protect Americans, “the government has provided almost no public information about how these defensive backdoor searches work.” Chauvin adds: “The government has claimed it uses backdoor searches to identify victims of cyberattacks and foreign influence campaigns, but has not explained how it does so, saying only that backdoor searches have ‘contributed to’ or ‘played an important role in’ intelligence services.” Also unexplained is how the government identifies potential American victims, or why it searches for victims instead of potential perpetrators. Nor does it reveal its success rate at identifying potential victims and how that compares to traditional methods of investigation. Finally, Chauvin asks: “Would obtaining permission before querying a victim compromise the investigation?” It is a matter of settled law that any American can give informed consent to waive his or her Fourth Amendment rights. “It seems particularly likely,” Chauvin writes, “that would-be victims will grant the government permission to perform defensive backdoor searches.” One can easily imagine a long list of companies – from hospitals to cloud providers – that would grant such blanket permission. So why not just do that? Finally, Chauvin appeals to Congress not just to remedy this backdoor search loophole for Section 702. He proposes closing this loophole for Americans’ digital data that U.S. 
intelligence and law enforcement agencies purchase from third-party data brokers, as well as for Executive Order 12333, a non-statutory surveillance authority claimed by the executive branch. At the very least, Congress should demand answers to Chauvin’s questions about how defensive searches are used and how they work. He concludes, “the government’s policy preferences should never override Americans’ constitutional rights.” Earlier this year, students in a high school art class were called into a meeting with administrators to defend the contents of their art portfolios.
This happened after Lawrence High School in Lawrence, Kansas, signed a $162,000 contract with Gaggle safety software to review all student messages and files for issues of concern. Gaggle had flagged the digital files of the students’ art portfolio for containing nudity. The students vehemently protested that there was no nudity at all in their work. But it was a hard case to make considering that the files had already been removed from the students’ accounts, so the student artists themselves couldn’t refer to them. Max McCoy, a writer with the nonprofit news organization The Kansas Reflector, wrote that if you’re a Lawrence High student, “every homework assignment, email, photo, and chat on your school-supplied device is being monitored by artificial intelligence for indicators of drug and alcohol use, anti-social behavior, and suicidal inclinations.” The same is true of many American high schools from coast to coast. Gaggle claims to have saved an estimated 5,790 student lives from suicide between 2018 and 2023 by analyzing 28 billion student items and flagging 162 million for review. McCoy took a hard look at this incredibly specific number of lives saved, finding it hard to validate. Simply put, Gaggle counts each incident of flagged material that meets all safety criteria as a saved life. Still, it is understandable that school administrators would want to use any tool they could to reduce the potential for student suicide (the second-leading cause of death among Americans aged 15-19), as well as reduce the threat of school violence that has plagued the American psyche for decades now. But is an artificial surveillance regime like Gaggle the way to do it? McCoy likens Gaggle to the science-fictional “precrime” technology in the Philip K. Dick novel and Steven Spielberg movie Minority Report. But could Gaggle technology in its actual use be more like the utterly dysfunctional totalitarian regime depicted in the classic movie Brazil? McCoy reports that a cry for help from one student to a trusted teacher was intercepted and rerouted to an administrator with whom the student had no relationship. The editors of the Lawrence student paper, The Budget, are concerned about Gaggle’s intrusion into their newsgathering, notes, and other First Amendment-protected activities. McCoy quotes RAND researchers who recently wrote, “we found that AI based monitoring, far from being a solution to the persistent and growing problem of youth suicide, might well give rise to more problems than it seeks to solve.” It is one thing to keep tabs on student attitudes and behavior. Blanketing every student message and file with spyware looks pointlessly excessive. Worse, it trains the next generation of Americans to be inured to a total surveillance state.
The Quick Unlocking of Would-Be Trump Assassin’s Phone Reveals Power of Commercial Surveillance (7/18/2024)
Since 2015, Apple’s refusal to grant the FBI a backdoor to its encrypted software on the iPhone has been a matter of heated debate. When William Barr was the U.S. Attorney General, he accused Apple of failing to provide “substantive assistance” in the aftermath of mass shootings by helping the FBI break into the criminals’ phones.
Then, in 2020, the FBI announced it had broken into an Apple phone in just such a case. Barr said: “Thanks to the great work of the FBI – and no thanks to Apple …” Clearly, the FBI had found a workaround, though it took the bureau months to achieve it. Gaby Del Valle in The Verge offers a gripping account of the back-and-forth between law enforcement and technologists resulting, she writes, in the widespread adoption of mobile device extraction tools that now allow police to easily break open mobile phones. It was known that this technology, often using Israeli-made Cellebrite software, was becoming ever-more prolific. Still, observers did a double-take when the FBI announced that its lab in Quantico, Virginia, was able, in just two days, to break into the phone of Thomas Matthew Crooks, who tried to assassinate former President Trump on Saturday. More than 2,000 law enforcement agencies in every state had access to such mobile device extraction tools as of 2020. The most effective of these tools cost between $15,000 and $30,000. It is likely, as with cell-site simulators that can spoof cellphones into giving up their data, that these phone-breaking tools are purchased by state and local law enforcement with federal grants. We noticed recently that Techdirt reported that for $100,000 you could have purchased a cell-site simulator of your very own on eBay. The model was old, vintage 2004, and is not likely to work well against contemporary phones. No telling what one could buy in a more sophisticated market. The takeaway is that the free market created encryption for customer safety, privacy, and convenience. The ingenuity of technologists responding to market demand from government agencies is now being used to tear down consumer encryption, one of their greatest achievements.
Activists on the left and the right have long suspected the FBI uses surreptitious means to spy on lawful protests and speech. Those suspicions were confirmed when a FISA court decision released in 2022 revealed that government investigators had used the Section 702 global database to surveil all 19,000 donors to a single Congressional campaign. Acting on this concern, PPSA submitted a FOIA request to the FBI in February seeking all records discussing the use of Section 702 or other FISA authorities to surveil, collect information related to, or otherwise investigate anyone who attended:
The FBI almost immediately responded to PPSA that our FOIA request “is not searchable” in the FBI’s “indices.” The response also informed us that the FBI “administratively closed” our request. The FBI did not dispute that PPSA’s FOIA request reasonably described the requested records. This should have, under the FOIA statute, triggered a search requirement, but the FBI ignored it. The self-serving excuse pointing to the limitations of the FBI’s Central Records System overlooks the plentiful databases and search methods at the fingertips of one of the world’s premier investigative organizations. After a fruitless appeal to the Department of Justice’s Office of Information Policy, exhausting its administrative remedies, PPSA is now suing in the U.S. District Court for the District of Columbia to compel the FBI to produce these documents. We’ll keep you informed of any major developments. We’ve long recounted the bad news on law enforcement’s use of facial recognition software – how it misidentifies people and labels them as criminals, particularly people of color. But there is good news on this subject for once: the Detroit Police Department has reached a settlement – one that many civil libertarians are hailing as a new national standard for police – with a man falsely arrested on the basis of a bad match from facial recognition technology (FRT).
The list of injustices from false positives from FRT has grown in recent years. We told the story of Randall Reid, a Black man in Georgia, arrested for the theft of luxury goods in Louisiana. Even though Reid had never been to Louisiana, he was held in jail for a week. We told the story of Porchia Woodruff, a Detroit woman eight months pregnant, who was arrested in her driveway while her children cried. Her purported crime was – get this – a recent carjacking. Woodruff had to be rushed to the hospital after suffering contractions in her holding cell. Detroit had a particularly bad run of such misuses of facial recognition in criminal investigations. One of them was the arrest of Robert Williams in 2020 for the 2018 theft of five watches from a boutique store in which the thief was caught on a surveillance camera. Williams spent 30 hours in jail. Backed by the American Civil Liberties Union, the ACLU of Michigan, and the University of Michigan Civil Rights Litigation Initiative, Williams sued the police for wrongful arrest. In an agreement blessed by a federal court in Michigan, Williams received a generous settlement from the Detroit police. What is most important about this settlement agreement are the new rules Detroit has embraced. From now on:
Another series of reforms imposes discipline on the way in which lineups of suspects or their images unfold. When witnesses perform lineup identifications, they may not be told that FRT was used as an investigative lead. Witnesses must report how confident they are about any identification. Officers showing images to a witness must themselves not know who the real suspect is, so they don’t mislead the witness with subtle, non-verbal clues. And photos of suspects must be shown one at a time, instead of showing all the photos at once – potentially leading a witness to select the one image that merely has the closest resemblance to the suspect. Perhaps most importantly, Detroit police officers will be trained on the proper uses of facial recognition and eyewitness identification. “The pipeline of ‘get a picture, slap it in a lineup’ will end,” Phil Mayor, a lawyer for the ACLU of Michigan, told The New York Times. “This settlement moves the Detroit Police Department from being the best-documented misuser of facial recognition technology into a national leader in having guardrails in its use.” PPSA applauds the Detroit Police Department and ACLU for crafting standards that deserve to be adopted by police departments across the United States. State financial officials in 23 states have fired off a letter to House Speaker Mike Johnson expressing strong opposition to a new Securities and Exchange Commission program that grants 3,000 government employees real-time access to every equity trade, option trade, and quote from every account of every investor at every broker.
“Traditionally, Americans’ financial holdings are kept between them and their broker, not them, their broker, and a massive government database,” the state auditors and treasurers wrote. “The only exception has been legal investigations with a warrant.” The state financial officers contend that the SEC's move undermines the principles of federalism by imposing a one-size-fits-all solution without considering the unique regulatory environments of individual states. They asked Speaker Johnson to support a bill sponsored by Rep. Barry Loudermilk (R-GA), the Protecting Investors' Personally Identifiable Information Act. This proposed legislation would restrict the SEC's ability to collect and centralize such vast amounts of personal financial data. As is so common with recent efforts at financial surveillance, the SEC justifies this data collection as necessary to combat insider trading and market manipulation, and to identify suspicious activities. Similar excuses are offered for the new “beneficial ownership” requirement that is forcing millions of Americans who own small businesses to send the ownership details of their businesses to the Financial Crimes Enforcement Network (FinCEN) of the U.S. Treasury. But such increased vigilance comes at the expense of the privacy of millions of Americans. The sheer volume of data accessible to government employees raises concerns about potential misuse and unauthorized access. “The Securities and Exchange Commission has been barreling forward with a new system – the Consolidated Audit Trail (CAT) – which tracks every trade an individual investor makes and links it to their identity through a centralized system,” Rep. Loudermilk said. “Not only is collecting all this information unnecessary, regulators already have similar systems that don’t easily match identities with transactions, but it also creates another security vulnerability and a target for hackers.” The SEC assures lawmakers that strict safeguards are in place, but recent high-profile hacks give little reason for confidence. All the more reason for Speaker Johnson to give Rep. Loudermilk’s bill a big push on the House floor. As the adoption of Automated License Plate Readers (ALPRs) creates ubiquitous surveillance of roads and highways, the uses and abuses of these systems – which capture and store license plate data – received fresh scrutiny from a Virginia court willing to question Supreme Court precedent.
In Norfolk, 172 such cameras were installed in 2023, generating data on just about every citizen’s movements, available to Norfolk police and shared with law enforcement in neighboring jurisdictions. Enter Jayvon Antonio Bell, facing charges of robbery with a firearm. In addition to alleged incriminating statements, the key evidence against Bell includes photographs of his vehicle captured by Norfolk’s Flock ALPR system. Bell’s lawyers argued that the use of ALPR technology without a warrant violated Bell’s Fourth and Fourteenth Amendment rights, as well as several provisions of the Virginia Constitution. The Norfolk Circuit Court, in a landmark decision, granted Bell's motion to suppress the evidence obtained from the license plate reader. This ruling, rooted in constitutional protections, weighs in on the side of privacy in the national debate over data from roadway surveillance. The court was persuaded that constant surveillance and data retention by ALPRs creates, in the words of Bell’s defense attorneys, a “dragnet over the entire city.” This suppression of evidence has the potential to reframe Fourth Amendment jurisprudence. The Norfolk court considered the implications of the Supreme Court opinion Katz v. United States (1967), which established that what a person knowingly exposes to the public is not protected by the Fourth Amendment. In its decision, the court boldly noted that technological advancements since Katz have expanded law enforcement's capabilities, making it necessary to re-evaluate the consequences for Fourth Amendment protections. The court also referenced a Massachusetts case in which limited ALPR use was deemed not to violate the Fourth Amendment. The Norfolk Circuit Court’s approach was again pioneering. The court found that the extensive network of the 172 ALPR cameras in Norfolk, which far exceeded the limited surveillance in the Massachusetts case, posed unavoidable Fourth Amendment concerns. The Norfolk court also expressed concern about the lack of training requirements for officers accessing the system, and the ease with which neighboring jurisdictions could share data. Additionally, the court highlighted vulnerabilities in ALPR technology, citing research showing that these systems are susceptible to error and hacking. This is a bold decision by this state court, one that underscores the need for careful oversight and regulation of ALPR systems. As surveillance technology continues to evolve, this court’s decision to suppress evidence from a license plate reader is a sign that at least some judges are ready to draw a line around constitutional protections in the face of technological encroachment.
Scholl and Bednarz v. Illinois State Police
We recently reported on the proliferation of automated license plate readers (ALPRs) in Virginia. Now a lawsuit from two Cook County, Illinois, residents makes a Fourth Amendment claim against the growing system of ALPRs. It directly sets out the dangers such systems pose to privacy and constitutional rights.
The suit by plaintiffs Stephanie Scholl and Frank Bednarz against the Illinois State Police highlights the proliferation of license plate readers to the point of near ubiquity – 300 ALPRs across every expressway in Cook County. Calling this “a system of dragnet surveillance,” the plaintiffs write that law enforcement is “tracking anyone who drives to work in Cook County – or to school, or a grocery store, or a doctor’s office, or a pharmacy, or a political rally, or a romantic encounter, or family gathering – every day, without any reason to suspect anyone of anything, and are holding onto those whereabouts just in case they decide in the future that some citizen might be an appropriate target of law enforcement.” As with so many surveillance systems, danger to privacy lies not just in the mere collection of data, but how long it is stored and when and how it is used. The plaintiffs write that when “law enforcement chooses to investigate a citizen’s past movements, the ALPRs feed databases creating a comprehensive map of their travels, recording every time they’ve driven past ISP’s cameras – and indeed every time they’ve driven past cameras in other jurisdictions using the same database.” The vendor for these devices, Vetted Security Solutions, which uses Motorola’s “Vigilant” system, feeds every detected license plate into Vigilant’s Law Enforcement Archival Reporting Network (LEARN) national database, which holds millions of license plate images that allow millions of Americans to be tracked. The good news is that the Illinois State Police only holds its license plate data for 90 days after it is collected. But this agency is not required by law or by Vigilant policy to do so. Every law enforcement customer is allowed to set their own retention limits – or none at all. The result is potentially years’ worth of data held by law enforcement agencies that track the movements of Americans around the country. Add to this all the data that our cars and GPS systems produce, in addition to all the commercial information that is purchased by federal and local agencies, and we begin to get a sense of the scale of warrantless surveillance of Americans. We should be grateful to Scholl and Bednarz for laying out in plain English the danger license plate readers can pose to Americans. This technology is one more tile being set into an enormous mosaic of capabilities, an emerging American panopticon. It is also one more reason to spark a national discussion on what data the government should collect, and the need for warrants to track Americans. Someone has to watch the watchers, and we can all do our part not to let the government gather such dangerous surveillance powers unnoticed and unchallenged. George Orwell wrote that in a time of deceit, telling the truth is a revolutionary act.
Revolutionary acts of truth-telling are becoming progressively more dangerous around the world. This is especially true as autocratic countries and weak democracies purchase AI software from China to weave together surveillance technology to comprehensively track individuals, following them as they meet acquaintances and share information. A piece by Abi Olvera posted by the Bulletin of Atomic Scientists describes this growing use of AI to surveil populations. Olvera reports that by 2019, 56 out of 176 countries were already using artificial intelligence to weave together surveillance data streams. These systems are increasingly being used to analyze the actions of crowds, track individuals across camera views, and pierce the use of masks or scramblers intended to disguise faces. The only impediment to effective use of this technology is the frequent Brazil-like incompetence of domestic intelligence agencies. Olvera writes: “Among other things, frail non-democratic governments can use AI-enabled monitoring to detect and track individuals and deter civil disobedience before it begins, thereby bolstering their authority. These systems offer cash-strapped autocracies and weak democracies the deterrent power of a police or military patrol without needing to pay for, or manage, a patrol force …” Olvera quotes AI surveillance expert Martin Beraja that AI can enable autocracies to “end up looking less violent because they have better technology for chilling unrest before it happens.” Olivia Solon of Bloomberg reports on the uses of biometric identifiers in Africa, which are regarded by the United Nations and World Bank as a quick and easy way to establish identities where licenses, passports, and other ID cards are hard to come by. But in Uganda, Solon reports, President Yoweri Museveni – in power for 40 years – is using this system to track his critics and political opponents of his rule. Used to catch criminals, biometrics is also being used to criminalize Ugandan dissidents and rival politicians for “misuse of social media” and sharing “malicious information.” The United States needs to lead by example. As our facial recognition and other systems grow in ubiquity, Congress and the states need to demonstrate our ability to impose limits on public surveillance, and legal guardrails for the uses of the sensitive information they generate. Every moral person agrees we must fight the sexual abuse of children online. But a renewed push by the Belgian Presidency within the European Union’s executive branch would force all consumers to accept software that would annihilate any semblance of communications privacy. This would be done with government technology that would break end-to-end encryption. (Hat tip to Joe Mullin of EFF.)
In the name of catching those who traffic in Child Sexual Abuse Materials (CSAM), the EU is poised to degrade the ability of anyone to privately communicate. Worse, it could enable illicit and dangerous surveillance by bad actors. The EU had previously proposed scanning the full content of encrypted messages. In what is being sold as a new approach, the executive branch is now offering a tweaked but still problematic approach called “upload moderation.” This proposal would mandate the scanning of hyperlinks and images within encrypted messages. In theory, consumers could refuse to consent to this snooping, but they would be blocked from sharing any further photos or videos. Such coerced consent is, of course, no consent at all. What is lost in this debate is that encryption is a major protector of personal security, human rights, and liberty. In an open letter to the EU, leading civil liberties organizations – including the Center for Democracy & Technology, Mozilla, and the Electronic Frontier Foundation – warn policymakers that such technology would be dangerous “bugs in our pockets.” Such “client-side scanning” pushes surveillance beyond what is shared on the cloud directly to the user’s device. Some trolls already threaten journalists by sending them unwanted CSAM. Dictatorships could use Europe’s system to send innocuous images to dissidents that contain the correct parameters to trigger a CSAM alarm – and then use the results of that alarm to locate that person. Cartels and other criminal gangs could use it to locate witnesses. Experts demonstrate that malevolent agents can manipulate the hash database of such a system to transform it into a risk for physically locating and surveilling individuals. Victims around the world could ironically include women and children hiding in safe houses from abusers and stalkers. CSAM users are despicable criminals who deserve to be ferreted out and punished. But creating a system that eradicates all privacy in electronic communications is not the solution. In the early 1920s revenue agents staked out a South Carolina home the agents suspected was being used as a distribution center for moonshine whiskey. The revenue agents were in luck. They saw a visitor arrive to receive a bottle from someone inside the house. The agents moved in. The son of the home’s owner, a man named Hester, realized that he was about to be arrested and sprinted with the bottle to a nearby car, picked up a gallon jug, and ran into an open field.
One of the agents fired a shot into the air, prompting Hester to toss the jug, which shattered. Hester then threw the bottle into the open field. Officers found that a large fragment of the broken jug and the discarded bottle both contained moonshine whiskey. This was solid proof that moonshine was being sold. But was it admissible as evidence? After all, the revenue agents did not have a warrant. This case eventually wound its way to the Supreme Court. In 1924, a unanimous Court, presided over by Chief Justice (and former U.S. President) William Howard Taft, held that the Fourth Amendment did not apply to this evidence. Justice Oliver Wendell Holmes, writing the Court’s opinion, declared that “the special protection accorded by the Fourth Amendment to the people in their ‘persons, houses, papers and effects,’ is not extended to the open field.” This principle was later extended to exclude any garbage that a person throws away from Fourth Amendment protections. As strange as it may seem, this case about broken jugs and moonshine from the 1920s, Hester v. United States, provides the principle by which law enforcement officers freely help themselves to the information inside a discarded or lost cellphone – text messages, emails, bank records, phone calls, and images. We reported a case in 2022 in which a Virginia man was convicted of crimes based on police inspection of a cellphone he had left behind in a restaurant. That man’s attorney, Brandon Boxler, told the Daily Press of Newport News that “cellphones are different. They have massive storage capabilities. A search of a cellphone involves a much deeper invasion of privacy. The depth and breadth of personal and private information they contain was unimaginable in 1924.” In Riley v. California, the Supreme Court in 2014 held that a warrant was required to inspect the contents of a suspect’s cellphone. But the Hester rule still applies to discarded and lost phones. They are still subject to what Justice Holmes called the rules of the open field. The American Civil Liberties Union, ACLU Oregon, the Electronic Privacy Information Center, and other civil liberties organizations are challenging this doctrine before the Ninth Circuit in Hunt v. United States. They told the court that it should not use the same reasoning that has historically applied to garbage left out for collection and items discarded in a hotel wastepaper basket. “Our cell phones provide access to information comparable in quantity and breadth to what police might glean from a thorough search of a house,” the ACLU said in a posted statement. “Unlike a house, though, a cell phone is relatively easy to lose. You carry it with you almost all the time. It can fall between seat cushions or slip out of a loose pocket. You might leave it at the check-out desk after making a purchase or forget it on the bus as you hasten to make your stop … It would be absurd to suggest that a person intends to open up their house for unrestrained searches by police whenever they drop their house key.” Yet that is the government position on lost and discarded cellphones. PPSA applauds and supports the ACLU and its partners for taking a strong stand on cellphone privacy. The logic of extending special protections to cellphones, which the Supreme Court has held contain the “privacies of life,” is obvious. It is the government’s position that tastes like something cooked up in a still.
State of Alaska v. McKelvey
We recently reported that the Michigan Supreme Court punted on the Fourth Amendment implications in a case involving local government’s warrantless surveillance of a couple’s property with drone cameras. This was a disappointing outcome, one in which we had filed an amicus brief on behalf of the couple.
But other states are taking a harder look at privacy and aerial surveillance. In another recent case, the Alaska Supreme Court in State v. McKelvey upheld an appeals court ruling that the police needed to obtain a warrant before using an aircraft with officers armed with telephoto lenses to see if a man was cultivating marijuana in his backyard at his home near Fairbanks. In a well-reasoned opinion, Alaska’s top court found that this practice was “corrosive to Alaskans’ sense of security.” The state government had argued that the observations did not violate any reasonable expectation of privacy because they were made with commercially available, commonly used equipment. “This point is not persuasive,” the Alaska justices responded. “The commercial availability of a piece of technology is not an appropriate measure of whether the technology’s use by the government to surveil violates a reasonable expectation of privacy.” The court’s reasoning is profound and of national significance: “If it is not a search when the police make observations using technology that is commercially available, then the constitutional protection against unreasonable searches will shrink as technology advances … As the Seventh Circuit recently observed, that approach creates a ‘precarious circularity.’ Adoption of new technologies means ‘society’s expectations of privacy will change as citizens increasingly rely on and expect these new technologies.’” That is as succinct a description of the current state of privacy as any we’ve heard. The court found that “few of us anticipated, when we began shopping for things online, that we would receive advertisements for car seats and burp cloths before telling anyone there was a baby on the way.” We would add that virtually no one in the early era of social media anticipated that federal agencies would use it to purchase our most intimate and sensitive information from data brokers without warrants. The Alaska Supreme Court sees the danger of technology expansion with drones, which it held is corrosive to Alaskans’ sense of privacy. As we warned, drones are becoming ever cheaper, sold with combined sensor packages that can be not only deeply intrusive across a property, but actually able to penetrate into the interior of a home. The Alaska opinion is an eloquent warning that when it comes to the loss of privacy, we’ve become the proverbial frog, allowing ourselves to become comfortable with being boiled by degrees. This opinion deserves to be nationally recognized as a bold declaration against the trend of ever-more expanding technology and ever-more shrinking zones of privacy. Katie King in the Virginian-Pilot reports an in-depth account about the growing dependency of local law enforcement agencies on Flock Safety cameras, mounted on roads and intersections to catch drivers suspected of crimes. With more than 5,000 police agencies across the nation using these devices, the privacy implications are enormous.
Surveillance cameras have been in the news a lot lately, often in a positive light. Local news is filled with footage of murder suspects and porch pirates alike captured on video. The recently released video of a physical attack by rapper Sean “Diddy” Combs on a girlfriend several years ago has saturated media, reminding us that surveillance can protect the vulnerable. The crime-solving potential of license plate readers is huge. Flock’s software runs license plate numbers through law enforcement databases, allowing police to quickly track a stolen car, locate suspects fleeing a crime, or find a missing person. With such technologies, Silver and Amber alerts might one day become obsolete. As with facial recognition technology, however, license plate readers can produce false positives, ensnaring innocent people in the criminal justice system. King recounts the ordeal of an Ohio man who was arrested by police with drawn guns and a snarling dog. Flock’s license plate reader had falsely flagged his vehicle as having stolen tags. The good news is that Flock insists it is not even considering combining its network with facial recognition technology – reducing the possibility of both technologies flagging someone as dangerous. As with so many surveillance technologies, the greater issue with license plate readers is not the technology itself, but how it might be used in a network. “There’s a simple principle that we’ve always had in this country, which is that the government doesn’t get to watch everybody all the time just in case somebody commits a crime – the United States is not China,” Jay Stanley, a senior analyst with the American Civil Liberties Union, told King. “But these cameras are being deployed with such density that it’s like GPS-tracking everyone.” License plate readers could, conceivably, be networked to track everywhere that everyone goes – from trips to mental health clinics, to gun stores, to houses of worship, and to protests. With so many federal agencies already purchasing Americans’ sensitive data from data brokers, creating a national network of drivers’ whereabouts is just one more addition to what is already becoming a national surveillance system. With apologies to Jay Stanley, we are in serious danger of becoming China. As massive databases compile facial recognition, location data, and now driving routes, we need more than ever to head off the combination of all these measures. A good place to start would be for the U.S. Senate to follow the example of the House by passing the Fourth Amendment Is Not For Sale Act. The City of Denver is reversing its previous stance against the use of police drones. The city is now buying drones to explore the effectiveness of replacing many police calls with remote aerial responses. A Denver police spokesman said that on many calls the police department will send drones first, officers second. When operators of drones see that a call was a false alarm, or that a traffic issue has been resolved, the police department will be free to devote scarce resources to more urgent priorities.
Nearby Arapahoe County already has a fleet of 20 such drones operated by 14 pilots. Arapahoe has successfully used drones to follow suspects fleeing a crime, to provide live-streamed video and mapping of a tense situation before law enforcement arrives, and to look for missing people. In Loveland, Colorado, a drone was used to deliver a defibrillator to a patient before paramedics were able to get to the scene. The use of drones by local law enforcement as supplements to patrol officers is likely to grow. And why not? It makes sense for a drone to scout out a traffic accident or a crime scene for police. But as law enforcement builds more robust fleets of drones, they could be used not just to assess the seriousness of a 911 call, but to provide the basis for around-the-clock surveillance. Modern drones can deliver intimate surveillance that is more invasive than traditional searches. They can be packed with cell-site simulator devices to extract location and other data from cellphones in a given area. They can loiter over a home or peek in someone’s window. They can see in the dark. They can track people and their activities through walls by their heat signatures. Two or more cameras combined can work in stereo to create 3D maps inside homes. Sensor fusion between high-definition, fully maneuverable cameras can put all these together to essentially give police an inside look at a target’s life. Drones with such high-tech surveillance packages can be had on the market for around $6,000. As with so many other forms of surveillance, the modest use of this technology sounds sensible, until one considers how many other ways it can be used. Local leaders at the very least need to enact policies that put guardrails on these practices before we learn, the hard way, how drones and the data they generate can be misused. A report by The New York Times’ Vivian Wang in Beijing and one by Tech Policy’s Marwa Sayed in New York describe the twin strategies for surveilling a nation’s population, in the United States as well as in China.
Wang chronicles the move by China’s dictator, Xi Jinping, to round out the pervasive social media and facial recognition surveillance capability of the state by bringing back Mao-era human snitching. Wang writes that Xi wants local surveillance that is “more visible, more invasive, always on the lookout for real or perceived threats. Officers patrol apartment buildings listening for feuding neighbors. Officials recruit retirees playing chess outdoors as extra eyes and ears. In the workplace, employers are required to appoint ‘safety consultants’ who report regularly to the police.” Xi, Wang reports, explicitly links this new emphasis on human domestic surveillance to the era when “the party encouraged residents to ‘re-educate’ purported political enemies, through so-called struggle sessions where people were publicly insulted and humiliated …” Creating a society of snitches supports the vast network of social media surveillance, in which every “improper” message or text can be reviewed and flagged by AI. Chinese citizens are already followed everywhere by location beacons and a national network of surveillance cameras and facial recognition technology. Marwa Sayed writes about the strategy of technology surveillance contained in several bills in New York State. One bill in the state legislature would force the owners of driver-for-hire vehicles to install rear-facing cameras in their cars, presumably capturing private conversations by passengers. Another state bill would mandate surveillance cameras at racetracks to monitor human and equine traffic, watching over people in their leisure time. “Legislators seem to have decided that the cure to what ails us is a veritable panopticon of cameras that spares no one and reaches further and further into our private lives,” Sayed writes. She notes another measure before the New York City Council that would require the Department of Sanitation to install surveillance cameras to counter the insidious threat of people putting household trash into public litter baskets. Sayed writes: “As the ubiquity of cameras grows, so do the harms. Research shows that surveillance and the feeling it creates of constantly being watched leads to anxiety and paranoia. People may start to feel there is no point to personal privacy because you’ll be watched wherever you go. It makes us wary about taking risks and dampens our ability to interact with one another as social creatures.” Without quite meaning to, federal, state, and local authorities are merging the elements of a national surveillance system. This system draws on agencies’ purchases of our sensitive, personal information from data brokers, as well as increasingly integrated camera, facial recognition, and other surveillance networks. And don’t think that organized human snitching can’t come to these shores either. During World War One, the federal government authorized approved citizens to join neighborhood watch groups with badges inscribed with the words, “American Protective League – Secret Service.” At a time when Americans were sent to prison for opposing the war, the American Protective League kept tabs on neighbors, always on the watch for anyone who seemed insufficiently enthusiastic about the war. Americans could be reported to the Department of Justice for listening to Beethoven on their phonographs or checking out books about German culture from the library.
Today, large numbers of FBI and other government employees secretly “suggest” that social media companies remove posts that contain “disinformation.” They monitor social media to track the posts of people the FBI has targeted, whether traditional Catholics or observant Muslims, for signs of extremism. As world tension grows between the United States and China, Russia, Iran, and North Korea, something like the American Protective League might soon be resurrected in response to a foreign policy crisis. Its digital ghost is already watching us.

The House of Representatives on Thursday passed the CBDC Anti-Surveillance State Act, 216-192, a measure sponsored by House Majority Whip Tom Emmer (R-MN) that would prohibit the Federal Reserve from issuing a central bank digital currency (CBDC) that would give the federal government the ability to monitor and control individual Americans’ spending habits.
“A digital dollar could give the FBI and other federal agencies instant, warrantless access to every transaction of any size made between Americans,” said Bob Goodlatte, former congressman and PPSA Senior Policy Advisor. “This would be an alarming and unacceptable invasion of our Fourth Amendment right to privacy. The CBDC Anti-Surveillance State Act takes a critical step to prevent this from happening. We applaud Rep. Emmer for his leadership in protecting Americans against pervasive government surveillance of our financial data.” Perhaps next the House will consider measures to rein in financial surveillance by the U.S. Treasury and the Financial Crimes Enforcement Network (FinCEN). Passage by the House of the CBDC Anti-Surveillance State Act is an encouraging sign that more Members and their constituents are learning about the government’s financial surveillance and are ready to push back.

Suspect: “We Have to Follow the Law. Why Don’t They?”

Facial recognition software is useful but fallible. It often leads to wrongful arrests, especially given the software’s tendency to produce false positives for people of color.
We reported in 2023 on the case of Randall Reid, a Black man in Georgia, arrested and held for a week by police for allegedly stealing $10,000 of Chanel and Louis Vuitton handbags in Louisiana. Reid was traveling to a Thanksgiving dinner near Atlanta with his mother when he was arrested. He was three states and seven hours away from the scene of the crime, in a state in which he had never set foot. Then there is the case of Porcha Woodruff, a 32-year-old Black woman, who was arrested in her driveway for a recent carjacking and robbery. She was eight months pregnant at the time, far from the profile of the carjacker. She suffered great emotional distress, as well as spasms and contractions, while in jail. Some jurisdictions have reacted to the spotty nature of facial recognition by requiring every purported “match” to be evaluated by a large team to reduce human bias. Other jurisdictions, from Boston to Austin and San Francisco, responded to the technology’s flaws by banning its use altogether. The Washington Post’s Douglas MacMillan reports that officers of the Austin Police Department have developed a neat workaround for the ban. Austin police have asked law enforcement in the nearby town of Leander to conduct face searches for them at least 13 times since Austin enacted its ban. Tyrell Johnson, a 20-year-old man who is a suspect in a robbery case due to a facial recognition workaround by Austin police, told MacMillan, “We have to follow the law. Why don’t they?” Other police departments are accused of working around bans by posting “be on the lookout” flyers in neighboring jurisdictions, which critics say are meant to be picked up and run through facial recognition systems by other police departments or law enforcement agencies. MacMillan’s interviews with defense lawyers, prosecutors, and judges revealed the core problem with the use of this technology – facial recognition matches are treated as investigative leads rather than evidence. They told him that prosecutors are not required in most jurisdictions to inform criminal defendants they were identified using an algorithm. This highlights the larger problem with high-tech surveillance in all its forms: improperly accessed data, reviewed without a warrant, can allow investigators to work backwards to incriminate a suspect. Many criminal defendants never discover the original “evidence” that led to their prosecution, and thus can never challenge the basis for their case. This “backdoor search loophole” is the greater risk, whether one is dealing with databases of mass internet communications or facial recognition. Thanks to this loophole, Americans can be accused of crimes but left in the dark about how the cases against them were started.

The federal government’s hunger for financial surveillance is boundless. A central bank digital currency (CBDC) would completely satisfy it. Under a CBDC, all transactions would be recorded, giving federal agencies the means to review any American’s income and expenditures at a glance. Financial privacy would not be compromised: it would be dead.
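To make that concern concrete, here is a minimal, hypothetical sketch in Python. It is not a description of any actual or proposed CBDC design, and every account name in it is invented; it simply illustrates what a single, centrally held ledger of cryptographically linked transaction “blocks” implies: whoever operates the ledger can reconstruct any account’s entire financial history with one query.

```python
# Hypothetical sketch only: a toy, centrally held digital-currency ledger.
# Not a real CBDC design; all account names are invented for illustration.
import hashlib
import json
from datetime import datetime, timezone


class CentralLedger:
    def __init__(self):
        self.blocks = []  # one master record of every transaction

    def record(self, sender: str, recipient: str, amount: float) -> None:
        block = {
            "sender": sender,          # government-linked account ID
            "recipient": recipient,
            "amount": amount,
            "time": datetime.now(timezone.utc).isoformat(),
            # Each block references the hash of the one before it,
            # chaining the records together.
            "prev_hash": self.blocks[-1]["hash"] if self.blocks else "0" * 64,
        }
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        self.blocks.append(block)

    def history(self, account_id: str) -> list:
        # Whoever operates the ledger can pull any account's full
        # income and spending record with a single pass over the chain.
        return [b for b in self.blocks
                if account_id in (b["sender"], b["recipient"])]


ledger = CentralLedger()
ledger.record("citizen-1234", "grocery-store-77", 86.40)
ledger.record("employer-55", "citizen-1234", 2500.00)

for b in ledger.history("citizen-1234"):
    print(b["sender"], "->", b["recipient"], b["amount"], b["time"])
```

The chained hashes make the record tamper-evident, but they do nothing to limit who can read it; in this toy model, as in any centrally operated ledger, visibility is a policy choice, not a technical constraint.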
Federal Reserve Chairman Jerome Powell says this country is “nowhere near” establishing a digital currency. To be sure, such an undertaking would take years. But Nigeria, Jamaica, and the Bahamas already have digital currencies. China is well along in a pilot program for a digital yuan. The U.S. government is actively exploring this as an option. It is not too early to consider the consequences of a digital dollar. Such a digital currency would most likely record transactions in “blocks” linked together by cryptographic algorithms, forming a presumably unbreakable digital ledger maintained across connected computers. Some risks of a CBDC are obvious – from the breaking of “unbreakable” codes by criminals and hostile foreign governments, to the temptation for Washington, D.C., to expand the money supply with a few clicks, making it all the easier to inflate the currency. House Majority Whip Tom Emmer (R-MN) is especially concerned about the privacy implications of a digital currency. “If not designed to be open, permissionless, and private – emulating cash – a government-issued CBDC is nothing more than a CCP-style (Chinese Communist Party) surveillance tool that would be used to undermine the American way of life,” Rep. Emmer said. He is expected to soon reintroduce a bill that would require authorizing legislation from Congress before any central bank digital currency could be enacted. Emmer’s stand is prescient, not premature. From the new requirement for “beneficial ownership” forms by small businesses, to the revelation from House hearings of warrantless, dragnet surveillance through credit card and ATM transactions, the federal government is inventing new ways to track our every financial move. Rep. Emmer is right to head this one off at the pass. PPSA endorses this bill and urges Emmer’s colleagues to pass it into law. A new fiat currency should have the permission of Congress and the American people.

The long back-and-forth between Michigan’s Long Lake Township and Todd and Heather Maxon ended with the Michigan Supreme Court punting on the Fourth Amendment implications of drone surveillance over private property.
An appellate court had held that the township’s warrantless use of a drone three times in 2017 to photograph the Maxons’ property was an unreasonable, warrantless search, constituting a Fourth Amendment violation. PPSA filed a brief supporting the Maxons before the Michigan Supreme Court, alerting the court to the danger of intimate searches of homes and residents by relatively inexpensive drones now on the market. To demonstrate the privacy threat of drones, PPSA informed the court that commercially available drones have thermal cameras that can see beyond what is visible to the naked eye. They can be equipped with animal herd-tracking algorithms that can be repurposed to enhance the surveillance of people. Drones can swarm and loiter, providing round-the-clock surveillance. They can carry lightweight cell-site simulators that prompt the mobile phones of people inside the targeted home to give up data that reveals deeply personal information. Furthermore, PPSA’s brief states that drones “can see around walls, see in the dark, track people by heat signatures, and recognize and track specific people by their face.” PPSA agreed that even ordinary photography from a camera hovering over the Maxons’ property violated, in the words of an appellate court, the Maxons’ reasonable expectation of privacy. But in a unanimous decision, Michigan’s top court was having none of this. It concluded that the exclusionary rule – the judicial doctrine under which unlawfully obtained evidence is excluded or suppressed – generally applies only when law enforcement violates a defendant’s constitutional rights in a criminal case. The justices remanded the case based upon a procedural issue unrelated to the Fourth Amendment question. The Michigan Supreme Court, therefore, declined to address “whether the use of an aerial drone under the circumstances presented here is an unreasonable search in violation of the United States or Michigan Constitutions.” A crestfallen Todd Maxon responded, “Like every American, I have a right to be secure on my property without being watched by a government drone.” The issue between the township and the Maxons was the contention that, behind the shelter of trees, the couple was growing a salvage operation. This violated an earlier settlement agreement the Maxons had made pledging not to keep a junkyard on their five-acre property. Given the potential for drones to use imaging and sensor technology to violate the intimate lives of families, it is all but inevitable that a better – and uglier – test case will come along. If anything, this ruling makes it a virtual certainty.

The Federal Government’s “Beneficial Ownership” Snoop

Millions of small business owners are about to be hit with a nasty surprise. The Corporate Transparency Act, which passed Congress as part of the must-pass National Defense Authorization Act of 2021, goes into effect this year. Advertised as a way to combat money laundering, this new law now requires small businesses to report their “beneficial owners” to the U.S. Department of the Treasury’s Financial Crimes Enforcement Network (FinCEN).
This reporting requirement falls on any small business with fewer than 20 employees, which must reveal its “beneficial owners.” In plain English, this means a small business must give the government the name of anyone who controls or has a 25 percent or greater interest in that business. By Jan. 1, 2025, small businesses must submit the full legal name, date of birth, current residential or business address, and a unique identifier from a government ID for each of their beneficial owners. There are significant privacy risks at stake in this seemingly innocuous law, beginning with the widespread access multiple federal agencies will have to this new database. This law, which covers 32 million existing companies and will suck in an additional 5 million new companies every year, threatens anyone who makes a mistake or files an incomplete submission with up to $10,000 in fines and up to two years in prison. “The CTA will potentially make a felon out of any unsuspecting person who is simply trying to make a living in his or her own lawful business or who is trying to start one and makes a simple mistake for violations,” says the National Small Business Association (NSBA). The “beneficial ownership” provision is one more way for the federal government to break down the walls of financial privacy in its quest to comprehensively track Americans’ finances. Consider another big bill, the recent Infrastructure Investment and Jobs Act of 2021, which requires cryptocurrency transactions of $10,000 or more to be reported to the government within 15 days. Incorrect or missing information may result in a $25,000 fine or five years in prison. In addition, the Cato Institute reports that new regulations under consideration would hold financial advisors accountable to “elements of the Bank Secrecy Act, which currently compels banks to turn over certain financial data to the feds.” It is likely that your financial advisor will soon be required to snitch on you. This undermines the whole concept of a fiduciary, someone who is by law supposed to be loyal to your interests. All of these measures are justified by the quest to track the money networks of criminals, terrorists, and drug dealers. But the data these authorities generate will be available, without a warrant, to the IRS, the FBI, the ATF, the Department of Homeland Security, and just about any agency that wants to investigate you for your personal activities or statements that some official deems suspicious. The CTA’s “beneficial ownership” provision represents a new assertion of federal authority over small business. Since before the Constitution, the regulation of small business has been under the purview of the states. Now Washington is assembling a database with which it can heap new regulations on small businesses regardless of state policies. The NSBA, which is challenging this law in court, estimates that complexities in business ownership will require companies to spend an average of $8,000 a year to comply with this law. NSBA’s challenge is moving forward in federal court with a named plaintiff, Huntsville business owner Isaac Winkles. NSBA and Winkles won summary judgment from Judge Liles Burke of the U.S. District Court for the Northern District of Alabama, who held the beneficial ownership requirement to be unconstitutional because it exceeds the enumerated powers of Congress.
While the government appeals the case to the Eleventh Circuit, FinCEN maintains that it will exclude small businesses from this requirement only if they were members of NSBA on or before March 1. These encroachments are steady, and their champions on the Hill are growing bolder in their pursuit of financial surveillance. The good news is that privacy activists have just acquired 32 million new allies.

Well, that didn’t take long.
A little more than three weeks ago, Congress reauthorized FISA Section 702, a program enacted to authorize foreign intelligence surveillance but one often used by the FBI to snoop on Americans’ communications caught up in the NSA’s global data trawl. Central to that debate was whether Section 702 should be made to conform to the Fourth Amendment’s bar against unreasonable searches. The House and Senate fiercely debated late into the night over whether to reauthorize this flawed program. Supporters said it is vital to national security. Critics said that is no excuse for the FBI’s use of Section 702 to surveil large numbers of Americans in recent years, including sitting Members of the House and Senate, journalists, politicians, a state judge, and 19,000 donors to a congressional campaign. In the House, that debate culminated in a 212 to 212 tie vote. That’s how close advocates came to victory in securing freedom from warrantless government surveillance for law-abiding citizens.

The intelligence establishment and its champions on Capitol Hill won many votes with promises. Their bill codified a list of new internal FBI procedures that they promised would curb any abuses of Americans’ privacy. FBI Director Christopher Wray promised that agents would be “good stewards” who would protect the homeland “while safeguarding civil rights and liberties.” On April 19, the Senate finalized the reauthorization of Section 702 and sent it to President Biden to be signed into law. On April 20, FBI Deputy Director Paul Abbate emailed Bureau employees, stating: “To continue to demonstrate why tools like this [Section 702] are essential, we need to use them, while also holding ourselves accountable for doing so properly and in compliance with legal requirements.” He added, “I urge everyone to continue to look for ways to appropriately use US person queries to advance the mission …”

Wired, which obtained a copy of the memo, quoted Rep. Zoe Lofgren (D-CA), who said that Deputy Director Abbate’s email directly contradicted assertions the FBI had made during the debate over Section 702’s reauthorization. “The deputy director’s email seems to show that the FBI is actively pushing for more surveillance of Americans, not out of necessity but as a default,” Rep. Lofgren said. The FBI reports it has drawn down the number of such U.S. person queries from about 3 million in 2021 to 57,094 in 2023. As Wired notes, however, the FBI’s methodology counts multiple accessings of a single American identifier, such as a phone number, as just one search. And as Wired reports, even the FBI’s proudly touted 98 percent compliance rate with its more stringent rules would still leave it with more than 1,000 violations of its own policies – 2 percent of 57,094 queries is more than 1,100. With the deputy director arrogantly pushing the Bureau to make greater use of Section 702 for the warrantless surveillance of Americans, we can only wonder what the number of U.S. person searches will be in the next few years. Whatever happens, the more than 150 civil liberties organizations, including PPSA, will be back when Section 702 is next up for reauthorization in less than two years. The Constitution’s protections of the people cannot be ignored.