Long Lake Township v. Maxon

If someone sets a tall ladder against your fence, and leans out into your yard to take photographs, would that be a trespass? A court would surely affirm common sense and say that’s a trespass – even if the offender is merely leaning into the space over your property, touching nothing.
What if a nosy neighbor hoisted a GoPro on a long camera stick over your fence? Again, that would be a trespass. But what if you drew a line from the core of the Earth, through your backyard, to a point in outer space 280 miles above the planet? Is all that aerial space above your backyard protected?

This is important because the “trespass test” is essential to how courts determine whether government surveillance should or should not require a probable cause warrant to inspect a citizen’s property. Such questions emerge from the comments of an Associate Justice of the Michigan Supreme Court during oral argument last week in Long Lake Township v. Maxon. This case centers on whether local government should have obtained a probable cause warrant before sending a drone to surveil a five-acre estate for the civil offense of collecting prohibited scrap. Is a drone more like the nosy neighbor, or more like the camera on an airplane, or – given advances in technology – the sensors of a Google Earth satellite?

The counsel for the township told the court: “Google has a tool where you can even draw, if you want to know whether it’s 50 feet from this house to this barn, or 100 feet from this house to this barn. You do that right on the Google satellite imagery. And so given the reality of the world we live in, how can there be a reasonable expectation of privacy in aerial observations of property?”

The government’s argument seems to be that technology is so advanced that privacy is dead. And if privacy is dead, should we scrap the Fourth Amendment as a quaint relic of the Eighteenth Century?

Maxon’s counsel held fast to the idea that Google Earth cannot yet perform the kind of invasive, sensory-rich surveillance that a drone can. He also noted that drones, limited by the FAA to flying under 400 feet, are necessarily low altitude. One Justice reacted to the assertion that, if Google Earth could map a backyard as closely and intimately as a drone, that would be a search.
“Technology is rapidly changing,” the Justice responded. “I don’t think it is hard to predict that eventually Google Earth will have that capacity.”

U.S. Supreme Court case law has held that ordinary photographs taken from fixed-wing aircraft flying in publicly navigable airspace, or from helicopters, do not violate the Fourth Amendment and thus do not require a warrant. Not so, however, for more advanced technology. For example, the Baltimore Police Department flew a plane with military technology developed for occupied Baghdad to take pervasive snapshots of Baltimoreans and their movements across 30 square miles. This technology is extraordinarily robust, able to record and track the movements of thousands of individuals and cars across a whole day. A federal court recognized that such super-sharp, comprehensive imagery necessarily invokes constitutional issues. An ACLU lawsuit against this war-zone surveillance of Americans resulted in the Fourth Circuit Court of Appeals finding the practice to be unconstitutional.

Last week’s Michigan oral argument will likely be seen as another great step forward in the debate over aerial surveillance. At first, the discussion centered on altitude – as if the 400-foot limit of a drone made its closeness (like the nosy neighbor) the decisive factor. But the exchange points to the conclusion that the truly decisive factor is not altitude, but the level of intrusiveness of a given technology. How much information can a drone equipped with facial recognition, heat sensing, and other superhuman sensory capabilities glean from an overflight? Enough, we say, to qualify as a trespass requiring a probable cause warrant. In fact, such drones could gather even more information than an individual physically inspecting a property. Given that the U.S.
military and the CIA already use satellite imagery to identify and follow individuals, it is not a stretch to say that Google Earth or something like it will soon have the ability to pierce the privacy of any domicile or anyone who walks outside. But that does not make such an invasion reasonable or destroy legitimate expectations of privacy. And if such a degree of intrusion into someone’s privacy occurs – whether from a plane equipped with war-zone surveillance technology, or from a Google Earth camera with slightly futuristic capabilities – then that, too, would constitute a trespass requiring a probable cause warrant.

The law already distinguishes between the incidental path of a passenger airplane and a deep search, like that of the Baltimore police aircraft. The same principle should apply to intrusive private conduct. A watcher at sufficient altitude, even without an actual physical presence, could still be considered a trespasser of sorts when peering into someone’s backyard at a level of detail impossible for a passenger on an ordinary overflight. That evolving technology allows intrusive invasions over greater distances does not negate citizens’ “reasonable expectations of privacy” – it just illustrates growing violations of those expectations.

The Michigan court seemed alert to these dangers. Chief Justice Elizabeth T. Clement referred to the reasoning in PPSA’s amicus brief, asking whether a reversal would ultimately uphold the “exclusionary rule,” which bars evidence obtained in violation of the Fourth Amendment. However Maxon is decided, this case will likely be remembered for logically leading to the idea that in aerial surveillance the Fourth Amendment is invoked by the degree of intrusion, not mere altitude.

California Gov. Gavin Newsom signed into law SB 362 – also known as the Delete Act – establishing even more robust online privacy protections in a state already at the vanguard of digital rights.
The Delete Act’s most noteworthy provision establishes a “one-stop-shop” for data removal – essentially an “off-switch” for consumers to request the scrubbing of all their collected online data. The Delete Act requires the creation of this single-point, no-cost deletion mechanism by Jan. 1, 2026. All registered data brokers, in turn, will have to access that website every 45 days to address consumer requests and remove collected data when asked to do so. Under prior law, consumers found it difficult to communicate with around 500 data brokers doing business in California.

The Delete Act will require data brokers to register with the California Privacy Protection Agency (CPPA). It creates a “do not track” list similar to the National Do Not Call Registry. And it enshrines new transparency requirements for brokers, who must now disclose the collection of sensitive information such as precise geolocation data, reproductive health care data, and personal data collected from minors.

This landmark legislation follows on the heels of the California Consumer Privacy Act and the California Privacy Rights Act, which together form the backbone of one of the most protective digital rights regimes in the world. Yet data collection in California has continued unabated in recent years despite these protections, due in large part to the difficulties in opting out. Consumers find there are simply too many players scraping public records, social media profiles and online transactions. These players create digital profiles from our most sensitive personal information and sell it to corporations, advertisers, governments, and law enforcement agencies for the purposes of analyzing, predicting and even shaping our behavior. In this regard, the Delete Act’s one-stop mechanism empowers consumers to take control of their data and free themselves from online manipulation (not to mention government’s warrantless snooping, a flagrant Fourth Amendment violation).
We applaud the California Legislature, sponsor Sen. Josh Becker, and Gov. Newsom for taking a bold step in the direction of consumer privacy. We have to note, however, that California is often criticized for its sweeping, at times inartful approach to business regulation. The Delete Act faces similar concerns. Some critics call it a “sledgehammer approach” with unpredictable ramifications for businesses and consumers.

According to one poll, more than 80 percent of California’s residents support the Delete Act. If all those millions opt out, it’s a game-changer for the way online business is conducted in the epicenter of tech culture and innovation. Small businesses may find it particularly difficult to acquire new customers, while non-profits could have a tougher time finding donors. The new law authorizes the CPPA to fine-tune the rules to make it practical.

In the meantime, California deserves applause for enhancing digital privacy. It’s a watershed moment, and rest assured other states – and nations – will be watching closely as this new paradigm takes shape in the coming years.

Congress passed a mandate in 2021 that will require all new cars sold later in this decade to have a built-in drunk driver detection system. This law, well-intentioned as it may be, is fraught with enormous risks to the privacy of any American who drives a car.
The vague goal this mandate sets out is: If your car thinks you’re overserved, your car won’t start. Or perhaps it will pull over and call the police. It is not clear, exactly, how this technology will work. In any event, this law promises to make every car a patrol car, with you inside it.

Rep. Thomas Massie (R-KY), a long-time defender of civil liberties, is not having it. He is proposing an amendment to the Transportation, Housing and Urban Development (yes, the Washington acronym here is THUD) appropriations bill to safeguard Americans’ constitutional right to privacy by forbidding federal expenditures to implement this ill-conceived mandate. PPSA is proud to support this amendment, and we stand together with other supporters, including FreedomWorks and the Due Process Institute.

While aggressive action to curb impaired driving is appropriate, the privacy issues raised by Rep. Massie about the mandate for this “advanced drunk driving and impaired driving prevention technology” are impossible to ignore. They are ultimately of great consequence to the future of our country.

First, consider that this technology will monitor the driving performance of millions of Americans who don’t drink and drive, potentially keeping many of them from operating their vehicles. While many states allow for court-mandated ignition interlock devices for people convicted of DUIs (requiring people under such an order to clear a self-administered breathalyzer test before their cars will start), these state restrictions are far more reasonably tailored than the broader and more intrusive federal mandate. Crucially, they make the necessary distinction between the irresponsible few who are under a court order and the responsible many who are not. Additionally, the state regulations do not passively monitor drivers’ performance.

What do the responsible many have to lose under the federal mandate?
The driver detection mandate could violate your privacy and constitutional rights on a massive scale. Consider: Absent a breathalyzer, this technology might well – like some commercial delivery operators already do – use a camera and AI to passively monitor your body movements for signs of impairment. Moreover, would your video data be stored? And if it is stored, would camera data follow you and any passengers in the car – perhaps with a sound recording of anything that you might say to each other? (After all, AI could analyze voice data to listen for the possible slurring of your words.) And if this video and/or voice data is stored, would these videos then become part of the enormous stream of data that federal agencies – from the IRS, to the FBI, to the DHS – now routinely purchase and access without a warrant?

(This brings to mind an old joke: An FBI agent walks into a bar. The bartender says, “I’ve got a joke for you.” The agent replies, “Heard it!”)

Video analytics technology, like facial recognition software, is hardly foolproof. Would this yet-to-be-developed device read people with disabilities as being intoxicated? Would perfectly sober people register false positives and be unable to drive?

Rep. Massie’s amendment would provide a much-needed sobriety check on the government’s foolhardy leap into mandating this technology. PPSA strongly urges Congress to pass the Massie amendment and protect the privacy and constitutional rights of millions of Americans.

An FBI raid on the home of a Tampa-based journalist, and the seizure of his computer, hard drives, cellphone, and all they contain, is raising questions about the fidelity of the Department of Justice to a year-old revision of its News Media Policy announced by Attorney General Merrick Garland. Under that policy, the Department is forbidden from using compulsory legal processes to obtain the newsgathering records of journalists, except in extreme circumstances.
Now a wide spectrum of press freedom and civil liberties organizations, including PPSA, are asking the Department of Justice to provide transparency about this FBI raid in May. The FBI executed its search warrant at the home of journalist Tim Burke, which he shares with his wife, Tampa city councilwoman Lynn Hurtak. The credibility of this extreme action is highly questionable, leaving the Department to explain how this ransacking of a journalist’s home and seizure of his devices differs from the now-widely ridiculed police raid on a newspaper in rural Kansas.

Your phone, like your dog, knows all about you. But your dog will never tell. Your smartphone does, all day long, producing data that the federal government can buy and access without a warrant.
The same, increasingly, is true of your car. It knows where you go, and for how long. For example, Tesla has internal cameras, and according to Elon Musk biographer Walter Isaacson, the CEO wanted them to record drivers to defend the company against lawsuits in the event of an accident.

As your car integrates with your smartphone, the automobile becomes just another digital device that tracks your every move. A contemporary car can accumulate 4,000 gigabytes of data every day. Our cars’ entertainment and communications systems track our address books, call logs, and what we listen to. Systems made to monitor performance can report our weight, as well as where we’ve driven, and whether we’ve driven there alone or with someone else. But at least your dog in the backseat still won’t rat you out. This is just one more way digital technology is narrowing the bounds of privacy to, essentially, flotation tanks.

The good news is that lawmakers in the Bay State are reacting to defend the privacy of their constituents. Two bills, one introduced in the Massachusetts House and one in the Senate, would limit collected data, set rules for the security of that data, and require it to be purged after it becomes irrelevant. Moreover, data collection would require the consent of the owner. Jalopnik.com reports that privacy advocates, however, are finding loopholes in the law “wide enough to drive a Nissan through.”

Whatever the strength of these bills, Protect The 1st commends Massachusetts lawmakers for thinking around the technological curve while that very technology hurtles us ever faster, ever forward. As with AI, a sense of urgency for predictive rulemaking is in order. There was a time when talking cars were a staple of science fiction. Now our cars tell us where to go and when to turn – and sometimes won’t shut up. What our cars will do next we may not be able to quite imagine. Massachusetts has started a debate that needs to go national and in high gear.
An Example of American Techno-Masochism

PPSA works hard to counter growing government surveillance. This generally means surveillance by U.S. federal agencies – such as FISA’s Section 702 authority, passed by Congress for foreign surveillance but used to spy on Americans. We also scrutinize expanding surveillance by state and local police, including cell-site simulators that trick your smartphone into giving up your location and other information, and ubiquitous facial recognition software that can follow you around.
But our concerns about government surveillance don’t end with our own government. We are increasingly concerned about the regular and sometimes pervasive surveillance of Americans by the People’s Republic of China – most recently, the potential for Beijing to use TikTok to track 80 million Americans.

Now, thanks to an investigative piece in The Free Press, we’ve learned that China is also looking to surveil Americans through an increasingly common technology in American cars – LIDAR, or Light Detection and Ranging. This is the system that allows self-driving and semiautonomous cars to track the traffic around them. LIDAR is also, The Free Press reports, “a mapping technology, an aid to the growing number of smart cities, a tool for robotics, farming, meteorology, you name it.”

Who is the dominant manufacturer and seller of LIDAR technology in the United States? It is Hesai, a Chinese company that sells nearly one out of every two LIDAR systems globally. In sales, it far outsells all of its American competitors combined.

China is relying on an old playbook to dominate the U.S. and world markets in LIDAR. The Free Press reports that Hesai does this by offering a solid product – but one backed by Chinese subsidies that allow it to sell below cost. Why would they do that? An explanation comes from Sen. Ted Budd (R-NC), who fired off a letter earlier this summer to the Assistant Secretary of Defense for Industrial Base Policy:

“[I]t is my understanding that the Chinese LIDAR companies are working with the Chinese Government and the People’s Liberation Army (PLA) to improve this technology and leverage it for Chinese military applications. Simultaneously, these companies have been flooding the U.S.
market with low-cost, heavily subsidized Chinese LIDAR, potentially enabling the Chinese to collect a trove of valuable information …

“Moreover, the Chinese Government is using LIDAR sensors to conduct police surveillance in the Xinjiang Uyghur Autonomous Region, where evidence suggests China is engaged in ongoing genocide of the Uyghur people.”

Given that Chinese law enforces a “military-civil fusion” strategy on Chinese businesses, requiring every Chinese organization and citizen to “support, assist, and cooperate with the state intelligence work,” why on earth would we allow that same government to spy on every American in every near-future car? It is one thing to be forced into the position of the Uyghurs. It is quite something else for the United States to willingly submit to techno-masochism.

Montana Gov. Greg Gianforte recently signed into law SB 351, the Genetic Information Privacy Act. It’s the latest in Montana’s concerted effort to protect its citizens’ privacy interests in the face of evolving threats from emerging technologies.
Montana was in the vanguard of digital privacy protection in 2013 when it passed HB 603, requiring judicial authorization before law enforcement is permitted to access location data. This was a full five years before Carpenter v. United States, in which the U.S. Supreme Court recognized that warrantless access to such information violates the Fourth Amendment. Since that time, Montana has passed a series of further privacy protections into law.
(Hat tip to Jennifer Lynch of the Electronic Frontier Foundation for a good breakdown of this decade-long trajectory.)

Montana also passed a 2021 constitutional amendment with sweeping support that added electronic data and communications to the state’s search and seizure protections. (Montana’s recent ban on TikTok resulted in some privacy benefits but significantly more consternation from some in the media.)

This year’s Genetic Information Privacy Act is one of the most robust genetic privacy laws of its kind, reinforcing the 2021 law and creating consent requirements for genetic data processing and for all subsequent uses of that data. It requires a “high-level privacy policy overview” from companies for all new customers. And, absent consent, it strictly prohibits the transfer of genetic data to employers or health and life insurance companies. The statutory definition of genetic data, meanwhile, has been broadened to include “self-reported health information” and otherwise plug gaps in potential uses of consumer information.

To date, Washington has failed to pass much in the way of meaningful digital privacy legislation (or genetic privacy legislation outside of a non-discrimination bill in 2008 and piecemeal HIPAA protections). The United States has a number of older information privacy laws related to specific sectors such as health care and finance, and they’ve been used to prevent certain harms. In short, the federal government broadly allows the collection of personal data, then subsequently regulates certain industries that use that data. It’s a reactive – rather than proactive – way to address a growing threat to privacy.

As a result, individual states are stepping in to act on privacy in the digital age. It’s encouraging to see a bipartisan array of state legislatures do so: California, Connecticut, Colorado, Indiana, Iowa, Montana, Tennessee, Virginia, and Utah have already passed or enacted comprehensive data privacy laws.
As for genetic privacy, other states should consider following Montana’s lead. As digitization of medical records becomes standardized and genetic sequencing becomes easier, authorities and private actors have ever more avenues for accessing your information. Whether it’s warrantless law enforcement searches of DNA databases, the use of medical information by private companies for employment and insurance purposes, or even the private patenting of human genes – the threats are manifold, and scary. Our (cowboy) hats are off to Montana for its forward-thinking efforts to safeguard our rights in the face of rapidly advancing technology.

While many of us were grilling hot dogs and hamburgers, the line between sci-fi dystopia and reality got a little blurrier. The New York City Police Department announced it was using aerial drones to “check in” on parties held across the city over the Labor Day weekend.
The NYPD is making the move, it says, in response to complaints about large and noisy parties during the holiday weekend. At a press conference, Assistant NYPD Commissioner Kaz Daughtry said: “If a caller states there’s a large crowd, a large party in a backyard, we’re going to be utilizing our assets to go up and go check on the party.”

The practice of aerial surveillance is escalating. New York police used drones just four times in 2022 but have so far used them 124 times in 2023. Mayor Eric Adams has said he wants to see police further embrace the “endless” potential of drones.

The decision is almost certainly illegal. Daniel Schwarz, a privacy and technology strategist at the New York Civil Liberties Union, says mass drone surveillance may violate the city’s Public Oversight of Surveillance Technology (POST) Act, an ordinance passed in 2020 that requires the NYPD to disclose its surveillance tactics.

The proliferation of drones over our backyards, however, may not be unconstitutional. U.S. Supreme Court precedent on the Fourth Amendment has dealt with aerial surveillance before. In the 1989 case Florida v. Riley, the Court held that Florida did not violate a man’s right against unreasonable searches when police, acting on a tip, flew a helicopter over his property and observed a greenhouse in which the man was growing marijuana. The greenhouse was not visible from the ground and could only be detected aerially.

But nearly 35 years have passed since Florida v. Riley, and in that time police departments across the country have been able to amass and deploy entire fleets of small, flexible aerial drones. Whereas police might once have been constrained by the cost of owning and operating a helicopter, today’s police departments can operate a sizable drone fleet at a fraction of the price, enabling a near-permanent aerial surveillance force.
Further compounding the problem is the high degree of reciprocity between local law enforcement and the national security state. A Department of Justice response to a PPSA Freedom of Information Act request shows that local governments have received fleets of drones and other surveillance technology from the federal government. As Washington floods local police forces with hovering spies, it is time for cities and states to update our laws and jurisprudence on aerial surveillance.

State legislatures are passing age-verification laws that require users to upload driver’s licenses or passports to view pornographic material. This is well-meaning – and arguably necessary – legislation to protect children from viewing hardcore pornography online. Such a solution, however, has a drawback that needs to be addressed in legislative language. It leaves the door open for potentially catastrophic data privacy breaches – not to mention granting the FBI and other government agencies immense power, in the words of a declassified government report, to “facilitate blackmail, stalking, harassment, and public shaming.”
In 2022, Louisiana passed HB 142, holding porn sites liable for failing to “perform reasonable age verification methods.” The bill sailed through the legislature with bipartisan support. Since then, six states have passed similar laws. Sixteen others have introduced them.

Pornhub responded with suits against Louisiana and Utah, and has ceased doing business altogether in Arkansas, Mississippi, Utah, and Virginia. Today, if you visit Pornhub from an IP address in one of those states, the only thing you’ll see is a video message from porn star Cherie DeVille explaining why you can’t see her with her clothes off.

DeVille’s message is a simplified version of arguments made by the Free Speech Coalition, a porn industry advocacy and trade group. One of the solutions offered by that group is to verify age by device. It would be child’s play, however, for hackers and governments to deanonymize IP addresses. Either age-verification solution – that of the legislators or that of the porn industry – creates a risk that hackers and the FBI can exploit adults’ private browsing histories.

It’s not as if government has no appetite for using personal information. Documents obtained through a Freedom of Information Act request show that the Defense Intelligence Agency uses commercially available data for “cover operations.” The FBI has a team dedicated to parsing cell tower data. A multitude of federal, state, and local law enforcement agencies – as well as intelligence agencies – regularly purchase vast troves of personal information from data brokers, and then warrantlessly search that data in flagrant violation of the Fourth Amendment. You’ll forgive us for not expecting government restraint when it is presented with an Aladdin’s Cave of mortifying search histories.

Imagine, for example, a bystander in a white-collar crime investigation who gets a visit from an FBI agent seeking his cooperation as a wire-wearing, confidential informant.
“By the way,” the agent says in passing, “this is neither here nor there, but I happened to notice that you frequent a website that makes creative uses of My Little Pony. Wouldn’t want that to get out, now would we?”

It is likely that more legislators in more states will act out of the belief that hardcore porn seen by children is a crisis that needs to be addressed. Lawmakers should keep in mind, however, the need to include privacy measures in such legislation. One place to start would be a blanket restriction on any sale of browsing data, or on warrantless access to it by government agencies. Or perhaps the sites could delete the data once approval is granted. We’re not sure what the best solution would look like, but we’ll know it when we see it.

PPSA previously commented on a New York Times scoop in April that revealed a contractor for the U.S. government had purchased and used a spy tool from NSO, the Israeli firm that developed and released Pegasus software into the wild – software that can turn smartphones into pervasive surveillance tools.
The White House was surprised to learn that its own government had done business with NSO just days after the administration put that firm on the no-business “Entity List.” NSO was placed on this blacklist because its products, the U.S. Commerce Department declared, “developed and supplied spyware to foreign governments that used these tools to maliciously target government officials, journalists, businesspeople, activists, academics, and embassy workers.”

Understandably upset, the White House tasked the FBI to sleuth out who in the government might have violated the blacklist and used the software. Mark Mazzetti, Ronen Bergman, and Adam Goldman of The Times report that months later the FBI has come back with a definitive identification of this administration’s scofflaw. The FBI followed the breadcrumbs and discovered, you guessed it, that it was the FBI.

Fortunately, the FBI did not purchase the “zero-day” spyware Pegasus, but another spy tool called Landmark, which pings the cellphones of suspects to track their movements. The FBI says it used the tool to hunt fugitives in Mexico. It also claims that the middleman, Riva Networks of New Jersey, had misled the FBI about the origins of Landmark. Director Christopher Wray discontinued this contract when it came to light. Meanwhile, The Times reports that two sources revealed that, contrary to the FBI’s assertions, cellphone numbers were targeted in Mexico in 2021, 2022, and into 2023 – far longer than the FBI says Landmark was used.

We should not overlook the benefits of such FBI investigations. In fact, PPSA has a tip to offer. We suggest that the FBI track down the government bureau that has been routinely violating the U.S. Constitution by conducting backdoor searches of FISA Section 702 material, as well as warrantlessly surveilling Americans through purchased data. More to follow.
The unanimous passage of the Fourth Amendment Is Not for Sale Act by the House Judiciary Committee, as well as the looming expiration of Section 702 of the Foreign Intelligence Surveillance Act, is spurring the National Security Agency into a furious lobbying campaign aimed at the public and Congress to stop surveillance reform.
NSA lobbyists argue that the agency would be hobbled by the House measure, which would require agencies to obtain a probable cause warrant before purchasing Americans’ private data. Former intelligence community leaders are also making public statements, arguing that passage of Section 702 of the Foreign Intelligence Surveillance Act (FISA) with any meaningful changes or reforms would simply be too dangerous.

George Croner, former NSA lawyer, is one of the most active advocates of the government’s “nothing to see here, folks” position. In March, Croner portrayed proposals for a full warrant requirement as a new and radical idea. He quoted two writers who dismissed concern over warrantless, backdoor searches as the province of “panicky civil libertarians” and right-wing conspiracy theorists. In a piece this week, Croner co-authored a broadside against the ACLU’s analysis of the NSA’s and FBI’s mass surveillance.

For example, Croner asserts that civil liberties critics are severely undercounting the great progress the FBI has made in reducing U.S. person queries, a process in which agents use the names, addresses, or telephone numbers of Americans to extract their private communications. Croner celebrates a 96 percent reduction in such queries in 2022 as a result of process improvements within the FBI. But, to paraphrase the late, great Henny Youngman, 96 percent of what? Ninety-six percent of a trillion data points? A quadrillion? The government’s numbers are murky and ever-changing, but the remaining amount appears, at the very least if you take these numbers at face value, to constitute well over 200,000 warrantless searches of Americans.

Elizabeth Goitein of the Brennan Center for Justice, who has placed her third installment in a series on Section 702 in the online outlet Just Security – a masterclass on that program and why it must be reformed – has her own responses to Croner.
While Croner portrays a warrant requirement for reviewing Americans’ data as a dangerous proposal, Goitein sees such a requirement as a way to curb “backdoor searches” and return to the guarantees of the Fourth Amendment. Goitein writes: “For nearly a decade, advocates, experts, and lawmakers have coalesced around a backdoor search solution that would require a warrant for all U.S. person queries conducted by any U.S. agency. Indeed, some broadly supported proposals have gone even further and restricted the type of information the government could obtain even with a warrant.” She describes a Review Group on Intelligence and Communications Technologies that included many, like former CIA acting director Michael J. Morell, who are anything but panicky civil libertarians. This group nevertheless found it responsible to recommend warrants “based on probable cause” before surveilling a United States person. Other supporters of probable cause warrants range from Reps. Thomas Massie (R-KY) and Zoe Lofgren (D-CA) to Sens. Dianne Feinstein (D-CA), Mike Lee (R-UT), and former Sen. Kamala Harris (D-CA). They all saw what Goitein describes: “Without such a measure, Section 702 will continue to serve as an end-run around the protections of the Fourth Amendment and FISA, and the worst abuses of the power to conduct U.S. queries will continue.” We eagerly await the ACLU’s response to Croner’s critique. Such debates, online and perhaps in person, are the only way to winnow out who is being candid and who is being too clever by half. It is a healthy development for the intelligence and civil libertarian communities to debate their clashing views before the American people and the Congress rather than leave the whole discussion to secret briefings on Capitol Hill.

Does the Fifth Amendment privilege against self-incrimination prevent the government from forcing a defendant to unlock their cellphone? That’s the question at issue in People v.
Sneed, a recent case brought before the Illinois Supreme Court, which found in favor of the state.
This ruling is a blow to Fifth Amendment protections in the digital age and an interpretation that cannot be sustained if we are to properly extend constitutional protections to ever-evolving technology. In an amicus brief before the court, the American Civil Liberties Union aptly laid out the arguments against compelling passwords from the accused. Fifth Amendment protections against self-incrimination, it points out, derive from the founders’ fears of an American “Star Chamber,” the English judicial body that became synonymous with oppressive interrogation tactics and a lack of due process. Drawing on this foundation, the American legal system has largely supported the notion that “the State cannot compel a suspect to assist in his own prosecution through recall and use of information that exists only in his mind.” To do so would impose a “cruel trilemma” on a defendant who would face an impossible choice: perjury, self-incrimination, or contempt of court. As the ACLU points out, numerous high courts (including Indiana’s and Pennsylvania’s) have found that password disclosure constitutes testimony because it draws from “the contents of one’s mind.” Yet courts in New Jersey and Massachusetts have sided with Illinois, presenting a significant conflict of law in the ongoing effort to adapt constitutional precepts to our changing society. In finding for the state and forcing the defendant, Sneed, to unlock his cellphone, the Illinois Supreme Court drew on a somewhat obscure legal exception to the Fifth Amendment right against self-incriminating testimony known as the “foregone conclusion” doctrine. That exception, which the Supreme Court of the United States has applied only once before, holds that producing a password is not testimonial when the government can show, with reasonable particularity, that it already has knowledge of the evidence it seeks, that the evidence was under the control of the defendant, and that the evidence is authentic.
The idea is that the act of producing a password has little testimonial value in and of itself. The court misapplied that doctrine here, placing the focus on the password rather than the contents of Sneed’s cellphone. The court drew on precedents holding that probable cause justifies the intrusion: “Any information that may be found on the phone after it is unlocked is irrelevant, and we conclude that the proper focus is on the passcode.” But probable cause does not constitute evidentiary certainty. And, in applying its analysis to passcodes rather than the contents of a safe or lockbox or cellphone, the court ignores that the Supreme Court of the United States’ use of this exception in Fisher v. United States (1976) depended on a specific, narrow set of facts. There, the analysis focused on the production of business documents already proven to exist – not on a passcode. Allowing the “foregone conclusion” exception to apply to testimonial production of cellphone passwords opens the door to forcible government snooping across the vast scope of our digital lives. Gaining access to someone’s cellphone can reveal anything and everything about that person – including the most intimate details of a life. As the ACLU put it: “Locked phones and laptops may impose obstacles to law enforcement in particular cases. So do window shades. It is sometimes true that constitutional protections interfere with law enforcement investigations.” Until the Supreme Court of the United States resolves this issue, our Fifth Amendment rights in the digital age remain in doubt.

PPSA’s Gene Schaerr Appeals to Congress to Assert Its Authority to Protect Americans’ Privacy and the Fourth Amendment
End the “Game of Surveillance Whack-a-Mole”

Gene Schaerr, PPSA general counsel, in testimony before a House subcommittee on Friday, urged Congress to assert its prerogative to interpret Americans’ privacy and Fourth Amendment rights against the federal government’s lawless surveillance.
Schaerr said the reauthorization of a major surveillance law this year is a priceless opportunity for Congress to enact many long-needed surveillance reforms. There is, Schaerr told the Members of the House Judiciary Subcommittee on Crime and Federal Government Surveillance, no reason for Congress to defer to the judiciary on such a vital national concern. Congress also needs to assert its authority with executive branch agencies, he said. For decades, whenever Congress has reformed a surveillance law, federal agencies have simply moved on to other legal authorities or theories to develop new ways to violate Americans’ privacy in “a game of surveillance whack-a-mole.” Schaerr said: “As the People’s agents, you can stop this game of surveillance whack-a-mole. You can do that by asserting your constitutional authority against an executive branch that, under both parties, is too often overbearing – and against a judicial branch that too often gives the executive an undeserved benefit of the doubt. Please don’t let this once-in-a-generation opportunity slip away.” Schaerr was joined by other civil liberties experts who described the breadth of surveillance abuse by the federal government. Liza Goitein of the Brennan Center for Justice at NYU Law School said that FISA’s Section 702 – crafted by Congress to enable foreign surveillance – has instead become a “rich source of warrantless access to Americans’ communications.” She described a strange loophole in the law that allows our most sensitive and personal information to be sold to the government. The law prevents social media companies from selling Americans’ personal data to the government, but it does not preclude those same companies from selling Americans’ data to third-party data brokers – who in turn sell this personal information to the government. Federal agencies assert that no warrant is required when they freely delve into such purchased digital communications, location histories, and browsing records.
Goitein called this nothing less than the “laundering” of Americans’ personal information by federal agencies looking to get around the law. “We’re a nation of chumps,” said famed legal scholar and commentator Jonathan Turley of the George Washington University Law School, for accepting “massive violations” of our privacy rights. He dismissed the FBI’s recent boasts that it had reduced the number of improper queries into Americans’ private information, likening that boast to “a bank robber saying we’re hitting smaller banks.” Many members on both sides of the aisle echoed the concerns raised by Schaerr and other witnesses during the testimony. Commentary from the committee indicates that Congress is receptive to privacy-oriented reforms. Gene Schaerr cautioned that Congress should pursue a strategy of inserting strong reforms and guardrails into Section 702, rather than simply allowing this authority to lapse when it expires in December. Drawing on his experience as a White House counsel, Schaerr said the “executive branch loves a vacuum.” Without the statutory limits and reporting requirements of Section 702, the FBI and other government agencies would turn to other programs – such as purchased data and Executive Order 12333 – that operate in the shadows. Despite this parade of horribles, the hearing had a cheerful moment when it was interrupted by the announcement of a major reform coalition victory. The Davidson-Jacobs Amendment passed the House by a voice vote during a recess in the hearing, an announcement that drew cheers from witnesses and House Members alike. This measure would require agencies within the Department of Defense to get a probable cause warrant, court order, or subpoena to purchase personal information that in other circumstances would require such a warrant. Schaerr was optimistic that further reforms will come.
He said: “Revulsion at unwarranted government surveillance runs deep in our DNA as a nation; indeed, it was one of the main factors that led to our revolt against British rule and, later, to our Bill of Rights. And today, based on a host of discussions with many civil liberties and other advocacy groups, I’m confident you will find wide support across the ideological spectrum for a broad surveillance reform bill that goes well beyond Section 702.” Last month, we wrote about a surprisingly frank report from the Office of the Director of National Intelligence admitting the government’s increasing role in utilizing Commercially Available Information about United States citizens for investigative purposes. Despite the Supreme Court’s ruling in Carpenter v. United States, which held that a warrant is required before the government can seize location history from cell-site records, the report candidly reveals that the bulk collection of Americans’ private data continues unabated. Now, the Commonwealth of Massachusetts is taking steps to ban the purchase and sale of location data altogether. It’s a blunt solution to a complex issue, and a bellwether for where this debate might be headed.
“Location data” refers to information about the geographic locations of mobile devices like smartphones or tablets. When collected, this data can be used for relatively benign purposes like marketing – but also to identify the movements of individuals and discern their identities (a 2013 study found that only four spatio-temporal data points are required to identify someone in most circumstances). A host of companies collect this information, package it, and sell it to private actors like advertisers – and, increasingly, law enforcement agencies. The government can learn a lot about you based on your movements – and they know it. For example, the FBI has its own team dedicated to analyzing cell tower data. A growing number of states are now taking action to protect the digital privacy of their residents. Laws passed in California and Virginia require the affirmative consent of consumers before geolocation data can be used for specified purposes. The European Union has gone further, prohibiting the use of sensitive data by default unless a company can demonstrate that its use falls under a specifically enumerated exemption. In the United States, Massachusetts’ Location Shield Act (H.357|S.148) is by far the most comprehensive effort yet to protect our data from unwarranted (or warrantless) snooping. The bill’s drafters couch it within a social policy framework; it’s described as “An Act protecting reproductive health access, LGBTQ lives, religious liberty, and freedom of movement by banning the sale of cell phone location information.” Such concerns are not unfounded. As the ACLU writes, “In the aftermath of the Supreme Court’s Dobbs decision…journalists found that data brokers have continued to buy, repackage and sell the location information of people visiting sensitive locations including abortion clinics. 
This puts people who seek or provide care in our state at risk of prosecution and harassment, creating a vulnerability in our state’s post-Roe protections.” Beyond addressing those concerns, however, the bill does a lot to broadly reinforce our Fourth Amendment rights against unreasonable searches and seizures, implementing a warrant requirement for any law enforcement access to location data. Such restrictions would clear away some of the murk surrounding this issue in the wake of the Carpenter case, which required a warrant when accessing location data from phone companies, but which holds limited relevance when such data are readily available for commercial purchase. (Obviously, the same legal reasoning should apply.) Americans are waking up to the dangers of the $16 billion data brokerage industry. In Massachusetts, 92% of survey respondents said the government should enshrine stronger protections for consumer data – all the way back in 2017. Whether this bill makes it over the finish line or not, it’s a clear sign that Americans want comprehensive data privacy reform. And Massachusetts’ solution is one we’ll readily share.

In today’s House Judiciary Committee hearing with FBI Director Christopher Wray, Rep. Pramila Jayapal (D-WA) expertly revealed the extent to which the FBI is unwilling to publicly discuss its use of commercially available information (go to the 1:10:50 mark).
Rep. Jayapal asked the director about his claim before the Senate Intelligence Committee in March that the FBI had previously purchased Americans’ location data from internet advertisers but had stopped the practice. Why, then, Jayapal asked, did a report from the Office of the Director of National Intelligence (ODNI) reveal that the government continues to purchase Americans’ personal data scraped from apps and sold to the government by third-party data brokers? The report was surprising for its frankness. An ODNI panel admitted that such data can be used to “facilitate blackmail, stalking, harassment, and public shaming.” Rep. Jayapal asked how the FBI uses such data. Director Wray responded that this is too complex to cover in a short exchange. He said there are so many precise definitions that he had best send “subject matter experts” from the FBI to give Rep. Jayapal a briefing, presumably behind closed doors and under classified rules that would prevent public discussion. Rep. Jayapal then went on to note that more than historic location data is at stake. Purchased data, she said, include biometric data, medical and mental health records, personal communications, and internet search histories and activities. She asked Director Wray: Does the FBI have a written policy on how it uses such commercially available information? Director Wray did not seem sure. He replied that he would be happy to provide a private briefing. Rep. Jayapal next asked if there is an FBI policy for using purchased information against Americans in criminal cases. Once again, Director Wray punted. After Rep. Jayapal was finished, House Judiciary Chair Jim Jordan (R-OH) said that her remarks were “well said,” and promised a bipartisan approach on the issue. Speaking for Republicans, Chairman Jordan told Rep.
Jayapal, chair of the Congressional Progressive Caucus, “you have friends over here who want to help you with that.” We suggest that a bipartisan next step could be an open hearing with the FBI’s experts on how much purchased information is obtained and how it is used.

Technology presents new challenges in the protection of Fourth Amendment rights, especially regarding expectations of privacy and warrantless searches. A key question, as the U.S. Supreme Court put it in 2001, is how to preserve “that degree of privacy against government that existed when the Fourth Amendment was adopted.”
A recent case out of Maryland goes a long way in enshrining critical protections for personal data in that state, striking a bold contrast with other recent decisions that degrade privacy. The Fourth Amendment to the United States Constitution guarantees the “right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures…” [emphasis added] Since the advent of the digital age, courts around the country have analogized personal data to the founding-era concept of personal papers, which the Supreme Court has long held to be safeguarded against unwanted intrusion. Yet, the Court has gone further in recent years, finding that digital information implicates “privacy concerns far beyond those implicated by the search of a cigarette pack, a wallet, or a purse.” After all, any thorough search of a citizen’s digital data is bound to turn up troves of personal information – from banking information to private correspondence – unrelated to the particularities of a warrant. It makes sense that our digital footprints would merit enhanced protection. Despite this seeming clarity, much ambiguity persists surrounding the protection of digital data. State of Maryland v. Daniel Ashley McDonnell presents a novel question of law and fact unresolved at the national level: Does a reasonable expectation of privacy exist in the contents of a copied computer hard drive after consent to search that hard drive has been revoked? The Supreme Court of Maryland found that it does. Here’s the background: Daniel Ashley McDonnell granted investigating officers consent to search his computer. Police subsequently made a forensic copy of McDonnell’s hard drive and proceeded to analyze its data even after McDonnell withdrew his consent. For its part, the state argued that McDonnell lost any reasonable expectation of privacy once he allowed his data to be copied.
In constitutional law, warrantless searches of person or property are considered unreasonable unless certain exceptions apply – if a person lacks any reasonable expectation of privacy, for example, or if consent is granted to perform a search. Generally, when that consent is revoked, authorities may not conduct a search by relying on the prior consent. In this case, the Supreme Court of Maryland found that McDonnell had a privacy interest in his data itself – not in the hard drive copy made by investigating authorities. Had the police examined the hard drive data while consent was in effect, McDonnell would have lost any reasonable expectation of privacy in that data. But given that consent was withdrawn prior to the search, he maintained that expectation absent an independent search justification. The court wrote: “To accept the State’s stance – i.e., that Mr. McDonnell irrevocably lost all privacy interest in the data on his hard drive when he allowed [police] to copy it – would be to permit a limitless search through vast quantities and a varied array of personal data that the Supreme Court of the United States has characterized as consisting of more information than would be found in an exhaustive search of a person’s home.” In a similar case, the U.S. District Court for the Middle District of Florida came to a different conclusion, finding that “revocation of consent does not require the suppression of evidence already lawfully obtained.” However, the preponderance of case law and legal scholarship suggests the Supreme Court of Maryland struck the right balance. Its opinion is consistent with recent scholarship by law professor Orin Kerr, who argues that “the same Fourth Amendment rules that apply to searching a suspect’s computer should also apply to searching the government’s copy.” It is further consistent with the U.S. Supreme Court’s warnings in Riley v.
California, which noted the potential of a cell phone search to reveal “[t]he sum of an individual’s private life.” The Supreme Court of Maryland’s decision is a win for data privacy. To quote the amicus brief from our friends at Restore the Fourth, absent such protections the government could “copy and indefinitely detain every private paper on a person’s hard drive (i.e., millions of documents) at minimal cost—except to the Fourth Amendment.” The Maryland decision was well reasoned and well done.

The digital trail you leave behind can be used to create a profile of you – your race, religion, gender, sexual orientation, financial issues, personal medical history, mental health, and physical location.
PPSA has long warned against the routine sale of our personal and sensitive information scraped from apps and sold to U.S. federal agencies by data brokers. The general counsels of these law enforcement and intelligence agencies claim that they are not violating the Fourth Amendment prohibition against warrantless search and seizure because they are not seizing our data at all. They’re just buying it. That is galling enough, but what about hostile governments accessing your most personal information? They have no guardrails and would surely have no scruples in using your information against you and, for those in the military or other sensitive positions, the United States. Under Chinese law, China’s technology companies are obligated to share their data with Chinese intelligence. Imagine all the data Chinese military, intelligence, and commercial actors have on the 80 million American users of TikTok. Then multiply that by all the data China acquires through legal, commercial means. “Massive pools of Americans’ sensitive information – everything from where we go, to what we buy and what kind of health care services we receive – are for sale to buyers in China, Russia and nearly anyone with a credit card,” said Sen. Ron Wyden (D-OR), sponsor of the Protecting Americans’ Data from Foreign Surveillance Act of 2023. “The privacy and security of our data is essential to the freedoms we hold dear,” said co-sponsor Sen. Cynthia Lummis (R-WY). “If foreign adversaries can access our data, they can control it.” Their bill is also supported in the Senate by Sens. Sheldon Whitehouse (D-RI), Bill Hagerty (R-TN), Martin Heinrich (D-NM), and Marco Rubio (R-FL). It is supported in the House by Rep. Warren Davidson (R-OH) and Rep. Anna Eshoo (D-CA). This bill would apply tough criminal and civil penalties to prevent employees of foreign corporations like TikTok from accessing U.S. data from abroad. “Freedom surrendered is rarely reclaimed,” said Rep. Davidson.
PPSA agrees and supports this bill. “The need to address foreign exploitation of Americans’ data is urgent,” said Bob Goodlatte, former House Judiciary Committee Chairman and Senior Policy Advisor to PPSA. “This legislation should also prompt us to get our own house in order. Members should address exploitation of our personal information by our government. I hope every member who signs on to this bill supports requiring the U.S. government to obtain a warrant when it wishes to inspect our commercially acquired information, as well as data from Section 702 of the Foreign Intelligence Surveillance Act.”

“What Would You Like the Power to Do?”

When you use an ATM while out of town, you probably don’t expect your bank to report your transactions and location to the FBI. But that is exactly what Bank of America did to an unknown but undoubtedly large number of customers who used their credit or debit cards in Washington, D.C., from Jan. 5 to Jan. 7, 2021.
PPSA heartily agrees with the prosecution of those who planned and executed the ransacking of the U.S. Capitol on Jan. 6 and beat Capitol Police officers senseless. But it does us no good to uphold the inviolability of the Capitol and the constitutional process for electing a president if we jettison the Constitution by illicitly surveilling large numbers of innocent Americans as potential suspects. House Judiciary Chairman Jim Jordan (R-OH) and subcommittee chairman Thomas Massie (R-KY) brought this incident to light when they announced an investigation of Bank of America, which compiled mass information on bank users only to “voluntarily and without any legal process” gift it to the FBI. Bank of America’s slogan, “What would you like the power to do?” seems to be an open invitation to the FBI to snoop. “This information undoubtedly included private details about Bank of America customers who had nothing at all to do with the events of January 6,” FBI whistleblower George Hill testified before Congress. “Even worse, BoA provided information about Americans who exercised their Second Amendment right to purchase a firearm.” The FBI has had a duty to investigate the terrible events of Jan. 6. But it doesn’t have the right to obtain mass, bulk customer information from private entities. Looking beyond this issue, the greater danger is that the FBI, like a dozen other federal agencies, can simply purchase much of our consumer information from third-party data brokers who sell our private information scraped from apps. The easy coordination of Bank of America with the FBI also calls for greater transparency about the FBI’s backdoor access to customer data from other corporations, especially social media companies. The demand of the two congressmen for Bank of America’s internal communications on this collection and for the bank’s communications with the FBI ought to shed light on the nature of their collaboration.
What precipitated this curious gift of customer data to the FBI? The FBI has a duty to investigate. When it does, and when it wants access to Americans’ private communications, this duty necessarily entails obtaining warrants, as the Constitution requires.

Credit to the Department of Justice for a voluminous response to our Freedom of Information Act (FOIA) request. Our request concerned the use of stingrays, or cell-site simulators, by that department and its agencies. Out of more than 1,000 pages in DOJ’s response, we’ve found a few gems. Perhaps you can find your own.
Review our digest of this document here, and the source document here. The original FOIA request concerned DOJ policies on cell-site simulators, commonly known by the commercial brand name “stingrays.” These devices mimic cell towers to extract location and other highly personal information from your smartphone. The DOJ FOIA response shows that the FBI in 2021 invested $16.1 million in these cell-site simulators (p. 209) in part to ensure they “are capable of operating against evolving wireless communications.” The bureau also asked for $13 million for “communications intercept resources.” This includes support for the Sensitive Investigations Unit’s work in El Salvador (p. 111). On the policy side, we’ve reported that some federal agencies, such as the Bureau of Alcohol, Tobacco, Firearms and Explosives, maintain that stingrays are not GPS location identifiers for people with cellphones. This is technically true. Stingrays do not download location data or function as GPS locators. But this is too clever by half. Included in this release is an Obama-era statement by former Department of Justice official Sally Yates that undermines this federal claim by stating: “Law enforcement agents can use cell-site simulators to help locate cellular devices whose unique identifiers are known …” (p. 17) This release gives an idea of how versatile stingrays have become. The U.S. Marshals Service (p. 977) reveals that it operates cell-site simulators and passive wireless collection sensors to specifically locate devices inside multi-dwelling buildings. Other details sprinkled throughout this release concern other, more exotic forms of domestic surveillance. For example, the U.S. Marshals Service has access to seven aircraft located around the country armed with “a unique combination of USMS ELSUR suite, high resolution video surveillance capability … proven to be the most successful law enforcement package” (pp. 881-883).
A surveillance tool, “Dark HunTor,” exposes user data from Tor, the browser meant to make searches anonymous, as well as from dark web searches for information. (p. 105) In addition, the U.S. Marshals Service “has created the Open-Source Intelligence Unit (OSINT) to proactively review and research social media content. OSINT identifies threats and situations of concern that may be currently undetected through traditional investigative methods. Analyzing public discourse on social media, its spread (‘likes,’ comments, and shares), and the target audience, the USMS can effectively manage its resources appropriate to the identified threats.” (p. 931) The DOJ release also includes details on biometric devices, from facial recognition software to other biometric identifiers (p. 353), as well as more than $10 million for “DNA Capability Expansion” (p. 365). Is that all? Feel free to look for yourself.

As is often said in Washington, never let a good crisis go to waste. The national security state is visibly winding up to expand surveillance of the American people in the wake of the posting of sensitive U.S. government secrets in a Discord chat room by a 21-year-old airman.
Officialdom’s appetite for more domestic surveillance was already evident before the leak with the introduction of the vaguely drafted Restrict Act. This bill, which has significant bipartisan support, would give the Commerce Secretary sweeping powers to regulate all communications technology and much of the content that it carries. That bill would hit those deemed to have violated the unclear parameters of Restrict’s allowable behavior with $1 million fines and 20 years in prison. NBC News now reports that senior administration and congressional officials say the “Biden Administration is looking at expanding how it monitors social media sites and chatrooms.” The problem with monitoring chatrooms is that they are private discussions. Forgive our quaintness, but systematic intrusion into all of America’s chatrooms by government-operated AI would be a massive violation of the Fourth Amendment. This would be an intrusion on such a scale as to trouble even many surveillance hawks. Consider former National Security Agency general counsel Glenn Gerstell, who has taken to the airwaves to tout the reauthorization of the highly problematic Section 702 of the Foreign Intelligence Surveillance Act. This is the authority that has been misused by the FBI to conduct backdoor searches of Americans’ communications. Even he sees the potential for overreaching here. Gerstell told NBC News: “We do not have nor do we want a system where the United States government monitors private internet chats.” Why, then, is this being considered? The government was mortified to learn that the leak had occurred and had been reported by The New York Times and the open-source intelligence organization Bellingcat. It would be a serious error to respond to a crisis that resulted from a poorly designed system of security within the government and treat it as a reason to increase domestic surveillance of the American people.

The FBI explained to Charlie Savage of The New York Times why it used the name of Rep.
Darin LaHood (R-IL) as a search term. The FBI says it was conducting a “defensive” investigation ostensibly to protect the congressman. Along the way, the bureau made no effort to adhere to rules that would have excluded Rep. LaHood’s personal and irrelevant communications when delving into his data collected under Section 702 of the Foreign Intelligence Surveillance Act (FISA).
In December 2021, a government report first revealed that a Congressman’s name had been used in such a search without minimization procedures to protect his privacy. That the subject of this surveillance was Rep. LaHood was dramatically revealed in a March hearing, when the Illinois congressman said he believed his name had been used for the Section 702 query. Section 702 is an authority Congress authorized explicitly to surveil foreign actors in foreign settings who pose a threat to national security. The FBI is generous with itself in how it treats the collection of Americans’ communications that are “incidentally” swept up in 702 data collection. With so much of global communications running through North America – and so many Americans in communication with foreigners – the private messages of American citizens and people on U.S. soil have a degree of exposure far beyond anything Congress imagined when it amended FISA with Section 702 in 2008. This authority has since become a wide-open back door through which the FBI can surveil someone, then concoct a different predicate to follow up on the evidence it has seized. Years of experience with FBI misbehavior explains why Rep. LaHood, a former counterterrorism prosecutor, struck a newly confrontational tone in a recent hearing with FBI Director Christopher Wray. “I want to make clear the FBI's inappropriate querying of a duly elected member of Congress is egregious and a violation not only that degrades the trust in FISA but is viewed as a threat to the separation of powers," LaHood said to Director Wray. Now FBI backgrounders are telling The New York Times that the reason for the query was that the bureau believed Rep. LaHood was a target of a Chinese intelligence operation.
The FBI surveillance occurred at a time when LaHood, whose district includes soybean farmers and Caterpillar, was caught between President Trump’s tariffs on Chinese goods and his constituents’ dependence on trade with China. Thus, intelligence community apologists are now offering “defensive investigation” as yet another reason why we cannot allow a warrant requirement to gum up the works. Matt Olsen, now an assistant attorney general, argued in Slate a few years ago that entering an American’s email address or phone number into the database “is not the initiation of a new surveillance or search protected by the Fourth Amendment and subject to the warrant requirement. It is the review of information that the agency has already obtained by lawfully targeting others and that now resides in its databases.” The assertion is that these aren’t general warrants, prohibited by the Constitution, if the government already possesses your data. The founders added the Fourth Amendment to the Constitution to prevent general warrants like those of the British Crown. According to Olsen’s theory, if the king’s agents had thought to lock up every Bostonian’s private papers in a warehouse, it would have amounted to one big, legal search. A hypothetical shows how far afield this is from the Fourth Amendment. Put aside that Rep. LaHood has a reputation for being an honest and decent fellow. Hypothetically, would the FBI have ignored incriminating evidence of a non-national-security crime if it had been found in a congressman’s private messages? Consider that the secret FISA court revealed that Section 702 has already been used in health care fraud, bribery, and other cases having nothing to do with national security. Now the FBI is peddling to The Times the notion that all is fair game if the purpose of the search is purely defensive. After all, they were merely trying to protect Rep. LaHood, right? But if that’s the case, why didn’t the FBI inform Rep.
LaHood he was a target of the Chinese? Why did he have to intuit this from reading classified material years after the fact? The reason is clear. The government always wants to retain the right to go after the subject of the search. That is why the intelligence community and its apologists want an exception for backdoor searches but have no interest in a consent requirement. We hope Rep. LaHood keeps this in mind when he works with his colleagues to craft the strong reforms that, he said, must be the price of Section 702 reauthorization in this Congress.

Targeted: Journalists, Political Opponents, NGOs Around the World

Now another Israeli company joins the NSO Group in its flagrant disregard for human rights, democracy, and digital privacy in the name of profit.
QuaDream has been identified by The Citizen Lab at the Munk School of Global Affairs and Public Policy as the developer of a new spyware, Reign. Like the more notorious Pegasus, Reign infiltrates phones without requiring the target to click on a malicious link or take any action at all. Citizen Lab found that Reign can:
And when the job is complete, Reign self-destructs, removing most of the evidence that it was ever at work in the victim’s phone. For years, iPhone users enjoyed superior security. Reign took a big bite out of Apple’s vaunted security features. It infected some victims’ phones through iCloud calendar invitations backdated to a time in the past – invitations that are processed automatically, making the malicious “resend” invisible to the user. Meanwhile, Google has issued software patches to address vulnerabilities in its Android operating system. Microsoft, which partnered with Citizen Lab, reported that the technology has been used to surveil journalists, political opposition figures, and an NGO in countries ranging from the Middle East to Central Europe and Latin America. We have seen time and again that commercially developed spyware finds its most lucrative market in sales to repressive governments and the world’s most dangerous criminal enterprises. While the Israeli government seems alert now to the threat posed by the commercial spyware sector, other actors around the world are surely poised to pick up the slack. The arms race pitting Apple, Google, and Samsung against spyware developers will continue apace. In the meantime, as former Vice President Nelson Rockefeller said: “If you don’t want it known, don’t say it over the phone.” Or anywhere within twenty feet of your smartphone.

The New York Times broke the story that a front company in New Jersey signed a secret contract with the U.S. government in November 2021 to help it gain access to the powerful surveillance tools of Israel’s NSO Group.
PPSA previously reported that the FBI had acquired NSO’s signature technology, Pegasus, which can infiltrate a smartphone, strip all its data, and transform it into a 24/7 surveillance device. Mark Mazzetti and Ronen Bergman of The Times now report that in recent years the FBI had not only tested defenses against Pegasus, but also sought “to test Pegasus for possible deployment in the bureau’s own operations inside the United States.” An FBI spokesperson told these journalists the FBI’s version of the software is now inactive. The secret contract also grants the U.S. government access to NSO’s powerful geolocation tool, Landmark. Mazzetti and Bergman report that such NSO technology has been used thousands of times against targets in Mexico – and that Mexico is named as a venue for the use of NSO technology. Two sources told the journalists that the “contract also allows for Landmark to be used against mobile numbers in the United States, although there is no evidence that has happened.” The story caught the Biden Administration flat-footed: the administration had declared this technology a national security threat and placed NSO on a Commerce Department blacklist. In light of these new revelations, Members of Congress should ask the Directors of National Intelligence, the CIA, FBI, and DEA:
This breaking story will likely force the Biden White House to promulgate new rules limiting the use of NSO technology by federal law enforcement and intelligence agencies. As it does, Congress should be involved every step of the way. This technology is frightening because NSO tools can be installed remotely on smartphones with the most up-to-date security software, and without the user succumbing to phishing or any other obvious form of attack. The need for a detailed policy limiting the use of these tools is urgent. NSO technology is to ordinary surveillance what nuclear weapons are to conventional weapons. Because nuclear weapons are hard to make, Washington, D.C. had time to plan and enact a global non-proliferation regime that slowed their spread. In the case of Pegasus and Landmark, however, the technology proliferated in the wild before Washington was even fully aware of its existence. Pegasus has been used by drug cartels to track down and murder journalists. It has been used by an African government to listen in on conversations between the daughter of a kidnapped man and the U.S. State Department. It was famously used to plan the murder of Jamal Khashoggi. Does anyone doubt that Russian and Chinese intelligence have secured their own copies? Now Washington is racing to catch up with foreign adversaries while trying to limit the use of this technology at home. NSO, through its amoral proliferation of dangerous technology, has made the world a riskier place. As federal agencies seek to get their hands on this technology, Congress should paint a bright red line – DO NOT USE DOMESTICALLY, EVER.

Never let a moral panic go to waste. A legitimate concern – the likely exposure of 150 million Americans’ data from TikTok to the People’s Republic of China – has somehow morphed into the Restrict Act, which never mentions TikTok.
Supported across the ideological spectrum by Sens. Mark Warner, Joe Manchin, Mitt Romney, and Shelley Moore Capito, the Restrict Act would transform a rather innocuous figure, the Secretary of Commerce, into an American Cardinal Richelieu. The bill would empower the Secretary of Commerce “to review and prohibit certain transactions between persons in the United States and foreign adversaries” regarding virtually all hardware and software communications technology, as well as data storage, machine learning, predictive analytics, and data science providers. The bill also covers software for desktops, mobile applications, gaming, payment, and web-based applications. The bill labels the People’s Republic of China, as well as Cuba, Iran, Russia, North Korea, and Venezuela, as foreign adversaries – which they certainly are. But it would grant the Secretary the power, in consultation with the Director of National Intelligence, to name any country’s technology as a national security threat. In addition, under the bill the Secretary would have the power to ban “entities” held by hostile foreign labor unions, equity investors, partnerships or of “any participation […] and of any character.” It covers just about every aspect of technology and e-commerce with a potentially foreign connection. The Secretary of Commerce, usually known for cutting ribbons and handing out awards, would acquire duties commonly associated with the CIA’s Clandestine Service. Commerce Secretary Gina Raimondo would be empowered to “identify, deter, disrupt, prevent, prohibit, investigate, or otherwise mitigate” not just the “information and communications technology products” listed above, but also anything that could be construed to involve “Federal elections” and “national security.” The bill also targets “the digital economy,” presumably meaning the Secretary of Commerce could, with the stroke of a pen and no further debate in Congress, deter, disrupt, or prevent Bitcoin and other cryptocurrencies.
Americans who violate these restrictions would face up to $1 million in fines and 20 years in prison. And violate what, exactly? This bill is as vague as it is repressive. As Elizabeth Nolan Brown of Reason notes, the Restrict Act could easily be interpreted as criminalizing virtual private networks, which enable Americans’ privacy. Thus, teenagers who use VPNs to watch burping contests on TikTok could face a $1 million fine and 20 years in prison. “We’ve seen many times the way federal laws are sold as attacks on big baddies like terrorists and drug kingpins yet wind up used to attack people engaged in much more minor activities,” Brown wrote. Or, as Cardinal Richelieu said, “If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him.” As extreme as it is, it would be a mistake to discount the Restrict Act. The White House is strongly backing it. On Capitol Hill, this legislation has momentum, surfing on the cresting wave of indignation after the poorly received testimony of TikTok’s CEO. And if it falls short, it reveals a deep state hunger for power that will surely find expression in smaller, more passable bills. Which raises the question: If we reject the Restrict Act, what should be done about TikTok? As civil libertarians, we find ourselves at a point of agony concerning the proposed banning of TikTok’s social media platform in the United States. We don’t like government having the power to pull down a vibrant ecosystem of speech, one with many minority viewpoints, and on which small businesses and influencers depend. Yet TikTok’s promise to quarantine Americans’ data from ByteDance, its Chinese owner, is risible. ByteDance must comply with a Chinese law that mandates sharing data with Chinese intelligence. As we’ve pointed out, TikTok has repeatedly violated its own standards – most recently by allegedly surveilling American journalists in a likely attempt to catch dissidents or whistleblowers. 
Nor would “Project Texas” – TikTok’s proposal to house American data, most likely with Oracle in Austin – be a foolproof way to quarantine Americans’ data from China. Vigilance about government surveillance must include all governments. As overbearing as Washington can be, nothing compares to the malevolence of the Chinese Communist Party toward the United States and the American people. For that reason, it makes sense to ban TikTok or force a sale under current sanctions law or a narrowly targeted bill. But the Restrict Act gives overreaction a bad name. It would be a rich irony if Congress protected Americans from the importation of Chinese surveillance by turning the Department of Commerce into the technology police, enforcing vague laws with sweeping investigatory power. The Restrict Act is a blueprint for tyranny.

Realtors will tell you that the price of a home is all about location, location, location. But in surveillance policy, location is not the only thing that matters.
In a recent Senate hearing, Sen. Ron Wyden (D-OR) asked FBI Director Christopher Wray about reports the Bureau was purchasing Americans’ location data. Wray replied that the FBI does not “purchase communications database information that includes location data derived from internet advertisement.” The FBI, Director Wray explained, did purchase such information at some unspecified time in the past, but that was part of a since-discontinued pilot program. A few days later, Department of Justice Inspector General Michael Horowitz testified on the Hill that, in the wake of the Supreme Court’s 2018 Carpenter opinion, warrantlessly purchasing location data should be considered off-limits. So far, so good. But what about the questions not asked? Our devices generate a lot more information about us than just our location and movements. Data reveal our networks of friends and associates, political beliefs, religious beliefs and worship, sexual lives and preferences, and other deeply sensitive information – the sort of “data” that snoops once had to pick the lock of a diary to learn. The first of these other questions we’d love to ask Director Wray: Is the Bureau still purchasing other sensitive data on Americans? This question comes to mind after Vice’s Motherboard tech blog revealed a contract showing that in 2017 the FBI paid more than $76,000 through a middleman to purchase “netflow” data from a data broker, Team Cymru, which obtained it from internet service providers. Netflow data can include records of which server communicated with which, giving the FBI the ability to track internet traffic through virtual private networks. It can include websites visited and cookies – digital details that can collectively form a portrait of a user. The purchase was made for the FBI’s Cyber Crime division. Some more questions for Director Wray:
It was recently revealed that the FBI made 204,090 U.S. person queries of NSA databases – equivalent to a warrantless search of every citizen of Richmond, Virginia. Director Wray should face these questions in his next hearing. A fuller explanation of what kinds of warrantless data the FBI extracts and uses is, after all, a minimal requirement for congressional oversight.