Americans value privacy in the marketplace when we vote with our dollars no less than when we go behind the curtain of a polling booth. Now imagine if every dollar in our possession came with an RFID chip, like those used for highway toll tags or employee identification, telling the government who had that dollar in hand, how that consumer spent it, and who acquired it next. That would be the practical consequence of a policy proposal now being promoted in Washington, D.C., to enact a Central Bank Digital Currency (CBDC). Some have recently asked Congress to attach such a currency to the Bank Secrecy Act, enabling surveillance of every transaction in America. Such a measure would end all financial privacy, whether it is a donation to a cause or money sent to a friend.

“If not designed to be open, permissionless, and private – resembling cash – a government-issued CBDC is nothing more than an Orwellian surveillance tool that would be used to erode the American way of life,” said Rep. Tom Emmer (R-MN). This would happen because a CBDC is a digital currency, issued on a digital ledger under government control. It would give the government the ability to surveil Americans’ transactions and, in the words of Rep. Emmer, “choke out politically unpopular activity.”

The good news is that President Trump is alert to the dangers posed by a CBDC. One of his first acts in his second term was to issue an executive order forbidding federal agencies from exploring a CBDC. But the bureaucracy in Washington, D.C., has a near-constant hunger for close surveillance of Americans’ daily business, and there is no telling what future administrations might do. Rep. Emmer reintroduced his Anti-Surveillance State Act to prevent the Fed from issuing a CBDC, either directly or indirectly through an intermediary. Rep. Emmer’s bill also would prevent the Federal Reserve Board from using any form of CBDC as a tool to implement monetary policy.
The bill ensures that the Treasury Department cannot direct the Federal Reserve to design, build, develop, or issue a CBDC. Prospects for this bill are good. Rep. Emmer’s bill passed the House in the previous Congress. It doesn’t hurt that Rep. Emmer is the House Majority Whip and that this bill neatly fits President Trump’s agenda. So there is plenty of reason to be hopeful that Americans will be permanently protected from a surveillance currency. But well-crafted legislation alone won’t prevent the federal bureaucracy from expanding financial surveillance, as it has done on many fronts. PPSA urges civil liberties groups and Hill champions of surveillance reform, of all political stripes and both parties, to unite behind this bill.

We’re not sure which is most disconcerting: that Meta has a division named Global Threat Disruption, that its idea of said global threats includes deepfake celebrity endorsements, or that this has become its excuse to reactivate the controversial facial recognition software it shelved just three years earlier (so much for the “Delete” key). Meta has relaunched DeepFace to defend against celebrity deepfakes in South Korea, Britain, and even the European Union. “Celeb-baiting,” as it’s known, is a scheme in which scammers populate their social media posts with images or AI-generated video of public figures. Convinced that they’re real – that Whoopi Goldberg really is endorsing a revolutionary weight loss system, for example – unwitting victims fork over their data and money with just a few clicks. All of which, according to Meta, “is bad for people that use our products.” Celeb-baiting is a legitimate problem, to be sure. We’re no fans of social media scammers. What’s more, we know full well that “buyer beware” is meaningless in a world where it is increasingly difficult to spot digital fakes. But in reviving its facial recognition software, Meta may be rolling out a cannon to kill a mosquito.
The potential for collateral damage inherent in this move is, in a word, staggering. Just ask the Uighurs in Xi’s China. Meta began tracking the faces of a billion users in 2015. And initially, it didn’t bother to tell people the technology was active, so users couldn’t opt out. Citing Meta’s sleight of hand, as well as its own strict privacy laws, the EU cried foul and banned DeepFace from being implemented. But that was years ago … and how times have changed. The privacy-minded Europeans are now letting Meta test DeepFace to help public figures guard against their likenesses being misused. But can regular users be far behind? Meta could rebuild its billion-face database in no time. For its part, the U.K. is courting artificial intelligence like never before, declaring that it will help unleash a “decade of national renewal.” Even for a country that never met a facial recognition system it didn’t love, this feels like a bridge too far. We have written about the dangers, both real and looming, of a world in which facial recognition technology has become ubiquitous. When DeepFace was shelved in 2021, it represented an almost unheard-of reversal, in effect putting the genie (Mark Z, not Jafar) back in the bottle. That incredibly lucky bit of history is unlikely to repeat itself. Genies never go back in their bottles a second time.

As Americans become aware of – and concerned about – how our most sensitive and private digital information is sold by data brokers, there are stirrings within the federal government to place at least some guardrails on the practice. In a unanimous, bipartisan vote last week by the commissioners of the Federal Trade Commission, that agency cracked down on two data brokers, Mobilewalla and Gravy Analytics/Venntel, for unlawfully tracking and selling sensitive data.
The FTC declared that this data “not only compromised consumers’ personal privacy, but exposed them to potential discrimination, physical violence, and other harms …” Such practices included matching consumers’ identities with location data from health clinics, religious organizations, labor union offices, LGBTQ+-related locations, political gatherings, and military installations. By participating in real-time bidding exchanges, these brokers combined data from those auctions with data from other sources to identify users at these locations by their mobile advertising IDs. Just days before, the Consumer Financial Protection Bureau proposed a rule that would prevent data brokers from collecting and selling sensitive personal information such as phone numbers and Social Security numbers, as well as personal financial information outside of relevant contexts, like a mortgage application. CFPB’s action also seeks to prevent the sale of the information of Americans in the military or involved in national security to “scammers, stalkers, and spies.” We applaud these bold bipartisan moves by the FTC and CFPB, but we must keep in mind that these are first steps. These actions will only marginally address the vast sea of personal information sold by data brokers to all sorts of organizations and governments, including our own. There is throughout our government a failure to fully appreciate just how intrusive the mass collection of personal data actually is. Consider the reaction of Republican FTC Commissioner Andrew Ferguson. While mostly voting with the majority, Ferguson dissented on the breadth of the majority’s take on sensitive categories. Ferguson sees no distinction between the exposure of one’s digital location history and what can be learned by a private detective following a target across public spaces, a practice that is perfectly legal. Ferguson reasoned that many people are an open book about their health conditions, religion, and sexual orientation.
“While some of these characteristics often entail private facts, others are not usually considered private information,” Ferguson wrote. “Attending a political protest, for example, is a public act.” We beg to differ. “A private detective could find this out” is too weak a standard to apply to the wealth of digital data on the private lives of millions of people. Data is different. As the Supreme Court explained in Riley v. California, “a cell phone search would typically expose to the government far more than the most exhaustive search of [even] a house: A phone not only contains in digital form many sensitive records previously found in the home; it also contains a broad array of private information never found in a home in any form – unless the phone is.” That was true when it was written in 2014, and it is even more true today. Nowadays, artificial intelligence can analyze data and reveal patterns that no gumshoe could put together. In the case of a political protest, a high school student might attend, say, a trans rights event but be far from ready to let his parents or peers know about it. Or an adherent of one religion may attend services of an entirely different religion with conversion in mind but be far from willing to tell relatives. Worse, when deeply personal information reaches the hands of prosecutors this way, it completely bypasses the letter and the intent of the Fourth Amendment, which requires the government to get a probable cause warrant before using our information against us. The government lacks appreciation of its own role in sweeping in the sensitive data of Americans. Venntel’s customers include the Department of Homeland Security, the Drug Enforcement Administration, the FBI, and the IRS. In all, about a dozen federal law enforcement and intelligence agencies purchase such data from many brokers and hold it for warrantless inspection. The FTC deserves credit for taking this step to tighten up the use of sensitive information.
But the next step must be passage of the Fourth Amendment Is Not For Sale Act, which would require the government to obtain probable cause warrants before obtaining and using our most personal information against us.

Investigative journalist Ronan Farrow delves into the Pandora’s box that is Israel’s NSO Group, a company (now on a U.S. Commerce Department blacklist) that unleashes technologies allowing regimes and cartels to transform any smartphone into a comprehensive spying device. One NSO brainchild is Pegasus, software that reports every email, text, and search performed on a smartphone while turning its camera and microphone into 24-hour surveillance devices. It’s enough to give Orwell’s Big Brother feelings of inadequacy. Farrow covers well-trodden stories he has long followed in The New Yorker, also reported by many U.S. and British journalists, and well explored in this blog. Farrow recounts the litany of crimes in which Pegasus and NSO are implicated. These include Saudi Arabia’s murder of Jamal Khashoggi, the murder of Mexican journalists by the cartels, and the surveillance of pro-independence politicians in Catalonia and their extended families by Spanish intelligence. In the latter case, Farrow turns to Toronto-based Citizen Lab to confirm that one Catalonian politician’s sister and parents were comprehensively surveilled. The parents were physicians, so Spanish intelligence also swept up the confidential information of their patients. While the reality portrayed by Surveilled is a familiar one to readers of this blog, it drives home the horror of NSO technology as only a documentary with high production values can. Still, this documentary could have been better. The show is marred by too many reaction shots of Farrow, who frequently mugs for the camera. It also left unasked follow-up questions of Rep. Jim Himes (D-CT), Ranking Member of the House Intelligence Committee. In his sit-down with Farrow, Himes made the case that U.S.
agencies need to have copies of Pegasus and similar technologies, if only to understand the capabilities of bad actors like Russia and North Korea. Fair point. But Rep. Himes seems oblivious to the dangers of such comprehensive spyware in domestic surveillance. Rep. Himes says he is not aware of Pegasus being used domestically. Yet it was deployed by Rwandan spies to surveil the phone of U.S. resident Carine Kanimba in her meetings with the U.S. State Department. Kanimba was looking for ways to liberate her father, settled in San Antonio, who was lured onto a plane while abroad and kidnapped by Rwandan authorities. Rep. Himes says he would want the FBI to have Pegasus at its fingertips in case one of his own daughters were kidnapped. Even civil libertarians agree there should be exceptions for such “exigent” and emergency circumstances, in which even a warrant requirement should not slow down investigators. The FBI can already track cellphones and the movements of their owners. If the FBI were to deploy Pegasus, however, it would gain redundant and immense power to video record Americans in their private moments, as well as to record audio of their conversations. Rep. Himes is unfazed. When Farrow asks how Pegasus should be used domestically, Rep. Himes replies that we should “do the hard work of assessing that law enforcement uses it consistent with our civil liberties.” He also spoke of “guardrails” that might be needed for such technology. Such a guardrail, however, already exists. It is called the Fourth Amendment of the Constitution, which mandates the use of probable cause warrants before the government can surveil the American people. But even with probable cause, Pegasus is too robust a spy tool to trust the FBI to use domestically. The whole NSO-Pegasus saga is just one part of a much bigger story in which privacy has been eroded.
Federal agencies, ranging from the FBI to the IRS and Homeland Security, purchase the most intimate and personal digital data of Americans from third-party data brokers, and review it without warrants. Congress is even poised to renege on a deal to narrow the definition of an “electronic communications service provider,” which would obligate any office complex, fitness facility, or house of worship that offers Wi-Fi connections to secretly turn over Americans’ communications without a warrant. The sad reality is that Surveilled only touches on one of many crises in the destruction of Americans’ privacy. Perhaps HBO should consider making this a series. They would never run out of material.

Catastrophic ‘Salt Typhoon’ Hack Shows Why a Backdoor to Encryption Would be a Gift to China
11/25/2024
Former Sen. Patrick Leahy’s Prescient Warning

It is widely reported that the breach of U.S. telecom systems allowed China’s Salt Typhoon group of hackers to listen in on the conversations of senior national security officials and political figures, including Donald Trump and J.D. Vance during the recent presidential campaign. In fact, they may still be spying on senior U.S. officials. Sen. Mark Warner (D-VA), Chairman of the Senate Intelligence Committee, on Thursday said that China’s hack was “the worst telecom hack in our nation’s history – by far.” Warner, himself a former telecom executive, said that the hack across the systems of multiple internet service providers is ongoing, and that the “barn door is still wide open, or mostly open.” The only surprise, really, is that this was a surprise. When our government creates a pathway to spy on American citizens, that same pathway is sure to be exploited by foreign spies. The FBI believes the hackers entered the system that enables court-ordered taps on voice calls and texts of Americans suspected of a crime. These systems are put in place by internet service providers like AT&T, Verizon, and other telecoms to allow the government to search for evidence, a practice authorized by the 1994 Communications Assistance for Law Enforcement Act. Thus the system of domestic surveillance used by the FBI and law enforcement has been reverse-engineered by Chinese intelligence to turn that system back on our government. This point is brought home by FBI documents PPSA obtained from a Freedom of Information Act request that reveal a prescient question put to FBI Director Christopher Wray by then-Sen. Patrick Leahy in 2018. The Vermont Democrat, now retired, anticipated the recent catastrophic breach of U.S. telecom systems. In his question to Director Wray, Sen. Leahy asked: “The FBI is reportedly renewing a push for legal authority to force decryption tools into smartphones and other devices.
I am concerned this sort of ‘exceptional access’ system would introduce inherent vulnerabilities and weaken security for everyone …” The New York Times reports that according to the FBI, the Salt Typhoon hack resulted from China’s theft of passwords used by law enforcement to enact court-ordered surveillance. But Sen. Leahy correctly identified the danger of creating such domestic surveillance systems and the next possible cause of an even more catastrophic breach. He argued that a backdoor to encrypted services would provide a point of entry that could eventually be used by foreign intelligence. The imperviousness of encryption was confirmed by authorities who believe that China was not able to listen in on conversations over WhatsApp and Signal, which encrypt consumers’ communications. While China’s hackers could intercept text messages between iPhones and Android phones, they could not intercept messages sent between iPhones over Apple’s iMessage system, which is also encrypted. Leahy asked another prescient question: “If we require U.S. technology companies to build ‘backdoors’ into their products, then what do you expect Apple to do when the Chinese government demands that Apple help unlock the iPhone of a peaceful political or religious dissident in China?” Sen. Leahy was right: Encryption works to keep people here and abroad safe from tyrants. We should heed his warning – carving a backdoor into encrypted communications creates a doorway anyone might walk through.

The CFPB Curbs Worker Surveillance – Will the Government Live Up to Its Own Privacy Standards?
10/31/2024
The Consumer Financial Protection Bureau (CFPB) is warning businesses that use of “black-box AI” or algorithmic scores about workers must be consistent with the rules of the Fair Credit Reporting Act. This means employers must obtain workers’ consent, provide transparency when data is used for an adverse decision, and make sure that workers have a chance to dispute inaccurate reports. That’s a good move for privacy, as far as it goes. The problem is, it doesn’t go nearly far enough because the federal government doesn’t impose these same standards on itself. First, PPSA agrees with the tightening of employers’ use of digital dossiers and AI monitoring. Whenever someone applies for a job, the prospective employer will usually perform a search about them on a common background-check site. It is not surprising that businesses want to know about applicants’ credit histories, to check on their reliability and conscientiousness, and to learn if they have a possible criminal past. But third-party consumer reports offer much more than those obvious background checks. Some sites, for example, are used to predict the likelihood that you might favor union membership. More invasive still are apps that many employers are requiring new employees to install on personal phones to monitor their conduct and assess their performance. Decisions to reassign employees, promote or demote them, or fire them are increasingly coming from automated systems, machines that often lack context or key information. Federal agencies, from the CFPB to the Federal Trade Commission, have not been shy about calling out businesses for privacy violations like these for years now. Too bad our government cannot live up to its own high standards. The government freely acknowledges that a dozen agencies – ranging from the FBI to the IRS, Department of Homeland Security, and the Pentagon – routinely buy the most intimate and personal data of Americans scraped from our apps and sold by shadowy data brokers.
The data the government collects on us is far more extensive than anything a commercial data aggregator could find. The government can track our web browsing, those we communicate with, what we search for online, and our geolocation histories. This is far more invasive and intrusive than anything private businesses are doing in screening applicants and monitoring employees. Worse, the government observes no obligation to reveal how this data might be used to compile evidence against a criminal defendant in a courtroom, or if agencies are using purchased data to create dossiers on Americans to predict their future behavior. There is no equivalent of the Fair Credit Reporting Act when it comes to the government’s use of our data. But there is the Fourth Amendment Is Not For Sale Act, a bill that would require the government to obtain a probable cause warrant – as required by the Constitution – before inspecting our digital lives. The Fourth Amendment Is Not For Sale Act passed the House this year and awaits action in the U.S. Senate. Passing it in the coming lame-duck session would be one way to remove the hypocrisy of the federal government on the digital surveillance of American workers, consumers, and citizens.

Doxing – the practice of exposing a person’s location and home address – can have deadly consequences. This lesson was brought home in July 2020 when a deranged man with a grudge against federal judge Esther Salas went to her New Jersey home dressed as a deliveryman, carrying a gun. The judge’s 20-year-old son, Daniel Anderl, a Catholic University student, opened the door only to be shot dead as he moved forward to shield his parents. Out of this tragedy came Daniel’s Law, a New Jersey statute advocated by Judge Salas to allow law enforcement, government personnel, judges, and their families to have their information completely removed from commercial data brokers.
We’re accustomed to the idea that ad-selling social media platforms and government can track us. Now Krebs on Security is reporting that a new digital service neuters this law and exposes potentially any American to location tracking by any subscriber. This tracking service is enabled by Babel Street, which has a core product that Krebs writes “allows customers to draw a digital polygon around nearly any location on a map of the world, and view a . . . time-lapse history of the mobile devices coming in and out of the specified area.” Krebs reports that a private investigator demonstrated the danger of this technology by discreetly using it to determine the home address and daily movements of mobile devices belonging to multiple New Jersey police officers whose families have already faced significant harassment and death threats. This is just one more sign that in-depth surveillance that was once the province of giant social media companies and state actors is falling into the hands of garden-variety stalkers, snoops, and criminals. PPSA calls on New Jersey legislators, who are ideally positioned to lead a national response to this technology, to develop laws and policy solutions that continue to protect law enforcement, judges, and everyday citizens in their daily rounds and in their homes.

Police Chief: “A Nice Curtain of Technology”

We’ve long followed the threat to privacy from the proliferation of automated license plate readers (ALPRs). Now the Institute for Justice has filed a lawsuit against the Norfolk, Virginia, police department for its use of this Orwellian technology. More than 5,000 communities across the country have installed the most popular ALPR brand, Flock, which records and keeps the daily movements of American citizens driving in their cars. Norfolk is an enthusiastic adopter of Flock technology, with a network of 172 advanced cameras that make it impossible for citizens to go anywhere in their city without being followed and recorded.
Flock applies artificial intelligence software to its national database of billions of images, adding advanced search and intelligence functions. “This sort of tracking that would have taken days of effort, multiple officers, and significant resources just a decade ago now takes just a few mouse clicks,” the Institute for Justice tells a federal court in its lawsuit. “City officers can output a list of locations a car has been seen, create lists of cars that visited specific locations, and even track cars that are often seen together.” No wonder the Norfolk police chief calls Flock’s network “a nice curtain of technology.” The Institute for Justice has a different characterization, calling this network “172 unblinking eyes.” Americans are used to the idea of being occasionally spotted by a friend or neighbor while on the road, but no one expects to have every mile of one’s daily movements imaged and recorded. The nefarious nature of this technology is revealed in the concerns of the two Norfolk-area plaintiffs named in the lawsuit.
“If the Flock cameras record Lee going straight through the intersection outside his neighborhood, for example, the NPD (Norfolk Police Department) can infer that he is going to his daughter’s school. If the cameras capture him turning right, the NPD can infer that he is going to the shooting range. If the cameras capture him turning left, the NPD can infer that he is going to the grocery store […] “Lee finds all of this deeply intrusive. Even if ordinary people see him out and about from time to time, Lee does not expect and does not want people – much less government officials – tracking his every movement over 30 days or more and analyzing that data the way the Flock cameras allow the NPD and other Flock users to do.”
“As a healthcare worker, Crystal is legally and ethically required to protect her clients’ privacy,” the filing states. “She also understands that her clients expect her to maintain their confidentiality … If she failed to live up to those expectations, her business would suffer.” Both plaintiffs are concerned that another Flock user, perhaps a commercial entity, might misuse the records of their movements. They are also worried about “the potential that Defendants, Flock users, or third-party hackers could misuse her information.” No warrants or permissions are needed for Norfolk officers to freely access the system. The Institute for Justice was shrewd in its selection of venues. Norfolk is in the jurisdiction of the federal Fourth Circuit Court of Appeals, which in 2021 struck down Baltimore’s use of aerial surveillance images in a case called Leaders of a Beautiful Struggle v. Baltimore Police Department. “The Beautiful Struggle opinion was about a relatively, comparatively, crude system, just a drone that was flying in the air for 12 hours a day that at most had a couple of pixels that made it hard to identify anyone,” Institute for Justice attorney Robert Frommer told 404 Media. “By contrast, anyone with the Flock cameras has a crystal-clear record of your car, a digital fingerprint that can track anywhere you go. The police chief even said you can’t really go anywhere in Norfolk without being caught by one of these cameras.” The consistent principle from the Fourth Circuit’s precedent should make it clear, in the words of the Institute for Justice, that tracking a driver “to church, to a doctor’s office, to a drug-abuse treatment clinic, to a political protest,” is unconstitutional.

Government Promises to Protect Personal Data While Collecting and Using Americans’ Personal Data
10/21/2024
Digital data, especially when parsed through the analytical lens of AI, can detail almost every element of our personal lives, from our relationships to our location histories, to data about our health, financial stability, religious practices, and political beliefs and activities.
A new blog post from the White House details a Request for Information (RFI) from OMB’s Office of Information and Regulatory Affairs (OIRA) seeking to get its arms around this practice. The RFI seeks public input on “Federal agency collection, processing, maintenance, use, sharing, dissemination, and disposition of commercially available information (CAI) containing personally identifiable information (PII).” In plain language, the government is seeking to understand how agencies – from the FBI to the IRS, the Department of Homeland Security, and the Pentagon – collect and use our personal information scraped from our apps and sold by data brokers to agencies. This request for public input follows last year’s Executive Order 14110, which represented that “the Federal Government will ensure that the collection, use, and retention of data is lawful, is secure, and mitigates privacy and confidentiality risks.” What to make of this? On the one hand, we commend the White House and intelligence agencies for being proactive for once on understanding the privacy risks of the mass purchase of Americans’ data. On the other hand, we can’t shake out of our heads Ronald Reagan’s joke about the most terrifying words in the English language: “I’m from the government and I’m here to help.” The blog, written by OIRA administrator Richard L. Revesz, points out that procuring “CAI containing PII from third parties, such as data brokers, for use with AI and for other purposes, raises privacy concerns stemming from a lack of transparency with respect to the collection and processing of high volumes of potentially sensitive information.” Revesz is correct that AI elevates the privacy risks of data purchases. 
The blog post suggests the government might take “additional steps to apply the framework of privacy law and policy to mitigate the risks exacerbated by new technology.” Until we have clear rules that expressly lay out how CAI is acquired and managed within the executive branch, you’ll forgive us for withholding our applause. This year’s “Policy Framework for Commercially Available Information,” released by Director of National Intelligence Avril Haines, ordered all 18 intelligence agencies to devise safeguards “tailored to the sensitivity of the information” and produce an annual report on how each agency uses such data. It is hard to say if Haines’ directive represents a new awareness of the Orwellian potential of these technologies, or if it is political theater to head off legislative efforts at reform. Earlier this year, the U.S. House of Representatives passed the Fourth Amendment Is Not For Sale Act, which would subject purchased data to the same standard as any other personal information – a probable cause warrant. The Senate should do the same. The government’s recognition of the sensitivity of CAI and accompanying PII is certainly a step in the right direction. It is also clear, however, that intelligence agencies have every intention of continuing to use this information for their own purposes, despite lofty proclamations and vague policy goals about Americans’ privacy. To quote Ronald Reagan again, when it comes to the promises of the intel community, we should “trust but verify.”

A Federal Trade Commission staff report released last week got huge play in the media. We were bombarded by stories about the FTC’s report that Meta, YouTube, and other major social media and video streaming companies are lax in controlling and protecting the data privacy of users, especially children and teens.
There is much in this report to consider, especially where children are concerned. But there was also a lot that was off-target and missing. The FTC’s report blithely recommended that social media and video streaming companies abandon their practice of tracking users’ data. This would be no small thing. Without the tracking that allows Facebook to know that you’re an aficionado of, say, old movie posters, you would not receive ads in your feed trying to sell you just that – old movie posters. Forbid the trade-off in which we give away a bit of our privacy for a free service, and overnight large social media companies would collapse. Countless small businesses would lose the ability to go toe-to-toe with big brands. Trillions of dollars in equity would evaporate, degrading the portfolios of retirees and putting millions of Americans out of work. In a crisply written concurring and dissenting statement, FTC Commissioner Andrew Ferguson notes that the FTC report “reveals this mass data collection has been very difficult to avoid. Many of these products are necessities of modern life. They are critical access points to markets, social engagement, and civil society.” Ferguson looks beyond what the advertising algorithms of Meta or Google do with our data. He looks to how our data is combined with information from a host of sources, including our location histories from our smartphones, to enable surveillance. It is this combination of data, increasingly woven together by AI, that creates such comprehensive portraits of our activities, beliefs, and interests. These digital dossiers can then be put up for sale by a third-party data broker to any willing buyer. Ferguson writes: “Sometimes this information remains internal to the company that collected it. But often, they share the information with affiliates or other third parties, including entities in foreign countries like China, over which the collecting company exercises no control.
This information is often retained indefinitely, and American users generally have no legal right to demand that their personal information be deleted. Companies often aggregate and anonymize collected data, but the information can often be reassembled to identify the user with trivial effort. “This massive collection, repackaging, sharing, and retention of our private and intimate details puts Americans at great risk. Bad actors can buy or steal the data and use them to target Americans for all sorts of crimes and scams. Others, including foreign governments who routinely purchase Americans’ information, can use it to damage the reputations of Americans by releasing, or threatening to release, their most private details, like their browsing histories, sexual interests, private political views, and so forth.” We would add that the FBI, IRS, and a host of other federal law enforcement and intelligence agencies also purchase our “dossiers” and access them without warrants. As dangerous as China is, it cannot send a SWAT team to break down our doors at dawn. Only our government can do that. The FTC report ignores this concern, focusing on the commercial abuses of digital surveillance while ignoring its usefulness to an American surveillance state. It is no small irony that a federal government report on digital surveillance doesn’t concern itself with how that surveillance is routinely abused by government. This insight gives us all the more reason to urge the U.S. Senate to follow the example of the House and pass the Fourth Amendment Is Not For Sale Act. This legislation requires the FBI and other federal agencies to obtain a warrant before they can purchase Americans’ personal data, including internet records and location histories. It is also time for Congress to shine a bright light on data brokers to identify all the customers – commercial, foreign, and federal – who are watching our digital lives. 
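Ferguson’s observation that anonymized data “can often be reassembled to identify the user with trivial effort” is easy to demonstrate. Below is a minimal sketch of a so-called linkage attack: an “anonymized” dataset with names removed but quasi-identifiers (ZIP code, birth date, sex) retained is joined against a separately purchased roster that still carries names. Every record, field name, and dataset here is invented for illustration.

```python
# Minimal sketch of a "linkage attack": stripping names does not
# anonymize a record if quasi-identifiers remain, because those same
# identifiers appear in other datasets that do carry names.
# All records below are fabricated.

anonymized_visits = [
    {"zip": "20500", "birth": "1970-03-01", "sex": "F", "visited": "oncology clinic"},
    {"zip": "22203", "birth": "1985-11-12", "sex": "M", "visited": "bankruptcy attorney"},
]

public_roster = [  # e.g., a voter file or marketing list sold openly
    {"name": "Jane Doe", "zip": "20500", "birth": "1970-03-01", "sex": "F"},
    {"name": "John Roe", "zip": "22203", "birth": "1985-11-12", "sex": "M"},
]

def reidentify(visits, roster):
    # Index the roster by the quasi-identifier triple, then join the
    # "anonymized" visits against it to recover names.
    index = {(p["zip"], p["birth"], p["sex"]): p["name"] for p in roster}
    matches = []
    for v in visits:
        key = (v["zip"], v["birth"], v["sex"])
        if key in index:
            matches.append({"name": index[key], "visited": v["visited"]})
    return matches

matches = reidentify(anonymized_visits, public_roster)
```

With only two toy records the join is trivial, but the same few lines scale to millions of rows; that is the sense in which reassembly takes “trivial effort.”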
In George Orwell’s Nineteen Eighty-Four, the walls of every domicile in Oceania bristle with microphones and cameras that catch the residents’ every utterance and action. In 2024, we have done Big Brother’s work for him. We have helpfully installed microphones and cameras around the interior of our homes embedded in our computers, laptops, smartphones, and tablets. Might someone be selling our conversations to companies and the federal government without our consent?
Few worry about this because of explicit promises by tech companies not to enable their microphones to be used against us. Google, Amazon, and Meta are firm in denying that they eavesdrop on us. For example, Meta states that “sometimes ads can be so specific, it seems like we must be listening to your conversations through our microphones, but we’re not.” Still, many of us have had the spooky sensation of talking about something random but specific – perhaps a desire to buy a leather couch or take a trip to Cancun – only to find our social media feeds littered with ads for couches and resorts in Cancun. The tech companies’ explanation for this is that we sometimes perform online searches for things, forget about them, and then mistakenly attribute the ads in our social media feeds to a conversation. We hope that’s the case. But now we’re not so sure. 404 Media has acquired a slide deck from Cox Media Group (CMG) that claims its “Active-Listening” software can combine AI with our private utterances captured by 470-plus sources to “improve campaign deployment, targeting and performance.” One CMG slide says, “processing voice data with behavioral data identifies an audience who is ‘ready to buy.’” CMG claims to have Meta’s Facebook, Google, and Amazon as clients. After this story broke, the big tech companies stoutly denied that they engage in this practice and expressed their willingness to act against any marketing partner that eavesdrops. This leaves open the possibility that CMG and other actors are gathering voice data from microphones other than from those of their big tech clients. What these marketers want to do is to predict what we will want and send us an ad at the precise time we’re thinking about a given product. The danger is that this same technology in the hands of government could be used to police people at home. This may sound outlandish. 
Yet consider that a half-dozen federal agencies – ranging from the FBI to the IRS – already routinely purchase our geolocation, internet activity, and other sensitive information we generate on our social media platforms – and then access it freely, without a warrant. Considering what our government already does with our digital data, the addition of our home speech would be an extension of what is already a radical new form of surveillance. Congress should find out exactly what marketers like CMG are up to. As an urgent matter of oversight, Congress should also determine if any federal agencies are purchasing home voice data. And while they’re at it, the Senate should follow the example of the House and pass the Fourth Amendment Is Not For Sale Act, which would stop the practice of the warrantless purchasing of Americans’ personal, digital information by law enforcement and intelligence agencies. The U.S. Department of Justice is pioneering ever-more dismissive gestures in its quest to fob off lawful Freedom of Information Act (FOIA) requests seeking to shed light on government surveillance. One PPSA FOIA request, aimed at uncovering details about the DOJ's purchase of Americans’ commercially available data from third-party data brokers, sets a new record for unprofessionalism.
Until now, we had become used to the Catch-22 denials in which the government refuses to even conduct a search for responsive records with a Glomar response. This judge-made doctrine allows the withholding of requested information if it is deemed so sensitive that the government can neither confirm nor deny its existence. But when the government issues a Glomar response without first conducting a search, we can only ask: How could they know that if they haven’t even searched for the records? DOJ’s latest response, which arrived this week, however, is a personal best. The DOJ’s response shows that it didn’t bother to even read our FOIA request. Our request sought records detailing the DOJ's acquisition of data on U.S. persons and businesses, including the amounts spent, the sources of the data, and the categories of information obtained. This request was clearly articulated and included a list of DOJ components likely to have the relevant records. Despite this clarity, DOJ responded by stating that the request did not sufficiently identify the records. DOJ's refusal to conduct a proper search appears to be based on a misinterpretation, either genuine or strategic, of our request. DOJ claimed an inability to identify the component responsible for handling a case based solely on the “name” of the case or organization. However, PPSA's request did not rely on any such identifiers. DOJ's response indicates that it may have resorted to a generic form letter to reject our request without actually reviewing its contents. Precedents like Miller v. Casey and Nation Magazine v. U.S. Customs Service establish that an agency must read requests “as drafted” and interpret them in a way that maximizes the likelihood of uncovering relevant documents. DOJ’s blanket dismissal is not just a bureaucratic oversight. It is an affront to the principles of openness and accountability that FOIA is designed to uphold. 
If the DOJ, the agency responsible for upholding the law, continues to disregard its legal obligations, it sets a dangerous precedent for all government agencies. The good news is that DOJ’s Office of Information Policy has now ordered staff to conduct a proper search in response to PPSA’s appeal, a directive that should have been unnecessary. It remains to be seen whether the DOJ will comply meaningfully or continue to obstruct … perhaps with another cookie-cutter Glomar response. How far might DOJ go to withhold basic information about its purchasing of Americans’ sensitive and personal information? In a Glomar response to one of our FOIA requests in 2023, DOJ came back with 40 redacted pages from a certain Mr. or Mrs. Blank. They gave us nothing but a sea of black on each page. The only unredacted line in the entire set of documents was: “Hope that’s helpful.” This latest response is just another sign that those on the other end of our FOIA requests are treating their responsibilities with flippancy. This is unfortunate because the American public deserves to know the extent to which our government is purchasing and warrantlessly accessing our most private information. Filing these requests and responding to non-responsive responses administratively and in court is laborious and at times frustrating work. But somebody has to do it – and PPSA will continue to hold the government accountable. When we’re inside our car, we feel like we’re in our sanctuary. Only the shower is more private. Both are perfectly acceptable places to sing the Bee Gees’ “Stayin’ Alive” without fear of retribution.
And yet the inside of your car is not as private as you might think. We’ve reported on the host of surveillance technologies built into the modern car – from tracking your movement and current location, to proposed microphones and cameras to prevent drunk driving, to seats that report your weight. All this data is transmitted and can be legally sold by data brokers to commercial interests as well as a host of government agencies. This data can also be misused by individuals, as when a woman going through divorce proceedings learned that her ex was stalking her by following the movements of her Mercedes. Now another way to track our behavior and movements is being added through a national plan announced by the U.S. Department of Transportation called “vehicle-to-everything” technology, or V2X. Kimberly Adams of marketplace.org reports that this technology, to be deployed on 50 percent of the National Highway System and 40 percent of the country’s intersections by 2031, will allow cars and trucks to “talk” to each other, coordinating to reduce the risk of collision. V2X will smooth out traffic in other ways, holding traffic lights green for emergency vehicles and sending out automatic alerts about icy roads. V2X is also yet one more way to collect a big bucket of data about Americans that can be purchased and warrantlessly accessed by federal intelligence and law enforcement agencies. Sens. Ron Wyden (D-OR) and Cynthia Lummis (R-WY), and Rep. Ro Khanna (D-CA), have addressed what government can do with car data under proposed legislation, the “Closing the Warrantless Digital Car Search Loophole Act.” This bill would require law enforcement to obtain a warrant based on probable cause before searching data from any vehicle that does not require a commercial license. But the threat to privacy from V2X comes not just from cars that talk to each other, but also from V2X’s highway infrastructure that enables this digital conversation. 
This addition to the rapid expansion of data collection on Americans is one more reason why the Senate should follow the example of the House and pass the Fourth Amendment Is Not For Sale Act, which would end the warrantless collection of Americans’ purchased data by the government. We can embrace technologies like V2X that can save lives, while at the same time making sure that the personal information it collects about us is not retained and allowed to be purchased by snoops, whether government agents or stalkers. What NPD’s Enormous Hack Tells Us About the Reckless Collection of Our Data by Federal Agencies (8/23/2024)
How to See if Your Social Security Number Was Stolen Were your Social Security number and other personal identifying information among the 2.9 billion records that hackers stole from National Public Data?
Hackers can seize our Social Security numbers and much more, not only from large commercial sites like National Public Data, but also from government sites and the data brokers who sell our personal information to federal agencies. Such correlated data can be used to impersonate you with the financial services industry, from credit card providers to bank loan officers. And once your Social Security number is stolen, it is stolen for life. To find out if your Social Security number and other personal information was among those taken in the National Public Data hack, go to npd.pentester.com. It has been obvious for more than a decade now that the Social Security number is a flawed approach to identification. It is a simple nine-digit number. A fraudster who knows the last few digits of your Social Security number, what year you were born, and where, can likely calculate your number. Because your Social Security number is so often used by dozens of institutions, it is bound to be hacked and sold on the dark web at some point in your life. Yet this insecure form of identification remains the standard. Is there a better way? Sophie Bushwick asked this question in a 2021 Scientific American article. She reported that one proposed solution is a cryptographic key, those long strings of numbers and symbols that we all hate to use. Or a USB key could be plugged into your computer to authenticate you as its owner. Scans of your fingerprints, or face, could also authenticate your identity. The problem is that any one of these methods can also be hacked. Even biometrics is vulnerable, since this technology reduces your face to an algorithm. Once the algorithm for your face or fingerprint (or even worse, your iris, which is the most complex and unique biometric identifier of them all) is stolen, your own body can be used against you. There are no perfect solutions, but multifactor authentication comes the closest. 
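A minimal sketch of how such multifactor verification works in practice: a login succeeds only when a knowledge factor (password), a possession factor (a time-based one-time code of the kind generated on or texted to a phone), and an inherence factor (a biometric match decision) all check out. The function and variable names here are illustrative, not any vendor’s API; the one-time code follows the widely used RFC 6238 (TOTP) construction.

```python
import hashlib
import hmac
import secrets
import struct
import time

def hash_password(password, salt):
    # Slow, salted hash: a stolen credential database cannot be reversed cheaply.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def totp(secret, step=30, digits=6, now=None):
    # RFC 6238-style time-based one-time passcode derived from a shared
    # secret and the current 30-second interval.
    counter = int((time.time() if now is None else now) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_login(password, stored_hash, salt, otp_entered, otp_secret, biometric_ok):
    # All three factors must pass; stealing any single one is not enough.
    knows = hmac.compare_digest(hash_password(password, salt), stored_hash)
    has = hmac.compare_digest(otp_entered, totp(otp_secret))
    return knows and has and biometric_ok

salt = secrets.token_bytes(16)
otp_secret = secrets.token_bytes(20)
stored = hash_password("correct horse battery staple", salt)
ok = verify_login("correct horse battery staple", stored, salt,
                  totp(otp_secret), otp_secret, biometric_ok=True)
```

This is why the text calls assembling all the elements a “prohibitively difficult chore”: a hacker who phishes the password still needs the phone’s rotating code and a live biometric match.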
This technique might combine a one-time passcode texted to your phone, a biometric identifier like a fingerprint, and a complex password. Finding and assembling all these elements, while possible, would be a prohibitively difficult chore for many if not most hackers. Strengthening consumer identification, however, is only one part of the problem. Our personal information is insecure in other ways. A dozen federal agencies, including the FBI, IRS, Department of Homeland Security, and Department of Defense, routinely purchase Americans’ personal data. These purchases include not just our identifying information, but also our communications, social media posts, and our daily movements – scraped from our apps and sold by data brokers. How secure is all the data held by those third-party brokers? How secure is the government’s database of this vast trove of personal data, which contains the most intimate details of our lives? These are urgent questions for Congress to ask. Congress should also resist the persistent requests from the Department of Justice to compel backdoors for commercial encryption, beginning with Apple’s iPhone. The National Public Data hack reveals that the forced creation of backdoors for encryption would create new pathways for even more hacks, as well as warrantless government snooping. Finally, the Senate should follow up on the House passage of the Fourth Amendment Is Not For Sale Act, which would prohibit government collection of our personal information without a warrant. Protect your data by calling or emailing your senators: Tell them to pass the Fourth Amendment Is Not For Sale Act. Our data will only become more secure if we, as consumers and citizens, demand it. As the 2024 elections loom, legislative progress in Congress will likely come to a crawl before the end of meteorological summer. 
But some unfinished business deserves our attention, even if it should get pushed out to a lame duck session in late fall or to the agenda of the next Congress.
One is a bipartisan proposal now under review that would forbid federal government agencies from strong-arming technology companies into providing encryption keys to break open the private communications of their customers. “Efforts to give the government back-door access around encryption is no different than the government pressuring every locksmith and lock maker to give it an extra key to every home and apartment,” said Erik Jaffe, President of PPSA. Protecting encryption is one of the most important pro-privacy measures Congress could take up now. Millions of consumers have enjoyed end-to-end encryption, from Apple iPhone data to communications apps like Telegram, Signal, and WhatsApp. This makes their communications relatively invulnerable to being opened by an unauthorized person. The Department of Justice has long demanded that companies, Apple especially, provide the government with an encryption key to catch wrongdoers and terrorists. The reality is that encryption protects people from harm. Any encryption backdoor is bound to get out into the wild. Encryption protects the abused spouse from the abuser. It protects children from malicious misuse of their messages. Abroad, it protects dissidents from tyrants and journalists from murderous cartels. At home, it even protects the communications of law enforcement from criminals. The case for encryption is so strong that the European Court of Human Rights rejected a Russian law that would have broken encryption because it would violate the human right to privacy. (Let us hope this ruling puts the brakes on recent measures in the UK and the EU to adopt similarly intrusive measures.) Yet the federal government continues to demand that private companies provide a key to their encryption. 
The State of Nevada’s attorney general went to court to try to force Meta to stop offering encrypted messages on Facebook Messenger on the theory that it will protect users under 18, despite the evidence that breaking encryption exposes children to threats. PPSA urges the House to draft strong legislation protecting encryption, either as a bill or as an amendment. It is time for the people’s representatives to get ahead of the jawboning demands of the government to coerce honest businesses into giving away their customers’ keys. From your browsing history to your physical location, every aspect of your digital footprint can be tracked and used to build a comprehensive profile of your private life – including your political, religious, and family activities, as well as the most intimate details of your personal life. This information is invaluable not only to advertisers – which want to place ads in your social media feeds – but also to governments, which often have malevolent intentions.
Hostile governments might weaponize your personal digital trail for blackmail or embarrassment. Imagine a CEO or inventor being blackmailed into revealing trade secrets. Or, if you work in the military or for an agency or contractor involved in national security, your personal data might be used to disrupt your life at the beginning of an international crisis. Imagine a CIA officer receiving what appears to be an urgent message of distress from her daughter, or an Air Force officer being told in the voice of his commanding officer not to go to the base but to shelter in place. And then multiply that effect by the millions of Americans in the crosshairs of a cyberattack. Congress and the Biden Administration acted against these possibilities this spring by including in the Israel/Ukraine weapons appropriation measure a provision banning data brokers from exporting Americans' personal data to China, Russia, North Korea, and Iran. However, this ban had notable loopholes. Adversary countries could still purchase data indirectly through middlemen data brokers in third countries or establish front companies to circumvent the ban. To attempt to close these loopholes, Sens. Ron Wyden (D-OR) and Cynthia Lummis (R-WY) have offered an amendment to the National Defense Authorization Act to further tighten the law by restricting data exports to problematic countries, identified by the Secretary of Commerce, that lack robust privacy laws to protect Americans' data from being sold and exported to adversaries. This measure will help reduce the flow of Americans’ personal data through third parties and middlemen to regimes that have nothing but the worst of intentions. PPSA applauds Sens. Wyden and Lummis for working to tighten the pipeline of Americans’ data flowing out into the world. Their proposal is a needed one and deserves the vocal support of every American who cares about privacy. 
PPSA has fired off a succession of Freedom of Information Act (FOIA) requests to leading federal law enforcement and intelligence agencies. These FOIAs seek critical details about the government’s purchasing of Americans’ most sensitive and personal data scraped from apps and sold by data brokers.
PPSA’s FOIA requests were sent to the Department of Justice and the FBI, the Department of Homeland Security, the CIA, the Defense Intelligence Agency, the National Security Agency, and the Office of the Director of National Intelligence, asking these agencies to reveal the broad outlines of how they collect highly private information of Americans. These digital traces purchased by the government reveal Americans’ familial, romantic, professional, religious, and political associations. This practice is often called the “data broker loophole” because it allows the government to bypass the usual judicial oversight and Fourth Amendment warrant requirement for obtaining personal information. “Every American should be deeply concerned about the extent to which U.S. law enforcement and intelligence agencies are collecting the details of Americans’ personal lives,” said Gene Schaerr, PPSA general counsel. “This collection happens without individuals’ knowledge, without probable cause, and without significant judicial oversight. The information collected is often detailed, extensive, and easily compiled, posing an immense threat to the personal privacy of every citizen.” To shed light on these practices, PPSA is requesting these agencies produce records concerning:
Shortly after the House passed the Fourth Amendment Is Not For Sale Act, which would require the government to obtain probable cause warrants before collecting Americans’ personal data, Avril Haines, Director of National Intelligence, ordered all 18 intelligence agencies to devise safeguards “tailored to the sensitivity of the information.” She also directed them to produce an annual report on how each agency uses such data. PPSA believes that revealing, in broad categories, the size, scope, sources, and types of data collected by agencies, would be a good first step in Director Haines’ effort to provide more transparency on data purchases. The recent passage of the Fourth Amendment Is Not For Sale Act by the House marks a bold and momentous step toward protecting Americans' privacy from unwarranted government intrusion. This legislation mandates that federal law enforcement and intelligence agencies, such as the FBI and CIA, must obtain a probable cause warrant before purchasing Americans’ personal data from brokers. This requirement closes a loophole that allows agencies to compromise the privacy of Americans and bypass constitutional safeguards.
While this act primarily targets law enforcement and intelligence agencies, it is crucial to extend these protections to all federal agencies. Non-law enforcement entities like the Treasury Department, IRS, and Department of Health and Human Services are equally involved in the purchase of Americans' personal data. The growing appetite among these agencies to track citizens' financial data, sensitive medical issues, and personal lives highlights the need for a comprehensive warrant requirement across the federal government. How strong is that appetite? The Financial Crimes Enforcement Network (FinCEN), operating under the Treasury Department, exemplifies the ambitious scope of federal surveillance. Through initiatives like the Corporate Transparency Act, FinCEN now requires small businesses to disclose information about their owners. This data collection is ostensibly for combating money laundering, though it seems unlikely that the cut-outs and money launderers for cocaine dealers and human traffickers will hesitate to lie on an official form. This data collection does pose significant privacy risks by giving multiple federal agencies warrantless access to a vast database of personal information of Americans who have done nothing wrong. The potential consequences of such data collection are severe. The National Small Business Association reports that the Corporate Transparency Act could criminalize small business owners for simple mistakes in reporting, with penalties including fines and up to two years in prison. This overreach underscores the broader issue of federal agencies wielding excessive surveillance powers without adequate checks and balances. Another alarming example is the dragnet financial surveillance revealed by the House Judiciary Committee and its Select Subcommittee on the Weaponization of the Federal Government. 
The FBI, in collaboration with major financial institutions, conducted sweeping investigations into individuals' financial transactions based on perceptions of their political leanings. This surveillance was conducted without probable cause or warrants, targeting ordinary Americans for exercising their constitutional rights. Without statutory guardrails, such surveillance could be picked up by non-law enforcement agencies like FinCEN, using purchased digital data. These examples demonstrate the appetite of all government agencies for our personal information. Allowing them to also buy our most sensitive and personal information from data brokers, which is happening now, is about as absolute a violation of Americans’ privacy as one can imagine. Only listening devices in every home could be more intrusive. Such practices are reminiscent of the general warrants of the colonial era, the very abuses the Fourth Amendment was designed to prevent. The indiscriminate collection and scrutiny of personal data without individualized suspicion erode the foundational principles of privacy and due process. The Fourth Amendment Is Not For Sale Act is a powerful and necessary step to end these abuses. Congress should also consider broadening the scope to ensure all federal agencies are held to the same standard. Now that the House has passed the Fourth Amendment Is Not for Sale Act, senators would do well to review new concessions from the intelligence community on how it treats Americans’ purchased data. This is progress, but it points to how much more needs to be done to protect privacy.
Avril Haines, Director of National Intelligence (DNI), released a “Policy Framework for Commercially Available Information,” or CAI. In plain English, CAI is all the digital data scraped from our apps and sold to federal agencies, ranging from the FBI to the IRS, Department of Homeland Security, and Department of Defense. From purchased digital data, federal agents can instantly access almost every detail of our personal lives, from our relationships to our location histories, to data about our health, financial stability, religious practices, and politics. Federal purchases of Americans’ data don’t merely violate Americans’ privacy, they kick down any semblance of it. There are signs that the intelligence community itself is coming to realize just how extreme its practices are. Last summer, Director Haines released an unusually frank report from an internal panel about the dangers of CAI. We wrote at the time: “Unlike most government documents, this report is remarkably self-aware and willing to explore the dangers” of data purchases. The panel admitted that this data can be used to “facilitate blackmail, stalking, harassment, and public shaming.” Director Haines’ new policy orders all 18 intelligence agencies to devise safeguards “tailored to the sensitivity of the information” and produce an annual report on how each agency uses such data. The policy also requires agencies:
Details of how each of the intelligence agencies will fulfill these aspirations – and actually handle “sensitive CAI” – are left up to them. Sen. Ron Wyden (D-OR) acknowledged that this new policy marks “an important step forward in starting to bring the intelligence community under a set of principles and policies, and in documenting all the various programs so that they can be overseen.” Journalist and author Byron Tau told Reason that the new policy is a notable change in the government’s stance. Earlier, “government lawyers were saying basically it’s anonymized, so no privacy problem here.” Critics were quick to point out that any of this data could be deanonymized with a few keystrokes. Now, Tau says, the new policy is “sort of a recognition that this data is actually sensitive, which is a bit of change.” Tau has it right – this is a bit of a change, but one with potentially big consequences. One of those consequences is that the public and Congress will have metrics that are at least suggestive of what data the intelligence community is purchasing and how it uses it. In the meantime, Sen. Wyden says, the framework of the new policy has an “absence of clear rules about what commercially available information can and cannot be purchased by the intelligence community.” Sen. Wyden adds that this absence “reinforces the need for Congress to pass legislation protecting the rights of Americans.” In other words, the Senate must pass the Fourth Amendment Is Not For Sale Act, which would subject purchased data to the same standard as any other personal information – a probable cause warrant. That alone would clarify the rules of the intelligence community. But Who Will Fine the FBI? The Federal Communications Commission on Monday fined four wireless carriers – Verizon, AT&T, Sprint, and T-Mobile – nearly $200 million for sharing the location data of customers, often in real-time, without their consent.
The case is an outgrowth of an investigation that began during the Trump Administration following public complaints that customers’ movements were being shared in real time with third-party companies. This is sensitive data. As FCC Chairwoman Jessica Rosenworcel said, consumers’ real-time location data reveals “where they go and who they are.” The carriers, the FCC declared, attempted to offload “obligations to obtain customer consent onto downstream recipients of location information, which in many instances meant that no valid customer consent was obtained.” The telecoms complain that the fines are excessive and ignore steps the companies have taken to cut off bad actors and improve customer privacy. But one remark from AT&T seemed to validate the FCC’s charge of “offloading.” A spokesman told The Wall Street Journal that AT&T was being held responsible for another company’s violations. A Verizon spokesman told The Journal that it had cut out a bad actor. These spokesmen are pointing to the role of data aggregators who resell access to consumer location data and other information to a host of commercial services that want to know our daily movements. The spokesmen seem to betray a long-held industry attitude that when it sells data, it also transfers liability, including the need for customer consent. Companies of every sort that sell data, not just telecoms, will now need to study this case closely and determine whether they should tighten control over what happens to customer data after it is sold. But there is one glaring omission in the FCC’s statement. It glides past the government’s own culpability in degrading consumer privacy. A dozen federal law enforcement and intelligence agencies, ranging from the FBI to the ATF, IRS, and Department of Homeland Security, routinely purchase and access Americans’ personal, digital information without bothering to secure a warrant. 
Concern over this practice is what led the House to recently pass The Fourth Amendment Is Not For Sale Act, which would require government agencies to obtain warrants before buying Americans’ location and other personal data from these same data brokers. It is good to see the FCC looking out for consumers. But who is going to fine the FBI? The risks and benefits of reverse searches are revealed in the capital murder case of Aaron Rayshan Wells. Although a security camera recorded a number of armed men entering a home in Texas where a murder took place, the lower portions of the men’s faces were covered. Wells was identified in this murder investigation by a reverse search enabled by geofencing.
A lower court upheld the geofence in this case as sufficiently narrow. It was near the location of a homicide and was within a precise timeframe on the day of the crime, 2:45-3:10 a.m. But the ACLU, in a recent amicus brief, identifies dangers with this reverse search, even within such strict limits. What are the principles at stake in this practice? Let’s start with the Fourth Amendment, which places hurdles government agents must clear before obtaining a warrant for a search – “no warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” The founders’ tight language was formed by experience. In colonial times, the King’s agents could act on a suspicion of smuggling by ransacking the homes of all the shippers in Boston. Forcing the government to name a place, and a person or thing to be searched and seized, was the founders’ neat solution to outlawing such general warrants altogether. It was an ingenious system, and it worked well until Michael Dimino came along. In 1995, this inventor received a patent for using GPS to locate cellphones. Within a few years, geofencing technology could instantly locate all the people with cellphones within a designated boundary at a specified time. This was a jackpot for law enforcement. If a bank robber was believed to have blended into a crowd, detectives could geofence that area and collect the phone numbers of everyone in that vicinity. Make a request to a telecom service provider, run computer checks on people with priors, and voilà, you have your suspect. Thus the technology-enabled practice of conducting a “reverse search” kicked into high gear. Multiple technologies assist in geofenced investigations. One is a “tower dump,” giving law enforcement access to records of all the devices connected to a specified cell tower during a period of time. Wi-Fi is also useful for geofencing. 
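The core of such a reverse search – take a provider’s location log and filter it down to every device inside a boundary during a time window, like the 2:45-3:10 a.m. fence above – can be sketched in a few lines. This is a toy illustration only; the device names, coordinates, and record format are all hypothetical, not how any carrier actually stores or discloses this data.

```python
import math
from datetime import datetime

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def reverse_search(records, center, radius_m, start, end):
    """Return device IDs seen inside the fence during the time window."""
    hits = []
    for rec in records:
        in_window = start <= rec["time"] <= end
        in_fence = haversine_m(rec["lat"], rec["lon"], *center) <= radius_m
        if in_window and in_fence:
            hits.append(rec["device_id"])
    return sorted(set(hits))

# Hypothetical location log: one device at the scene, one far away,
# one nearby but hours later.
records = [
    {"device_id": "phone-A", "lat": 37.5400, "lon": -77.4360,
     "time": datetime(2024, 5, 1, 2, 50)},
    {"device_id": "phone-B", "lat": 37.6000, "lon": -77.5000,
     "time": datetime(2024, 5, 1, 2, 55)},   # several km away: excluded
    {"device_id": "phone-C", "lat": 37.5401, "lon": -77.4361,
     "time": datetime(2024, 5, 1, 4, 0)},    # outside the window: excluded
]

center = (37.5400, -77.4360)  # hypothetical crime scene
hits = reverse_search(records, center, radius_m=150,
                      start=datetime(2024, 5, 1, 2, 45),
                      end=datetime(2024, 5, 1, 3, 10))
print(hits)  # only phone-A is inside both the fence and the window
```

The point of the sketch is how indiscriminate the first step is: the filter runs over everyone’s records, and only the fence’s radius and window decide who ends up on the list.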
When people connect their smartphones to Wi-Fi networks, they leave an exact log of their physical movements. Our Wi-Fi data also record our online searches, which can detail our health, mental health, and financial issues, as well as intimate relationships, and political and religious activities and beliefs. A new avenue for geofencing was created on Monday by President Biden when he signed into law a new measure that will give the government the ability to tap into data centers. The government can now enlist the secret cooperation of the provider of “any” service with access to communications equipment. This gives the FBI, U.S. intelligence agencies, and potentially local law enforcement a wide new field in which to conduct reverse searches based on location data. In these ways, modern technology imparts an instant, all-around understanding of hundreds of people in a targeted area, at a level of intimacy that Colonel John André could not have imagined. The only mystery is why criminals persist in carrying their phones with them when they commit crimes. Google was law enforcement’s ultimate go-to in geofencing. Warrants from magistrates authorizing geofence searches allowed the police to obtain personal location data from Google about large numbers of mobile-device users in a given area. Without any further judicial oversight, the breadth of the original warrant was routinely expanded or narrowed in private negotiations between the police and Google. In 2023, Google ended its storage of data that made geofencing possible. Google did this by shifting the storage of location data from its servers to users’ phones. For good measure, Google encrypted this data. But many avenues remain for a reverse search. On one hand, it is amazing that technology can so rapidly identify suspects and potentially solve a crime. 
On the other, technology also enables dragnet searches that pull in scores of innocent people, and potentially makes their personal lives an open book to investigators. The ACLU writes: “As a category, reverse searches are ripe for abuse both because our movements, curiosity, reading, and viewing are central to our autonomy and because the process through which these searches are generally done is flawed … Merely being proximate to criminal activity could make a person the target of a law enforcement investigation – including an intrusive search of their private data – and bring a police officer knocking on their door.” In 2022, U.S. District Judge Mary Hannah Lauck in Virginia recognized this danger when she ruled that a geofence in Richmond violated the Fourth Amendment rights of hundreds of people in their apartments, in a senior center, people driving by, and in nearby stores and restaurants. Judge Lauck wrote that “it is difficult to overstate the breadth of this warrant” and that an “innocent individual would seemingly have no realistic method to assert his or her privacy rights tangled within the warrant. Geofence warrants thus present the marked potential to implicate a ‘right without a remedy.’” The ACLU is correct that reverse searches are obvious violations of the plain meaning of the Fourth Amendment. If courts continue to uphold this practice, however, strict limits need to be placed on the kinds of information collected, especially from the many innocent bystanders routinely caught up in geofencing and reverse searches. And any change in the breadth of a warrant should be determined by a judge, not in a secret deal with a tech company. Our digital traces can be put together to tell the stories of our lives. They reveal our financial and health status, our romantic activities, our religious beliefs and practices, and our political beliefs and activities.
Our location histories are no less personal. Data from the apps on our phones record where we go and with whom we meet. Taken together, our data creates a portrait of our lives that is more intimate than a diary. Incredibly, such information is, in turn, sold by data brokers to the FBI, the IRS, the Drug Enforcement Administration, the Department of Defense, the Department of Homeland Security, and other federal agencies, which can access it freely. The Constitution’s Fourth Amendment forbids such unreasonable searches and seizures. Yet federal agencies maintain they have the right to collect and examine our personal information – without warrants. A recent report from the Office of the Director of National Intelligence shows that:
The American people are alarmed. Eighty percent of Americans in a recent YouGov poll say Congress should require government agencies to obtain a warrant before purchasing location information, internet records, and other sensitive data about people in the U.S. from data brokers. The Fourth Amendment Is Not For Sale Act, now up for a vote in the House, would prohibit law enforcement and intelligence agencies from purchasing certain sensitive information from third-party sellers, including geolocation information, communications-related information that is protected under the Electronic Communications Privacy Act, and information obtained from illicit data scraping. This bill balances Americans’ civil liberties with national security, giving law enforcement and intelligence agencies the ability to access this information with a warrant, court order, or subpoena. Call your U.S. House Representative and say: “Please protect my privacy by voting for the Fourth Amendment Is Not For Sale Act.” Byron Tau – journalist and author of Means of Control: How the Hidden Alliance of Tech and Government Is Creating a New American Surveillance State – discusses the details of his investigative reporting with Liza Goitein, senior director of the Brennan Center for Justice's Liberty & National Security Program, and Gene Schaerr, general counsel of the Project for Privacy and Surveillance Accountability.
Byron explains what he has learned about the shadowy world of government surveillance, including how federal agencies purchase Americans’ most personal and sensitive information from shadowy data brokers. He then asks Liza and Gene about reform proposals now before Congress in the FISA Section 702 debate, and how they would rein in these practices. Our general counsel, Gene Schaerr, explains in the Washington Examiner how the Biden administration's recent executive order to protect personal data from government abuse falls short. Hint: It excludes our very own government's abuse of our personal data.
How to Tell if You are Being Tracked

Car companies are collecting massive amounts of data about your driving – how fast you accelerate, how hard you brake, and any time you speed. This data is then parsed by LexisNexis or another data broker and sold to insurance companies. As a result, many drivers with clean records are surprised by sudden, large increases in their car insurance premiums.
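The kind of reduction a broker performs on raw telemetry is simple to picture: turn a stream of speed samples into discrete “events” an insurer can price. The sketch below is purely illustrative – the threshold, the trip data, and the idea that this is how LexisNexis scores drivers are all assumptions, not anything disclosed by the companies.

```python
# Hypothetical threshold: losing 8 mph or more per second counts as a
# "hard brake" event. Real scoring formulas are proprietary and unknown.
HARD_BRAKE_MPH_PER_S = 8.0

def hard_brake_events(samples):
    """samples: list of (seconds, speed_mph) pairs, in time order.
    Returns the timestamps at which a hard-braking event ended."""
    events = []
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        decel = (v0 - v1) / (t1 - t0)  # mph lost per second
        if decel >= HARD_BRAKE_MPH_PER_S:
            events.append(t1)
    return events

# A made-up six-second trip: steady driving, then two sharp slowdowns.
trip = [(0, 35), (1, 36), (2, 37), (3, 25), (4, 24), (5, 10)]
print(hard_brake_events(trip))  # [3, 5]: drops of 12 and 14 mph in one second
```

Two flagged events from six seconds of ordinary-looking data shows how quickly a telemetry feed becomes a behavioral record – which is exactly the kind of derived profile that ends up in a broker’s file.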
Kashmir Hill of The New York Times reports the case of a Seattle man whose insurance rates skyrocketed, only to discover that this was the result of LexisNexis compiling hundreds of pages on his driving habits. This is yet another feature of the dark side of the internet of things, the always-on, connected world we live in. For drivers, internet-enabled services like navigation, roadside assistance, and car apps are also 24-7 spies on our driving habits. We consent to this, Hill reports, “in fine print and murky privacy policies that few read.” One researcher at Mozilla told Hill that it is “impossible for consumers to try and understand” policies chock-full of legalese. The good news is that technology can make data gathering on our driving habits as transparent to us as we are to car and insurance companies. Hill advises:
What you cannot do, however, is file a report with the FBI, IRS, the Department of Homeland Security, or the Pentagon to see if government agencies are also purchasing your private driving data. Given that these federal agencies purchase nearly every electron of our personal data, scraped from apps and sold by data brokers, they may well have at their fingertips the ability to know what kind of driver you are. Unlike the private snoops, these federal agencies are also collecting your location histories, where you go, and by inference, who you meet for personal, religious, political, or other reasons. All this information about us can be accessed and reviewed at will by our government, no warrant needed. That is all the more reason to support the inclusion of the principles of the Fourth Amendment Is Not for Sale Act in the reauthorization of the FISA Section 702 surveillance policy.