How Airlines Sell Our Travel Itineraries to the Government

We previously wrote about the Airlines Reporting Corporation (ARC), which began as a humble transaction clearinghouse in the analog days of the 1980s but has since become a full-fledged data broker. Among ARC’s best customers is the U.S. government, whose appetite for its citizens’ personal data is matched only by its desire to avoid acquiring that data constitutionally. More specifically, government agencies use third-party data brokers like ARC to dodge obtaining search warrants based on probable cause – in stark defiance of the Fourth Amendment. New reporting from Joseph Cox at 404 Media sheds more light on the scale of ARC’s partnership with the federal government. FOIA requests paint a picture of near-total reach when it comes to tracking where and when we fly.
Cox’s ongoing coverage of this subject also reveals that the sale of traveler data isn’t a one-off or even occasional transaction. On a daily basis, ARC supplies passenger information to power TIP, the Traveler Intelligence Program. Despite the name, passengers’ IQs are probably the only piece of data not being sold. We now know that buyers of that data include Customs and Border Protection. 404 Media also found that other customers include ATF, the SEC, TSA, the State Department, the U.S. Marshals Service, and the IRS. Are the skies really overflowing with so much rampant criminality that the government is justified in spying on all passengers? Should the IRS have warrantless access to your travel itinerary? “ARC's sale of data to U.S. government agencies is yet another example of why Congress needs to close the data broker loophole,” Sen. Ron Wyden (D-OR) told 404 Media. When you last bought airline tickets, do you remember giving permission to have your itineraries and credit card information sold, either to the government or anyone else? Neither do we, nor any of the other five billion passengers whose records ARC has collected and made searchable. Governments, wrote Jefferson, derive “their just powers from the consent of the governed.” Consent is inconvenient to authority, so it’s little wonder we were never asked. There’s nothing just, consensual, or constitutional about mass surveillance. For the record, the Traveler Intelligence Program was ARC’s own idea, back in 2001. And of course, they knew exactly which doors to knock on.

“There’s no federal law that is going to protect against these companies weaponizing this data.” – Prof. Alicia Jessop

We recently reported that the popularity of wearables is eroding confidence in the idea that private, candid conversations will always remain private. Now Charlie McGill and The American Prospect report that HHS Secretary Robert F. Kennedy Jr.
“wants a wearable on every American body.” They described this announcement as “curious,” given that five years ago the Secretary himself blasted wearables and other smart devices as being about “surveillance, and harvesting data.” That was then. A massive, government-funded pro-wearables ad campaign will soon promote Secretary Kennedy’s long-held view that eating right and exercising are superior to pharmaceutical remedies. He also wants HHS to popularize wearables: “You know the [sic] Ozempic is costing $1,300 a month, if you can achieve the same thing with an $80 wearable, it's a lot better for the American people.” Persuading people to take better care of themselves is certainly a commendable goal for an HHS Secretary. But the security and privacy risks inherent to wearables are also a veritable bonanza for data brokers. On the Dark Web in 2021, healthcare data records were worth $250 each, compared to $5.40 for a payment card record. Just imagine what they’ll be worth in four years’ time if the HHS plan comes to fruition. Meanwhile, companies are lining up to cash in on the wearables boom that the department is promoting. Companies that buy our data usually just want to target customers with ads and appeals. On a more sinister level, our health data derived from wearables – about as personal as information can be – will be sold by data brokers to about a dozen federal agencies, ranging from the FBI and the IRS to the Department of Homeland Security. Health data from wearables will surely become part of a single, federal database of Americans’ information. “Techno-utopianism,” observes Natalia Mehlman Petrzela, “assumes more sophisticated technology always yields a better future.” Without constructing the requisite privacy guardrails for the data new technologies generate, quantifying ourselves on such an extreme scale may invite unwanted scrutiny. Do we really want the FBI or the IRS to be able to warrantlessly access our deeply personal health issues?
The wearables revolution, and the data it generates, is just another privacy threat that should prompt Congress to enforce the Fourth Amendment by forbidding the government from warrantlessly purchasing our most personal data.

“Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.” – Ian Malcolm, Jurassic Park

Just a quick update about the ever-expanding toolkit of the technocratic mass surveillance state: The new kid on the block is GeoSpy, which can examine a photograph and extrapolate your location in seconds. It claims to accomplish this by using only visual data in the image rather than metadata. From a purely technical perspective, that’s a big achievement. From a privacy standpoint, it’s a nightmare. According to an account first reported by 404 Media and summarized by Alex Hively of SlashGear, the original open source version of GeoSpy was quickly removed when it became clear that it could be used to stalk people. Company founder Daniel Heinen later admonished Joe Rogan and guests in a tweet reminding them that GeoSpy is “only for Law Enforcement and Government” use (which, at the time of the tweet, had recently become true). That GeoSpy is now “only for” law enforcement and government use is cold comfort. It seems an all-too-familiar narrative, reminding us of Clearview AI’s similarly reckless approach to the ethics of identification technology. By the end of 2021, the facial recognition startup had scraped ten billion images from the web and social media, providing agencies with a powerful new tool to instantly identify us and aid in the quick construction of dossiers of our beliefs, activities, and relationships. And now, thanks to breakneck developments in technology, the government can both identify us and locate us. Consider this statement from GeoSpy founder Heinen: “My job as a leader in my space is to build the best technology that customers are asking for.
It's not my job to play the ethics game because our elected officials will eventually figure that out. I have full faith in the American people to decide who to elect and what to vote on.” (If this were a video, here is where we’d cut away to a dark screen and the sound of crickets.) We won’t belabor the point, as our readers know full well where all of this is likely to lead. But we will quote ourselves from a related article decrying the surveillance capabilities of drones and satellites: “What is cutting-edge technology today will be standard tomorrow. This is just one more way in which the velocity of technology is outpacing our ability to adjust.” With the rise of GeoSpy, we now have one more reason for Congress and the states to hit pause and reassert the privacy guarantees inherent in the Fourth Amendment. One last thing: Don’t assume you’re safe just because GeoSpy found a picture that you took indoors. It appears they’ve cracked that nut too, having discovered that their visual model can learn “regional architectural cues.” Silly us, we thought all apartment kitchens looked the same. As Malwarebytes advises, “It’s just become even more important to be conscious about the pictures we post online.”

In Washington, D.C., they call it the “data broker loophole.” This is the legal maneuver by which a dozen federal agencies, ranging from the IRS to the FBI, Department of Homeland Security, and the Pentagon, purchase records of Americans’ personal digital activity from third-party data brokers. What is this loophole? With a straight face, the government claims that while the Fourth Amendment forbids “unreasonable searches and seizures” of our personal effects, nowhere does the Constitution forbid the government from opening its wallet and simply buying our data. And to be fair, we all routinely click the “agree” boxes that allow these transfers when scanning social media platforms’ long and hard-to-read terms of service. This is still disingenuous at best.
The digital trails we leave online – our communications, the identities of our friends and associations, our personal financial, romantic, and health secrets, not to mention our search histories – reveal information that can be more intimate than a diary. Americans are noticing this violation of their privacy. A recent Ipsos poll finds that roughly 90 percent of Americans say it is not acceptable for private data brokers to sell our personal data to the government. Congress is certain soon to turn to legislation that will require the government to obtain, as the Constitution requires, a probable cause warrant before inspecting our data. In the meantime, if you want more background on the nature, extent, and abuses of the data broker loophole, here are some useful resources:

1. What Are Data Brokers, And How Do They Work? (Proton) / June 20, 2025. A detailed primer on data brokers and the risks posed to consumers, including the sale of such data to government agencies without warrants.

2. “Anyone Can Buy Data Tracking U.S. Soldiers and Spies to Nuclear Vaults and Brothels in Germany” (Wired) / Nov. 19, 2024. Despite what has to be the most clickable headline in recent history, Wired presents a deep and substantive investigative report that reveals the extent to which the sale of personal data collected by personal devices is putting Americans in uniform and national security at risk.
3. “A Continuing Pattern of Government Surveillance of U.S. Citizens” (Americans for Prosperity: James Czerniawski) / April 8, 2025. See p. 4: Eighty percent of Americans agree that the government should “obtain warrants before purchasing location information, internet records, and other sensitive data about people in the United States from data brokers.” And yet federal agencies routinely buy our data, threatening our most basic constitutional rights.

4. “The Intelligence Community Plan to Make It Easier to Buy All Your Data” (Project for Privacy and Surveillance Accountability) / June 2, 2025. The Office of the Director of National Intelligence has instituted a plan to make sure Americans’ private data is no longer decentralized, fragmented, siloed, overpriced, and limited – literally everything you might hope your personal data would actually be.

5. Montana Becomes First State to Close the Law Enforcement Data Broker Loophole (EFF) / May 14, 2025. Montana is the first state to close the data broker loophole, preventing law enforcement from buying personal digital data – like location, communications, and biometrics – without a warrant. Under SB 282, such data can only be accessed with a warrant, user consent, or an investigative subpoena. The law goes into effect 10/1/2025.

6. Federal Government Circumventing Fourth Amendment by Buying Data From Data Brokers (Criminal Legal News) / April 15, 2025. Summarizing much earlier reporting from Reason and WSJ, the focus is on efforts to dodge Carpenter v. United States (2018). Federal agencies routinely purchase commercial cellphone data – which tracks individuals’ movements – without warrants, skirting Carpenter, which requires a warrant for such data.
7. “FISA and the Second Amendment: Gunowners Beware” (Reason) / Feb. 1, 2024. If you’re a gun owner and use Apple products, you should be deeply concerned about the ability of federal law enforcement agencies to get a lot of data on you – without ever having to get a warrant.

8. EPIC White Paper Finds Gaps in State and Federal Privacy Law Coverage of Data Brokers (EPIC) / July 29, 2025. This report argues that data brokers exploit legal loopholes in the Fair Credit Reporting Act (FCRA) and Gramm-Leach-Bliley Act (GLBA) to avoid compliance with modern privacy laws.
9. Government Purchases of Private Data (Wake Forest Law Review, 59:1) / April 2024. This paper questions the widespread assumption that the Fourth Amendment can never apply to commercial purchases, even though police officers can generally purchase an item available to the public without constitutional restriction.
10. Federal Acquisition of Commercially Available Information (POGO) / Dec. 16, 2024. The Project On Government Oversight (POGO) warns that federal agencies' unchecked use of commercially available information (CAI), including sensitive personal data purchased from brokers, circumvents Fourth Amendment protections. POGO urges the Office of Management and Budget to end warrantless surveillance practices, increase transparency, and implement strong regulations. The comment highlights risks to privacy and civil liberties, especially for marginalized communities, and documents past abuses by agencies like DHS, ICE, and the FBI.

Finally, if you are interested in solutions, start with The Fourth Amendment Is Not For Sale Act, which passed the House of Representatives last year. If enacted, this measure would require the government to obtain a warrant before buying Americans’ personal information. PPSA looks forward to this or some similar legislation being introduced in the 119th Congress.

Imagine a scenario in which nine of your friends have been saving copies of their digital interactions with you – text messages, emails, etc. Collectively, they created a company that stores that data and then sells it commercially – to government agencies, consumer research outfits, advertisers, and various businesses. Now assume those “friends” are named American Airlines, Delta, Southwest, United, Alaska Airlines, JetBlue, Lufthansa, Air France, and Air Canada, and suddenly you’re no longer in the realm of imagination. Wired, 404 Media, and The Lever report that these nine members of the air carrier industry co-own a data broker firm and have been selling customer information – passenger names, flight itineraries, and financial details – to various federal agencies.
It’s a lot of data – representing about 12 billion passenger flights a year, mostly those purchased through third-party sites like Booking and Expedia. The name of the co-owned data broker is ARC, which simply stands for Airlines Reporting Corporation. ARC has been around since 1985 and was originally conceived as a clearinghouse to settle transactions between airlines and travel agencies. But like so many legacy institutions from the ’70s and ’80s, ARC has long since morphed into a full-fledged data broker in the digital era (and post-9/11 in particular), and it prefers to conduct business far from the light of day. ARC’s sales contracts with federal customers forbid revealing the source of their data. It’s almost as if they don’t want to get caught doing something technically legal but that would be offensive to their customers. Meanwhile, the clandestine nature of these transactions seems just fine with ARC’s federal data purchasers, which include Defense and Treasury, in addition to ICE and Customs and Border Protection. The Center for Democracy & Technology summed it up this way: “As with many other types of sensitive and revealing data, the government seems intent on using data brokers to buy their way around important guardrails and limits.” In the words of Sen. Ron Wyden (D-OR), the whole arrangement is “shady.” It is understandable that federal and state law enforcement agencies need to gather data from a variety of sources about fliers in regard to specific criminal investigations. The Fourth Amendment Is Not For Sale Act, which passed the U.S. House last year, would make it clear that to track fliers, the government must obtain a warrant based on probable cause before sorting through our personal data. And what could be more personal than when and where we go?
In the meantime, if you’re worried about what ARC is doing with your data, their long, legalistic privacy policy suggests submitting a “Subject Access Request” at [email protected] to demand its erasure/deletion. (But you can’t escape the data trawl – just be sure to include the last four digits of any credit cards you’ve used to purchase air travel). If you do, we hope you have better luck than the reporters who broke the story. When contacted directly, eight of the airlines failed to reply and one said, in effect, “no comment.”

As we’ve written many times before, the commercially available information (CAI) of American citizens should not be for sale. It’s one of the few things Republicans and Democrats agree on. Unfortunately, the Office of the Director of National Intelligence (ODNI) not only wants to ensure that our personal data remains for sale, but also to see to it that the government’s intelligence community gets “the best data at the best price.” Quite the reversal from the stark warning about the purchase of CAI presented to the ODNI just two years ago. And so it was that on a quiet Tuesday in April, the ODNI fast-tracked a request for proposals for what it calls the Intelligence Community Data Consortium, or ICDC – a centralized clearinghouse where the legions of unruly CAI data vendors would be forced to get their act together, making it even easier for the government to violate our Fourth Amendment rights. We suppose calling this initiative the “Ministry of Truth” would have seemed too baroque and “One Database to Watch Them All” too obvious. So ICDC it is. The RFP is looking for a vendor to help the ODNI and the IC eliminate “problems” with our private information like:
Decentralized, fragmented, duplicative, siloed, overpriced, limited – literally everything you might hope your personal data would actually be. But no, the ODNI insists on being able to “access and interact with this commercial data in one place.” The intelligence community apparently complained and, lo, the ODNI heard its cries. And the voice of American citizens in all of this – the rightful owners of all that data? The main RFP mentions civil liberties and Americans’ privacy exactly one time, and then only in passing. Make no mistake: This change is a giant leap in the wrong direction. The Intercept quotes the Brennan Center’s Emile Ayoub and EPIC’s Calli Schroeder to make the point that the ODNI doesn’t even have a specific use in mind for this data – it just wants it, and it doesn’t want to answer to privacy statutes or constitutional protections. This dance has gone on for years, prolonged and encouraged by a lax regulatory environment and a commercial sector whose lack of scruples would make Jabba the Hutt repent and join the Jedi priesthood. Given that it now seems here to stay, we’ve decided it’s time to give the dance a name – the Constitutional Sidestep. What else should you know about the ICDC and the Constitutional Sidestep? According to The Intercept and other sources, plenty.
Speaking of AI, get ready for one more dance – the Reidentification Rumba. Because when AI gets hold of these previously fragmented pieces of data, it will be easy to re-identify personal information that was previously anonymized: location histories, identities, associations, ideologies, habits, medical history – shall we go on? And here it must be noted that AI safeguards have been rescinded. The remedy for all this is, of course, the Fourth Amendment Is Not for Sale Act, which would require a warrant before Americans’ personal information can be acquired and accessed. That law passed the House in 2024. This news ought to provide fresh momentum for that measure to become law.

When you seal and mail a letter, the fact that you’re sending something via letter is not private – the addresses, the stamp, etc. Those are all visible and meant to be seen. You mailed a sealed letter. Everybody knows it. You can’t walk into FedEx or the Post Office screaming at strangers, “Don’t you dare look at me while I’m mailing this letter!” Ah, but the contents of your sealed letter? Now that’s private. No one is entitled to know what’s inside except for you (and anyone you give permission to, like the recipient). And so it is with electronic storage services like Dropbox. The fact that you have a Dropbox account is not private, but what you store there is. And that’s a big deal, because believe it or not, it hasn’t been entirely clear whether electronic communications (including files stored in the cloud) are protected by the Fourth Amendment from unlawful search and seizure by the government. But now we know. The Fifth Circuit Court of Appeals wrote in an opinion issued just last week: “The Fourth Amendment protects the content of stored electronic communications.” If you didn’t intend for something to be public and made a reasonable effort to keep it private (such as password-protecting it in the cloud), you’re entitled to privacy.
The government doesn’t have the right to access it without a warrant and probable cause. In the case at hand, Texas officials used a disgruntled ex-employee of a contractor to spy on the contractor by searching its Dropbox files. To quote the Fifth Circuit, “This was not a good-faith act.” File (pardon the pun) all of this under “reasonable expectation of privacy.” Brought to you by the Fourth Amendment to the United States Constitution. Proudly serving Americans since 1791.

Americans value privacy in the marketplace when we vote with our dollars no less than when we go behind the curtain of a polling booth. Now imagine if every dollar in our possession came with an RFID chip, like those used for highway toll tags or employee identification, telling the government who had that dollar in their hands, how that consumer spent it, and who acquired it next. That would be the practical consequence of a policy proposal being promoted now in Washington, D.C., to enact a Central Bank Digital Currency (CBDC). Some have recently asked Congress to attach such a currency to the Bank Secrecy Act, to enable surveillance of every transaction in America. Such a measure would end all financial privacy, whether for a donation to a cause or money to a friend. “If not designed to be open, permissionless, and private – resembling cash – a government-issued CBDC is nothing more than an Orwellian surveillance tool that would be used to erode the American way of life,” said Rep. Tom Emmer (R-MN). This would happen because a CBDC is a digital currency, issued on a digital ledger under government control. It would give the government the ability to surveil Americans’ transactions and, in the words of Rep. Emmer, “choke out politically unpopular activity.” The good news is that President Trump is alert to the dangers posed by a CBDC. One of his first acts in his second term was to issue an executive order forbidding federal agencies from exploring a CBDC.
But the hunger for close surveillance of Americans’ daily business by the bureaucracy in Washington, D.C., is near constant. There is no telling what future administrations might do. Rep. Emmer reintroduced his Anti-Surveillance State Act to prevent the Fed from issuing a CBDC, either directly or indirectly through an intermediary. Rep. Emmer’s bill also would prevent the Federal Reserve Board from using any form of CBDC as a tool to implement monetary policy. The bill ensures that the Treasury Department cannot direct the Federal Reserve Bank to design, build, develop, or issue a CBDC. Prospects for this bill are good. Rep. Emmer’s bill passed the House in the previous Congress. It doesn’t hurt that Rep. Emmer is the House Majority Whip and that this bill neatly fits President Trump’s agenda. So there is plenty of reason to be hopeful Americans will be permanently protected from a surveillance currency. But well-crafted legislation alone won’t prevent the federal bureaucracy from expanding financial surveillance, as it has done on many fronts. PPSA urges civil liberties groups and Hill champions of surveillance reform, of all political stripes and both parties, to unite behind this bill.

We’re not sure which is most disconcerting: that Meta has a division named Global Threat Disruption, that their idea of said global threats includes deepfake celebrity endorsements, or that this has become their excuse to reactivate the controversial facial recognition software they shelved just three years earlier (so much for the “Delete” key). Meta has relaunched DeepFace to defend against celebrity deepfakes in South Korea, Britain, and even the European Union. “Celeb-baiting,” as it’s known, is where scammers populate their social media posts with images or AI-generated video of public figures.
Convinced that they’re real – that Whoopi Goldberg really is endorsing a revolutionary weight loss system, for example – unwitting victims fork over their data and money with just a few clicks. All of which, according to Meta, “is bad for people that use our products.” Celeb-baiting is a legitimate problem, to be sure. We’re no fans of social media scammers. What’s more, we know full well that “buyer beware” is meaningless in a world where it is increasingly difficult to spot digital fakes. But in reviving their facial recognition software, Meta may be rolling out a cannon to kill a mosquito. The potential for collateral damage inherent in this move is, in a word, staggering. Just ask the Uighurs in Xi’s China. Meta began tracking the faces of one billion users in 2015. And initially, it didn’t bother to tell people the technology was active, so users couldn’t opt out. Citing Meta’s sleight of hand, as well as its own strict privacy laws, the EU cried foul and banned DeepFace. But that was years ago … and how times have changed. The privacy-minded Europeans are now letting Meta test DeepFace to help public figures guard against their likenesses being misused. But can regular users be far behind? Meta could rebuild its billion-face database in no time. For its part, the U.K. is courting artificial intelligence like never before, declaring that it will help unleash a “decade of national renewal.” Even for a country that never met a facial recognition system it didn’t love, this feels like a bridge too far. We have written about the dangers, both real and looming, of a world in which facial recognition technology has become ubiquitous. When DeepFace was shelved in 2021, it represented an almost unheard-of reversal, in effect putting the genie (Mark Z, not Jafar) back in the bottle. That incredibly lucky bit of history is unlikely to repeat itself. Genies never go back in their bottles a second time.
As Americans become aware – and concerned – about how our most sensitive and private digital information is sold by data brokers, there are stirrings within the federal government to place at least some guardrails on the practice. In a unanimous, bipartisan vote last week by the commissioners of the Federal Trade Commission, that agency cracked down on two data brokers, Mobilewalla and Gravy Analytics/Venntel, for unlawfully tracking and selling sensitive data. The FTC declared that this data “not only compromised consumers’ personal privacy, but exposed them to potential discrimination, physical violence, and other harms …” Such practices included matching consumers’ identities with location data from health clinics, religious organizations, labor union offices, LGBTQ+-related locations, political gatherings, and military installations. Through real-time bidding exchanges, these brokers combined auction data with data from other sources to identify users at these locations by their mobile advertising IDs. Just days before, the Consumer Financial Protection Bureau proposed a rule that would prevent data brokers from collecting and selling sensitive personal information such as phone numbers and Social Security numbers, as well as personal financial information outside of relevant contexts, like a mortgage application. CFPB’s action also seeks to prevent the sale of the information of Americans in the military or involved in national security to “scammers, stalkers, and spies.” We applaud these bold bipartisan moves by the FTC and CFPB, but we must keep in mind that these are first steps. These actions will only marginally address the vast sea of personal information sold by data brokers to all sorts of organizations and governments, including our own. There is throughout our government a failure to fully appreciate just how intrusive the mass collection of personal data actually is. Consider the reaction of Republican FTC Commissioner Andrew Ferguson.
While mostly voting with the majority, Ferguson dissented on the breadth of the majority’s take on sensitive categories. Ferguson sees no distinction between the exposure of one’s digital location history and what can be learned by a private detective following a target across public spaces, a practice that is perfectly legal. Ferguson reasoned that many people are an open book about their health conditions, religion, and sexual orientation. “While some of these characteristics often entail private facts, others are not usually considered private information,” Ferguson wrote. “Attending a political protest, for example, is a public act.” We beg to differ. “A private detective could find this out” is too weak a standard to apply to the wealth of digital data on the privacies of millions of people’s lives. Data is different. As the Supreme Court explained in Riley v. California, “a cell phone search would typically expose to the government far more than the most exhaustive search of [even] a house: A phone not only contains in digital form many sensitive records previously found in the home; it also contains a broad array of private information never found in a home in any form – unless the phone is.” That was true when it was written in 2014, and it is even more true today. Nowadays, artificial intelligence can analyze data and reveal patterns that no gumshoe could put together. In the case of a political protest, a high school student might attend, say, a trans rights event but be far from ready to let his parents or peers know about it. Or an adherent of one religion may attend services of an entirely different religion with conversion in mind but be far from willing to tell relatives. Worse, deeply personal information in the hands of prosecutors completely bypasses the letter and the intent of the Fourth Amendment, which requires the government to get a probable cause warrant before using our information against us. 
The government lacks appreciation of its own role in sweeping in the sensitive data of Americans. Venntel’s customers include the Department of Homeland Security, the Drug Enforcement Administration, the FBI, and the IRS. In all, about a dozen federal law enforcement and intelligence agencies purchase such data from many brokers and hold it for warrantless inspection. The FTC deserves credit for taking this step to tighten up the use of sensitive information. But the next step must be passage of the Fourth Amendment Is Not for Sale Act, which would require the government to obtain probable cause warrants before obtaining and using our most personal information against us.

Investigative journalist Ronan Farrow delves into the Pandora’s box that is Israel’s NSO Group, a company (now on a U.S. Commerce Department blacklist) that unleashes technologies that allow regimes and cartels to transform any smartphone into a comprehensive spying device. One NSO brainchild is Pegasus, the software that reports every email, text, and search performed on smartphones, while turning their cameras and microphones into 24-hour surveillance devices. It’s enough to give Orwell’s Big Brother feelings of inadequacy. Farrow covers well-trodden stories he has long followed in The New Yorker, also reported by many U.S. and British journalists, and well explored in this blog. Farrow recounts the litany of crimes in which Pegasus and NSO are implicated. These include Saudi Arabia’s murder of Jamal Khashoggi, the murder of Mexican journalists by the cartels, and the surveillance of pro-independence politicians in Catalonia and their extended families by Spanish intelligence. In the latter case, Farrow turns to Toronto-based Citizen Lab to confirm that one Catalonian politician’s sister and parents were comprehensively surveilled. The parents were physicians, so Spanish intelligence also swept up the confidential information of their patients as well.
While the reality portrayed by Surveilled is a familiar one to readers of this blog, it drives home the horror of NSO technology as only a documentary with high production values can do. Still, this documentary could have been better. The show is marred by too many reaction shots of Farrow, who frequently mugs for the camera. It also left unasked follow-up questions of Rep. Jim Himes (D-CT), Ranking Member of the House Intelligence Committee. In his sit-down with Farrow, Himes made the case that U.S. agencies need to have copies of Pegasus and similar technologies, if only to understand the capabilities of bad actors like Russia and North Korea. Fair point. But Rep. Himes seems oblivious to the dangers of such comprehensive spyware for domestic surveillance. Rep. Himes says he is not aware of Pegasus being used domestically. Yet it was deployed by Rwandan spies to surveil the phone of U.S. resident Carine Kanimba in her meetings with the U.S. State Department. Kanimba was looking for ways to liberate her father, settled in San Antonio, who was lured onto a plane while abroad and kidnapped by Rwandan authorities. Rep. Himes says he would want the FBI to have Pegasus at its fingertips in case one of his own daughters were kidnapped. Even civil libertarians agree there should be exceptions for such “exigent” and emergency circumstances, in which even a warrant requirement should not slow down investigators. The FBI can already track cellphones and the movements of their owners. If the FBI were to deploy Pegasus, however, it would give the bureau immense additional power to video record Americans in their private moments, as well as to record audio of their conversations. Rep. Himes is unfazed. When Farrow asks how Pegasus should be used domestically, Rep. Himes replies that we should “do the hard work of assessing that law enforcement uses it consistent with our civil liberties.” He also spoke of “guardrails” that might be needed for such technology. 
Such a guardrail, however, already exists. It is called the Fourth Amendment of the Constitution, which mandates the use of probable cause warrants before the government can surveil the American people. But even with probable cause, Pegasus is too robust a spy tool to trust the FBI to use domestically. The whole NSO-Pegasus saga is just one part of a much bigger story in which privacy has been eroded. Federal agencies, ranging from the FBI to the IRS and Homeland Security, purchase the most intimate and personal digital data of Americans from third-party data brokers, and review it without warrants. Congress is even poised to renege on a deal to narrow the definition of an “electronic communications service provider,” which would obligate any office complex, fitness facility, or house of worship that offers Wi-Fi connections to secretly turn over Americans’ communications without a warrant. The sad reality is that Surveilled only touches on one of many crises in the destruction of Americans’ privacy. Perhaps HBO should consider making this a series. They would never run out of material.
Catastrophic ‘Salt Typhoon’ Hack Shows Why a Backdoor to Encryption Would be a Gift to China
11/25/2024
Former Sen. Patrick Leahy’s Prescient Warning
It is widely reported that the breach of U.S. telecom systems allowed China’s Salt Typhoon group of hackers to listen in on the conversations of senior national security officials and political figures, including Donald Trump and J.D. Vance during the recent presidential campaign. In fact, they may still be spying on senior U.S. officials. Sen. Mark Warner (D-VA), Chairman of the Senate Intelligence Committee, on Thursday said that China’s hack was “the worst telecom hack in our nation’s history – by far.” Warner, himself a former telecom executive, said that the hack across the systems of multiple internet service providers is ongoing, and that the “barn door is still wide open, or mostly open.” The only surprise, really, is that this was a surprise. When our government creates a pathway to spy on American citizens, that same pathway is sure to be exploited by foreign spies. The FBI believes the hackers entered the system that enables court-ordered taps on voice calls and texts of Americans suspected of a crime. These systems are put in place by internet service providers like AT&T, Verizon, and other telecoms to allow the government to search for evidence, a practice authorized by the 1994 Communications Assistance for Law Enforcement Act. Thus the system of domestic surveillance used by the FBI and law enforcement has been reverse-engineered by Chinese intelligence to turn that system back on our government. This point is brought home by FBI documents PPSA obtained from a Freedom of Information Act request, which reveal a prescient question put to FBI Director Christopher Wray by then-Sen. Patrick Leahy in 2018. The Vermont Democrat, now retired, anticipated the recent catastrophic breach of U.S. telecom systems. Sen. Leahy asked: “The FBI is reportedly renewing a push for legal authority to force decryption tools into smartphones and other devices. 
I am concerned this sort of ‘exceptional access’ system would introduce inherent vulnerabilities and weaken security for everyone …” The New York Times reports that according to the FBI, the Salt Typhoon hack resulted from China’s theft of passwords used by law enforcement to enact court-ordered surveillance. But Sen. Leahy correctly identified the danger of creating such domestic surveillance systems and the next possible cause of an even more catastrophic breach. He argued that a backdoor to encrypted services would provide a point of entry that could eventually be used by foreign intelligence. The value of encryption was underscored by authorities’ conclusion that China was not able to listen in on conversations over WhatsApp and Signal, which encrypt consumers’ communications. While China’s hackers could intercept text messages between iPhones and Android phones, they could not intercept messages sent between iPhones over Apple’s iMessage system, which is also encrypted. Leahy asked another prescient question: “If we require U.S. technology companies to build ‘backdoors’ into their products, then what do you expect Apple to do when the Chinese government demands that Apple help unlock the iPhone of a peaceful political or religious dissident in China?” Sen. Leahy was right: Encryption works to keep people here and abroad safe from tyrants. We should heed his warning – carving a backdoor into encrypted communications creates a doorway anyone might walk through.
The CFPB Curbs Worker Surveillance – Will the Government Live Up to Its Own Privacy Standards?
10/31/2024
The Consumer Financial Protection Bureau (CFPB) is warning businesses that use of “black-box AI” or algorithmic scores about workers must be consistent with the rules of the Fair Credit Reporting Act. This means employers must obtain workers’ consent, provide transparency when data is used for an adverse decision, and make sure that workers have a chance to dispute inaccurate reports. That’s a good move for privacy, as far as it goes. The problem is, it doesn’t go nearly far enough because the federal government doesn’t impose these same standards on itself. First, PPSA agrees with the tightening of employers’ use of digital dossiers and AI monitoring. Whenever someone applies for a job, the prospective employer will usually perform a search about them on a common background-check site. It is not surprising that businesses want to know about applicants’ credit histories, to check on their reliability and conscientiousness, and to learn if they have a possible criminal past. But third-party consumer reports offer much more than those obvious background checks. Some sites, for example, are used to predict the likelihood that you might favor union membership. More invasive still are apps that many employers are requiring new employees to install on personal phones to monitor their conduct and assess their performance. Decisions to reassign employees, promote or demote them, or fire them are increasingly coming from automated systems – machines that often lack context or key information. Federal agencies, from the CFPB to the Federal Trade Commission, have not been shy about calling out privacy violations like these by businesses for years now. Too bad our government cannot live up to its own high standards. The government freely acknowledges that a dozen agencies – ranging from the FBI to the IRS, Department of Homeland Security, and the Pentagon – routinely buy the most intimate and personal data of Americans scraped from our apps and sold by shadowy data brokers. 
The data the government collects on us is far more extensive than anything a commercial data aggregator could find. The government can track our web browsing, those we communicate with, what we search for online, and our geolocation histories. This is far more invasive and intrusive than anything private businesses are doing in screening applicants and monitoring employees. Worse, the government observes no obligation to reveal how this data might be used to compile evidence against a criminal defendant in a courtroom, or if agencies are using purchased data to create dossiers on Americans to predict their future behavior. There is no equivalent of the Fair Credit Reporting Act when it comes to the government’s use of our data. But there is the Fourth Amendment Is Not For Sale Act, a bill that would require the government to obtain a probable cause warrant – as required by the Constitution – before inspecting our digital lives. The Fourth Amendment Is Not For Sale Act passed the House this year and awaits action in the U.S. Senate. Passing it in the coming lame-duck session would be one way to remove the hypocrisy of the federal government on the digital surveillance of American workers, consumers, and citizens. Doxing – the practice of exposing a person’s location and home address – can have deadly consequences. This lesson was brought home in July 2020 when a deranged man with a grudge against federal judge Esther Salas went to her New Jersey home dressed as a deliveryman, carrying a gun. The judge’s 20-year-old son, Daniel Anderl, a Catholic University student, opened the door only to be shot dead as he moved forward to shield his parents. Out of this tragedy came Daniel’s Law, a New Jersey statute advocated by Judge Salas to allow law enforcement, government personnel, judges and their families to have their information completely removed from commercial data brokers. 
We’re accustomed to the idea that ad-selling social media platforms and government can track us. Now Krebs on Security is reporting that a new digital service neuters this law and exposes potentially any American to location tracking by any subscriber. This tracking service is enabled by Babel Street, which has a core product that Krebs writes “allows customers to draw a digital polygon around nearly any location on a map of the world, and view a . . . time-lapse history of the mobile devices coming in and out of the specified area.” Krebs reports that a private investigator demonstrated the danger of this technology by discreetly using it to determine the home address and daily movements of mobile devices belonging to multiple New Jersey police officers whose families have already faced significant harassment and death threats. This is just one more sign that in-depth surveillance that was once the province of giant social media companies and state actors is falling into the hands of garden-variety stalkers, snoops, and criminals. PPSA calls on New Jersey legislators, who are ideally positioned to lead a national response to this technology, to develop laws and policy solutions that continue to protect law enforcement, judges, and everyday citizens in their daily rounds and in their homes.
Police Chief: “A Nice Curtain of Technology”
We’ve long followed the threat to privacy from the proliferation of automated license plate readers (ALPRs). Now the Institute for Justice has filed a lawsuit against the Norfolk, Virginia, police department for its use of this Orwellian technology. More than 5,000 communities across the country have installed the most popular ALPR brand, Flock, which records and keeps the daily movements of American citizens driving in their cars. Norfolk is an enthusiastic adopter of Flock technology, with a network of 172 advanced cameras that make it impossible for citizens to go anywhere in their city without being followed and recorded. 
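The “digital polygon” query Krebs describes – and the location searches an ALPR network enables – reduce to the same simple primitive: filter a pile of timestamped pings by a time window and a point-in-polygon test. Here is a minimal sketch of that primitive; the device IDs, coordinates, and record layout are invented for illustration, and real broker systems run queries like this over billions of records:

```python
# Illustrative sketch: how a "draw a polygon, see every device" query works.
# All device IDs, coordinates, and timestamps below are invented.

def point_in_polygon(lon, lat, polygon):
    """Ray-casting (PNPOLY) test: is (lon, lat) inside the polygon?

    polygon is a list of (lon, lat) vertices.
    """
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > lat) != (yj > lat) and \
           lon < (xj - xi) * (lat - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def devices_in_area(pings, polygon, start, end):
    """Return IDs of devices seen inside the polygon during [start, end].

    pings: iterable of (device_id, timestamp, lon, lat) records.
    """
    return sorted({dev for dev, t, lon, lat in pings
                   if start <= t <= end and point_in_polygon(lon, lat, polygon)})

# A hypothetical fenced-off city block and three location pings:
block = [(0.0, 0.0), (0.0, 10.0), (10.0, 10.0), (10.0, 0.0)]
pings = [("device-a", 5, 5.0, 5.0),   # inside the block, inside the window
         ("device-b", 5, 15.0, 5.0),  # outside the block
         ("device-a", 20, 5.0, 5.0)]  # inside the block, after the window
print(devices_in_area(pings, block, start=0, end=10))  # → ['device-a']
```

The point is how little machinery this takes: once the location data exists in someone’s hands, the query itself is a few lines of code.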
Flock applies artificial intelligence software to its national database of billions of images, adding advanced search and intelligence functions. “This sort of tracking that would have taken days of effort, multiple officers, and significant resources just a decade ago now takes just a few mouse clicks,” the Institute for Justice tells a federal court in its lawsuit. “City officers can output a list of locations a car has been seen, create lists of cars that visited specific locations, and even track cars that are often seen together.” No wonder the Norfolk police chief calls Flock’s network “a nice curtain of technology.” The Institute for Justice has a different characterization, calling this network “172 unblinking eyes.” Americans are used to the idea of being occasionally spotted by a friend or neighbor while on the road, but no one expects to have every mile of one’s daily movements imaged and recorded. The nefarious nature of this technology is revealed in the concerns of the two Norfolk-area plaintiffs named in the lawsuit.
“If the Flock cameras record Lee going straight through the intersection outside his neighborhood, for example, the NPD (Norfolk Police Department) can infer that he is going to his daughter’s school. If the cameras capture him turning right, the NPD can infer that he is going to the shooting range. If the cameras capture him turning left, the NPD can infer that he is going to the grocery store […] “Lee finds all of this deeply intrusive. Even if ordinary people see him out and about from time to time, Lee does not expect and does not want people – much less government officials – tracking his every movement over 30 days or more and analyzing that data the way the Flock cameras allow the NPD and other Flock users to do.”
“As a healthcare worker, Crystal is legally and ethically required to protect her clients’ privacy,” the filing states. “She also understands that her clients expect her to maintain their confidentiality … If she failed to live up to those expectations, her business would suffer.” Both plaintiffs are concerned another Flock user, perhaps a commercial entity, might misuse the records of their movements. They are also worried about “the potential that Defendants, Flock users, or third-party hackers could misuse her information.” No warrants or permissions are needed for Norfolk officers to freely access the system. The Institute for Justice was shrewd in its selection of venues. Norfolk is in the jurisdiction of the federal Fourth Circuit Court of Appeals, which in 2021 struck down the use of aerial surveillance images over Baltimore in a case called Beautiful Struggle v. Baltimore Police Department. “The Beautiful Struggle opinion was about a relatively, comparatively, crude system, just a drone that was flying in the air for 12 hours a day that at most had a couple of pixels that made it hard to identify anyone,” Institute for Justice attorney Robert Frommer told 404 Media. “By contrast, anyone with the Flock cameras has a crystal-clear record of your car, a digital fingerprint that can track anywhere you go. The police chief even said you can’t really go anywhere in Norfolk without being caught by one of these cameras.” The consistent principle from the Fourth Circuit’s precedent should make it clear, in the words of the Institute for Justice, that tracking a driver “to church, to a doctor’s office, to a drug-abuse treatment clinic, to a political protest,” is unconstitutional.
Government Promises to Protect Personal Data While Collecting and Using Americans’ Personal Data
10/21/2024
Digital data, especially when parsed through the analytical lens of AI, can detail almost every element of our personal lives, from our relationships to our location histories, to data about our health, financial stability, religious practices, and political beliefs and activities.
A new blog post from the White House details a Request for Information (RFI) from OMB’s Office of Information and Regulatory Affairs (OIRA) seeking to get its arms around this practice. The RFI seeks public input on “Federal agency collection, processing, maintenance, use, sharing, dissemination, and disposition of commercially available information (CAI) containing personally identifiable information (PII).” In plain language, the government is seeking to understand how agencies – from the FBI to the IRS, the Department of Homeland Security, and the Pentagon – collect and use our personal information scraped from our apps and sold by data brokers to agencies. This request for public input follows last year’s Executive Order 14110, which represented that “the Federal Government will ensure that the collection, use, and retention of data is lawful, is secure, and mitigates privacy and confidentiality risks.” What to make of this? On the one hand, we commend the White House and intelligence agencies for being proactive for once on understanding the privacy risks of the mass purchase of Americans’ data. On the other hand, we can’t shake out of our heads Ronald Reagan’s joke about the most terrifying words in the English language: “I’m from the government and I’m here to help.” The blog, written by OIRA administrator Richard L. Revesz, points out that procuring “CAI containing PII from third parties, such as data brokers, for use with AI and for other purposes, raises privacy concerns stemming from a lack of transparency with respect to the collection and processing of high volumes of potentially sensitive information.” Revesz is correct that AI elevates the privacy risks of data purchases. 
Revesz suggests the government might take “additional steps to apply the framework of privacy law and policy to mitigate the risks exacerbated by new technology.” Until we have clear rules that expressly lay out how CAI is acquired and managed within the executive branch, you’ll forgive us for withholding our applause. This year’s “Policy Framework for Commercially Available Information,” released by Director of National Intelligence Avril Haines, ordered all 18 intelligence agencies to devise safeguards “tailored to the sensitivity of the information” and produce an annual report on how each agency uses such data. It is hard to say if Haines’ directive represents a new awareness of the Orwellian potential of these technologies, or if it is political theater to head off legislative efforts at reform. Earlier this year, the U.S. House of Representatives passed the Fourth Amendment Is Not For Sale Act, which would subject purchased data to the same standard as any other personal information – a probable cause warrant. The Senate should do the same. The government’s recognition of the sensitivity of CAI and accompanying PII is certainly a step in the right direction. But it is also clear that intelligence agencies have every intention of continuing to utilize this information for their own purposes, despite lofty proclamations and vague policy goals about Americans’ privacy. To quote Ronald Reagan again, when it comes to the promises of the intel community, we should “trust but verify.” A Federal Trade Commission staff report released last week got huge play in the media. We were bombarded by stories about the FTC’s report that Meta, YouTube, and other major social media and video streaming companies are lax in controlling and protecting the data privacy of users, especially children and teens.
There is much in this report to consider, especially where children are concerned. But there was also a lot that was off-target and missing. The FTC’s report blithely recommended that social media and video streaming companies abandon their practice of tracking users’ data. This would be no small thing. Without the tracking that allows Facebook to know that you’re an aficionado of, say, old movie posters, you would not receive ads in your feed trying to sell you just that – old movie posters. Forbid the trade-off in which we give away a bit of our privacy for a free service, and overnight large social media companies would collapse. Countless small businesses would lose the ability to go toe-to-toe with big brands. Trillions of dollars in equity would evaporate, degrading the portfolios of retirees and putting millions of Americans out of work. In a crisply written concurring and dissenting statement, FTC Commissioner Andrew Ferguson notes that the FTC report “reveals this mass data collection has been very difficult to avoid. Many of these products are necessities of modern life. They are critical access points to markets, social engagement, and civil society.” Ferguson looks beyond what the advertising algorithms of Meta or Google do with our data. He looks to how our data is combined with information from a host of sources, including our location histories from our smartphones, to enable surveillance. It is this combination of data, increasingly woven by AI, that creates such comprehensive portraits of our activities, beliefs, and interests. These digital dossiers can then be put up for sale by a third-party data broker to any willing buyer. Ferguson writes: “Sometimes this information remains internal to the company that collected it. But often, they share the information with affiliates or other third parties, including entities in foreign countries like China, over which the collecting company exercises no control. 
This information is often retained indefinitely, and American users generally have no legal right to demand that their personal information be deleted. Companies often aggregate and anonymize collected data, but the information can often be reassembled to identify the user with trivial effort. “This massive collection, repackaging, sharing, and retention of our private and intimate details puts Americans at great risk. Bad actors can buy or steal the data and use them to target Americans for all sorts of crimes and scams. Others, including foreign governments who routinely purchase Americans’ information, can use it to damage the reputations of Americans by releasing, or threatening to release, their most private details, like their browsing histories, sexual interests, private political views, and so forth.” We would add that the FBI, IRS, and a host of other federal law enforcement and intelligence agencies also purchase our “dossiers” and access them without warrants. As dangerous as China is, it cannot send a SWAT team to break down our doors at dawn. Only our government can do that. The FTC report ignores this concern, focusing on the commercial abuses of digital surveillance while ignoring its usefulness to an American surveillance state. It is no small irony that a federal government report on digital surveillance doesn’t concern itself with how that surveillance is routinely abused by government. This insight gives us all the more reason to urge the U.S. Senate to follow the example of the House and pass the Fourth Amendment Is Not For Sale Act. This legislation requires the FBI and other federal agencies to obtain a warrant before they can purchase Americans’ personal data, including internet records and location histories. It is also time for Congress to shine a bright light on data brokers to identify all the customers – commercial, foreign, and federal – who are watching our digital lives. 
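Ferguson’s observation that “anonymized” data can be reassembled “with trivial effort” is worth making concrete. The sketch below uses entirely invented records, but the technique – linking datasets on shared “quasi-identifiers” such as ZIP code, birth date, and sex – is well documented in privacy research:

```python
# Toy illustration of re-identification by linkage. All records below are
# invented. A few quasi-identifiers (ZIP code, birth date, sex) are often
# enough to tie an "anonymized" record back to a named person.

anonymized_health = [
    {"zip": "02139", "birthdate": "1945-07-31", "sex": "F",
     "diagnosis": "hypertension"},
]

public_voter_roll = [
    {"name": "J. Doe",   "zip": "02139", "birthdate": "1945-07-31", "sex": "F"},
    {"name": "A. Smith", "zip": "02139", "birthdate": "1960-01-02", "sex": "M"},
]

def relink(anon_rows, identified_rows, keys=("zip", "birthdate", "sex")):
    """Join two datasets on shared quasi-identifiers."""
    return [{**known, **anon}
            for anon in anon_rows
            for known in identified_rows
            if all(anon[k] == known[k] for k in keys)]

for row in relink(anonymized_health, public_voter_roll):
    print(row["name"], "->", row["diagnosis"])  # the "anonymous" record now has a name
```

When a combination of quasi-identifiers is unique, one join is all it takes – and re-identification studies have repeatedly found that a handful of such attributes uniquely identifies most people.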
In George Orwell’s Nineteen Eighty-Four, the walls of every domicile in Oceania bristle with microphones and cameras that catch the residents’ every utterance and action. In 2024, we have done Big Brother’s work for him. We have helpfully installed microphones and cameras around the interior of our homes embedded in our computers, laptops, smartphones, and tablets. Might someone be selling our conversations to companies and the federal government without our consent?
Few worry about this because of explicit promises by tech companies not to enable their microphones to be used against us. Google, Amazon, and Meta are firm in denying that they eavesdrop on us. For example, Meta states that “sometimes ads can be so specific, it seems like we must be listening to your conversations through our microphones, but we’re not.” Still, many of us have had the spooky sensation of talking about something random but specific – perhaps a desire to buy a leather couch or take a trip to Cancun – only to find our social media feeds littered with ads for couches and resorts in Cancun. The tech companies’ explanation for this is that we sometimes perform online searches for things, forget about them, and then mistakenly attribute the ads in our social media feeds to a conversation. We hope that’s the case. But now we’re not so sure. 404 Media has acquired a slide deck from Cox Media Group (CMG) that claims its “Active-Listening” software can combine AI with our private utterances captured by 470-plus sources to “improve campaign deployment, targeting and performance.” One CMG slide says, “processing voice data with behavioral data identifies an audience who is ‘ready to buy.’” CMG claims to have Meta’s Facebook, Google, and Amazon as clients. After this story broke, the big tech companies stoutly denied that they engage in this practice and expressed their willingness to act against any marketing partner that eavesdrops. This leaves open the possibility that CMG and other actors are gathering voice data from microphones other than those of their big tech clients. What these marketers want to do is predict what we will want and send us an ad at the precise time we’re thinking about a given product. The danger is that this same technology in the hands of government could be used to police people at home. This may sound outlandish. 
Yet consider that a half-dozen federal agencies – ranging from the FBI to the IRS – already routinely purchase our geolocation, internet activity, and other sensitive information we generate on our social media platforms – and then access it freely, without a warrant. Considering what our government already does with our digital data, the addition of our home speech would be an extension of what is already a radical new form of surveillance. Congress should find out exactly what marketers like CMG are up to. As an urgent matter of oversight, Congress should also determine if any federal agencies are purchasing home voice data. And while they’re at it, the Senate should follow the example of the House and pass the Fourth Amendment Is Not For Sale Act, which would stop the practice of the warrantless purchasing of Americans’ personal, digital information by law enforcement and intelligence agencies. The U.S. Department of Justice is pioneering ever-more dismissive gestures in its quest to fob off lawful Freedom of Information Act (FOIA) requests seeking to shed light on government surveillance. One PPSA FOIA request, aimed at uncovering details about the DOJ’s purchase of Americans’ commercially available data from third-party data brokers, drew a response that sets a new record for unprofessionalism.
Until now, we had become used to the Catch-22 denials in which the government refuses to even conduct a search for responsive records, issuing a Glomar response instead. This judge-made doctrine allows the withholding of requested information if it is deemed so sensitive that the government can neither confirm nor deny its existence. But when the government issues a Glomar response without first conducting a search, we can only ask: How could they know that if they haven’t even searched for the records? The DOJ’s latest response, which arrived this week, is a personal best. It shows that the DOJ didn’t even bother to read our FOIA request. Our request sought records detailing the DOJ’s acquisition of data on U.S. persons and businesses, including the amounts spent, the sources of the data, and the categories of information obtained. This request was clearly articulated and included a list of DOJ components likely to have the relevant records. Despite this clarity, DOJ responded by stating that the request did not sufficiently identify the records. DOJ’s refusal to conduct a proper search appears to be based on a misinterpretation, either genuine or strategic, of our request. DOJ claimed an inability to identify the component responsible for handling a case based solely on the “name” of the case or organization. However, PPSA’s request did not rely on any such identifiers. Instead, DOJ’s response indicates that it may have resorted to a generic form letter to reject our request without actually reviewing its contents. Precedents like Miller v. Casey and Nation Magazine v. U.S. Customs Service establish that an agency must read requests “as drafted” and interpret them in a way that maximizes the likelihood of uncovering relevant documents. DOJ’s blanket dismissal is not just a bureaucratic oversight. It is an affront to the principles of openness and accountability that FOIA is designed to uphold. 
If the DOJ, the agency responsible for upholding the law, continues to disregard its legal obligations, it sets a dangerous precedent for all government agencies. The good news is that DOJ’s Office of Information Policy has now ordered staff to conduct a proper search in response to PPSA’s appeal, a directive that should have been unnecessary. It remains to be seen whether the DOJ will comply meaningfully or continue to obstruct … perhaps with another cookie-cutter Glomar response. How far might DOJ go to withhold basic information about its purchasing of Americans’ sensitive and personal information? In a Glomar response to one of our FOIA requests in 2023, DOJ came back with 40 redacted pages from a certain Mr. or Mrs. Blank. They gave us nothing but a sea of black on each page. The only unredacted line in the entire set of documents was: “Hope that’s helpful.” This latest response is just another sign that those on the other end of our FOIA requests are treating their responsibilities with flippancy. This is unfortunate because the American public deserves to know the extent to which our government is purchasing and warrantlessly accessing our most private information. Filing these requests and responding to non-responsive replies administratively and in court is laborious and at times frustrating work. But somebody has to do it – and PPSA will continue to hold the government accountable. When we’re inside our car, we feel like we’re in our sanctuary. Only the shower is more private. Both are perfectly acceptable places to sing the Bee Gees’ “Stayin’ Alive” without fear of retribution.
And yet the inside of your car is not as private as you might think. We’ve reported on the host of surveillance technologies built into the modern car – from tracking your movement and current location, to proposed microphones and cameras to prevent drunk driving, to seats that report your weight. All this data is transmitted and can be legally sold by data brokers to commercial interests as well as a host of government agencies. This data can also be misused by individuals, as when a woman going through divorce proceedings learned that her ex was stalking her by following the movements of her Mercedes. Now another way to track our behavior and movements is being added through a national plan announced by the U.S. Department of Transportation called “vehicle-to-everything” technology, or V2X. Kimberly Adams of marketplace.org reports that this technology, to be deployed on 50 percent of the National Highway System and 40 percent of the country’s intersections by 2031, will allow cars and trucks to “talk” to each other, coordinating to reduce the risk of collision. V2X will smooth out traffic in other ways, holding traffic lights green for emergency vehicles and sending out automatic alerts about icy roads. V2X is also yet one more way to collect a big bucket of data about Americans that can be purchased and warrantlessly accessed by federal intelligence and law enforcement agencies. Sens. Ron Wyden (D-OR) and Cynthia Lummis (R-WY), and Rep. Ro Khanna (D-CA), have addressed what government can do with car data under proposed legislation, the “Closing the Warrantless Digital Car Search Loophole Act.” This bill would require law enforcement to obtain a warrant based on probable cause before searching data from any vehicle that does not require a commercial license. But the threat to privacy from V2X comes not just from cars that talk to each other, but also from V2X’s highway infrastructure that enables this digital conversation. 
This addition to the rapid expansion of data collection on Americans is one more reason why the Senate should follow the example of the House and pass the Fourth Amendment Is Not For Sale Act, which would end the warrantless collection of Americans’ purchased data by the government. We can embrace technologies like V2X that can save lives, while at the same time making sure that the personal information they collect about us is not retained and allowed to be purchased by snoops, whether government agents or stalkers.

What NPD’s Enormous Hack Tells Us About the Reckless Collection of Our Data by Federal Agencies (8/23/2024)
How to See if Your Social Security Number Was Stolen

Were your Social Security number and other personal identifying information among the 2.9 billion records that hackers stole from National Public Data?
Hackers can seize our Social Security numbers and much more, not only from large commercial sites like National Public Data, but also from government sites and the data brokers who sell our personal information to federal agencies. Such correlated data can be used to impersonate you with the financial services industry, from credit card providers to bank loan officers. And once your Social Security number is stolen, it is stolen for life. To find out if your Social Security number and other personal information were among those taken in the National Public Data hack, go to npd.pentester.com. It has been obvious for more than a decade now that the Social Security number is a flawed approach to identification. It is a simple nine-digit number. A fraudster who knows the last few digits of your Social Security number, what year you were born, and where, can likely calculate your number. Because your Social Security number is so often used by dozens of institutions, it is bound to be hacked and sold on the dark web at some point in your life. Yet this insecure form of identification remains in near-universal use. Is there a better way? Sophie Bushwick asked this question in a 2021 Scientific American article. She reported that one proposed solution is a cryptographic key, those long strings of numbers and symbols that we all hate to use. Or a USB device could be plugged into your computer to authenticate you as its owner. Scans of your fingerprints, or face, could also authenticate your identity. The problem is that any one of these methods can also be hacked. Even biometrics is vulnerable, since this technology reduces your face to an algorithm. Once the algorithm for your face or fingerprint (or even worse, your iris, which is the most complex and unique biometric identifier of them all) is stolen, your own body can be used against you. There are no perfect solutions, but multifactor identification comes the closest.
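The one-time passcodes used in multifactor schemes are commonly generated with the time-based one-time password (TOTP) algorithm standardized in RFC 6238, which derives a short, expiring code from a shared secret and the current time. A minimal sketch using only Python’s standard library (the function name and defaults here are illustrative, not from any particular vendor):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, interval=30, digits=6):
    """Time-based one-time passcode per RFC 6238 (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of whole time steps since the epoch.
    now = time.time() if for_time is None else for_time
    counter = int(now) // interval
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset taken
    # from the low nibble of the last digest byte.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code changes every 30 seconds and is computed from a secret that never travels over the network, a stolen passcode is useless moments later – which is exactly why pairing it with a password and a biometric raises the bar so high for attackers.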
Multifactor identification might combine a one-time passcode texted to your phone, a biometric identifier like a fingerprint, and a complex password. Finding and assembling all these elements, while possible, would be a prohibitively difficult chore for many if not most hackers. Strengthening consumer identification, however, is only one part of the problem. Our personal information is insecure in other ways. A dozen federal agencies, including the FBI, IRS, Department of Homeland Security, and Department of Defense, routinely purchase Americans’ personal data. These purchases include not just our identifying information, but also our communications, social media posts, and our daily movements – scraped from our apps and sold by data brokers. How secure is all the data held by those third-party brokers? How secure is the government’s database of this vast trove of personal data, which contains the most intimate details of our lives? These are urgent questions for Congress to ask. Congress should also resist the persistent requests from the Department of Justice to compel backdoors for commercial encryption, beginning with Apple’s iPhone. The National Public Data hack reveals that the forced creation of backdoors for encryption would create new pathways for even more hacks, as well as warrantless government snooping. Finally, the Senate should follow up on the House passage of the Fourth Amendment Is Not For Sale Act, which would prohibit government collection of our personal information without a warrant. Protect your data by calling or emailing your senators: tell them to pass the Fourth Amendment Is Not For Sale Act. Our data will only become more secure if we, as consumers and citizens, demand it.

As the 2024 elections loom, legislative progress in Congress will likely come to a crawl before the end of meteorological summer.
But some unfinished business deserves our attention, even if it should get pushed out to a lame duck session in late fall or to the agenda of the next Congress.
One is a bipartisan proposal now under review that would forbid federal government agencies from strong-arming technology companies into providing encryption keys to break open the private communications of their customers. “Efforts to give the government back-door access around encryption is no different than the government pressuring every locksmith and lock maker to give it an extra key to every home and apartment,” said Erik Jaffe, President of PPSA. Protecting encryption is one of the most important pro-privacy measures Congress could take up now. Millions of consumers have enjoyed end-to-end encryption, from Apple iPhone data to communications apps like Telegram, Signal, and WhatsApp. This makes their communications relatively invulnerable to being opened by an unauthorized person. The Department of Justice has long demanded that companies, Apple especially, provide the government with an encryption key to catch wrongdoers and terrorists. The reality is that encryption protects people from harm. Any encryption backdoor is bound to get out into the wild. Encryption protects the abused spouse from the abuser. It protects children from malicious misuse of their messages. Abroad, it protects dissidents from tyrants and journalists from murderous cartels. At home, it even protects the communications of law enforcement from criminals. The case for encryption is so strong that the European Court of Human Rights rejected a Russian law that would have broken encryption because it would violate the human right to privacy. (Let us hope this ruling puts the brakes on recent measures in the UK and the EU to adopt similarly intrusive measures.) Yet the federal government continues to demand that private companies provide a key to their encryption.
The State of Nevada’s attorney general went to court to try to force Meta to stop offering encrypted messages on Facebook Messenger on the theory that it will protect users under 18, despite the evidence that breaking encryption exposes children to threats. PPSA urges the House to draft strong legislation protecting encryption, either as a bill or as an amendment. It is time for the people’s representatives to get ahead of the jawboning demands of the government to coerce honest businesses into giving away their customers’ keys. From your browsing history to your physical location, every aspect of your digital footprint can be tracked and used to build a comprehensive profile of your private life – including your political, religious, and family activities, as well as the most intimate details of your personal life. This information is invaluable not only to advertisers – which want to place ads in your social media feeds – but also to governments, which often have malevolent intentions.
Hostile governments might weaponize your personal digital trail for blackmail or embarrassment. Imagine a CEO or inventor being blackmailed into revealing trade secrets. Or, if you work in the military, in an agency, or for a contractor involved in national security, your personal data might be used to disrupt your life at the start of an international crisis. Imagine a CIA officer receiving what appears to be an urgent message of distress from her daughter, or an Air Force officer being told in the voice of his commanding officer not to go to the base but to shelter in place. And then multiply that effect by the millions of Americans in the crosshairs of a cyberattack. Congress and the Biden Administration acted against these possibilities this spring by including in the Israel/Ukraine weapons appropriation measure a provision banning data brokers from exporting Americans’ personal data to China, Russia, North Korea, and Iran. However, this ban had notable loopholes. Adversary countries could still purchase data indirectly through middleman data brokers in third countries or establish front companies to circumvent the ban. To close these loopholes, Sens. Ron Wyden (D-OR) and Cynthia Lummis (R-WY) have offered an amendment to the National Defense Authorization Act to further tighten the law by restricting data exports to countries, identified by the Secretary of Commerce, that lack robust privacy laws to protect Americans’ data from being sold and exported to adversaries. This measure will help stem the flow of Americans’ personal data through third parties and middlemen to regimes that have nothing but the worst of intentions. PPSA applauds Sens. Wyden and Lummis for working to tighten the pipeline of Americans’ data flowing out into the world. Their proposal is a needed one and deserves the vocal support of every American who cares about privacy.
PPSA has fired off a succession of Freedom of Information Act (FOIA) requests to leading federal law enforcement and intelligence agencies. These FOIAs seek critical details about the government’s purchasing of Americans’ most sensitive and personal data scraped from apps and sold by data brokers.
PPSA’s FOIA requests were sent to the Department of Justice and the FBI, the Department of Homeland Security, the CIA, the Defense Intelligence Agency, the National Security Agency, and the Office of the Director of National Intelligence, asking these agencies to reveal the broad outlines of how they collect highly private information of Americans. These digital traces purchased by the government reveal Americans’ familial, romantic, professional, religious, and political associations. This practice is often called the “data broker loophole” because it allows the government to bypass the usual judicial oversight and Fourth Amendment warrant requirement for obtaining personal information. “Every American should be deeply concerned about the extent to which U.S. law enforcement and intelligence agencies are collecting the details of Americans’ personal lives,” said Gene Schaerr, PPSA general counsel. “This collection happens without individuals’ knowledge, without probable cause, and without significant judicial oversight. The information collected is often detailed, extensive, and easily compiled, posing an immense threat to the personal privacy of every citizen.” To shed light on these practices, PPSA is requesting these agencies produce records concerning:
Shortly after the House passed the Fourth Amendment Is Not For Sale Act, which would require the government to obtain probable cause warrants before collecting Americans’ personal data, Avril Haines, Director of National Intelligence, ordered all 18 intelligence agencies to devise safeguards “tailored to the sensitivity of the information.” She also directed them to produce an annual report on how each agency uses such data. PPSA believes that revealing, in broad categories, the size, scope, sources, and types of data collected by agencies, would be a good first step in Director Haines’ effort to provide more transparency on data purchases. The recent passage of the Fourth Amendment Is Not For Sale Act by the House marks a bold and momentous step toward protecting Americans' privacy from unwarranted government intrusion. This legislation mandates that federal law enforcement and intelligence agencies, such as the FBI and CIA, must obtain a probable cause warrant before purchasing Americans’ personal data from brokers. This requirement closes a loophole that allows agencies to compromise the privacy of Americans and bypass constitutional safeguards.
While this act primarily targets law enforcement and intelligence agencies, it is crucial to extend these protections to all federal agencies. Non-law enforcement entities like the Treasury Department, IRS, and Department of Health and Human Services are equally involved in the purchase of Americans' personal data. The growing appetite among these agencies to track citizens' financial data, sensitive medical issues, and personal lives highlights the need for a comprehensive warrant requirement across the federal government. How strong is that appetite? The Financial Crimes Enforcement Network (FinCEN), operating under the Treasury Department, exemplifies the ambitious scope of federal surveillance. Through initiatives like the Corporate Transparency Act, FinCEN now requires small businesses to disclose information about their owners. This data collection is ostensibly for combating money laundering, though it seems unlikely that the cut-outs and money launderers for cocaine dealers and human traffickers will hesitate to lie on an official form. This data collection does pose significant privacy risks by giving multiple federal agencies warrantless access to a vast database of personal information of Americans who have done nothing wrong. The potential consequences of such data collection are severe. The National Small Business Association reports that the Corporate Transparency Act could criminalize small business owners for simple mistakes in reporting, with penalties including fines and up to two years in prison. This overreach underscores the broader issue of federal agencies wielding excessive surveillance powers without adequate checks and balances. Another alarming example is the dragnet financial surveillance revealed by the House Judiciary Committee and its Select Subcommittee on the Weaponization of the Federal Government. 
The FBI, in collaboration with major financial institutions, conducted sweeping investigations into individuals’ financial transactions based on perceptions of their political leanings. This surveillance was conducted without probable cause or warrants, targeting ordinary Americans for exercising their constitutional rights. Without statutory guardrails, such surveillance could be picked up by non-law enforcement agencies like FinCEN, using purchased digital data. These examples demonstrate the appetite of all government agencies for our personal information. Allowing them to also buy our most sensitive and personal information from data brokers, as is happening now, is about as absolute a violation of Americans’ privacy as one can imagine. Only listening devices in every home could be more intrusive. Such practices are reminiscent of the general warrants of the colonial era, the very abuses the Fourth Amendment was designed to prevent. The indiscriminate collection and scrutiny of personal data without individualized suspicion erode the foundational principles of privacy and due process. The Fourth Amendment Is Not For Sale Act is a powerful and necessary step to end these abuses. Congress should also consider broadening the scope to ensure all federal agencies are held to the same standard.